Mar 17 18:19:05.965954 kernel: Booting Linux on physical CPU 0x0000000000 [0x410fd083] Mar 17 18:19:05.965991 kernel: Linux version 5.15.179-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 11.3.1_p20221209 p3) 11.3.1 20221209, GNU ld (Gentoo 2.39 p5) 2.39.0) #1 SMP PREEMPT Mon Mar 17 17:11:44 -00 2025 Mar 17 18:19:05.966013 kernel: efi: EFI v2.70 by EDK II Mar 17 18:19:05.966028 kernel: efi: SMBIOS=0x7bed0000 SMBIOS 3.0=0x7beb0000 ACPI=0x786e0000 ACPI 2.0=0x786e0014 MEMATTR=0x7b003a98 MEMRESERVE=0x7171cf98 Mar 17 18:19:05.966042 kernel: ACPI: Early table checksum verification disabled Mar 17 18:19:05.966055 kernel: ACPI: RSDP 0x00000000786E0014 000024 (v02 AMAZON) Mar 17 18:19:05.966071 kernel: ACPI: XSDT 0x00000000786D00E8 000064 (v01 AMAZON AMZNFACP 00000001 01000013) Mar 17 18:19:05.966085 kernel: ACPI: FACP 0x00000000786B0000 000114 (v06 AMAZON AMZNFACP 00000001 AMZN 00000001) Mar 17 18:19:05.966099 kernel: ACPI: DSDT 0x0000000078640000 00159D (v02 AMAZON AMZNDSDT 00000001 INTL 20160527) Mar 17 18:19:05.966113 kernel: ACPI: APIC 0x00000000786C0000 000108 (v04 AMAZON AMZNAPIC 00000001 AMZN 00000001) Mar 17 18:19:05.966130 kernel: ACPI: SPCR 0x00000000786A0000 000050 (v02 AMAZON AMZNSPCR 00000001 AMZN 00000001) Mar 17 18:19:05.966144 kernel: ACPI: GTDT 0x0000000078690000 000060 (v02 AMAZON AMZNGTDT 00000001 AMZN 00000001) Mar 17 18:19:05.966158 kernel: ACPI: MCFG 0x0000000078680000 00003C (v02 AMAZON AMZNMCFG 00000001 AMZN 00000001) Mar 17 18:19:05.966172 kernel: ACPI: SLIT 0x0000000078670000 00002D (v01 AMAZON AMZNSLIT 00000001 AMZN 00000001) Mar 17 18:19:05.966188 kernel: ACPI: IORT 0x0000000078660000 000078 (v01 AMAZON AMZNIORT 00000001 AMZN 00000001) Mar 17 18:19:05.966206 kernel: ACPI: PPTT 0x0000000078650000 0000EC (v01 AMAZON AMZNPPTT 00000001 AMZN 00000001) Mar 17 18:19:05.966221 kernel: ACPI: SPCR: console: uart,mmio,0x90a0000,115200 Mar 17 18:19:05.966235 kernel: earlycon: uart0 at MMIO 0x00000000090a0000 (options '115200') Mar 17 18:19:05.966249 kernel: printk: bootconsole [uart0] enabled Mar 17 18:19:05.966264 kernel: NUMA: Failed to initialise from firmware Mar 17 18:19:05.966278 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x00000004b5ffffff] Mar 17 18:19:05.966293 kernel: NUMA: NODE_DATA [mem 0x4b5843900-0x4b5848fff] Mar 17 18:19:05.966307 kernel: Zone ranges: Mar 17 18:19:05.966322 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Mar 17 18:19:05.972951 kernel: DMA32 empty Mar 17 18:19:05.972982 kernel: Normal [mem 0x0000000100000000-0x00000004b5ffffff] Mar 17 18:19:05.973005 kernel: Movable zone start for each node Mar 17 18:19:05.973021 kernel: Early memory node ranges Mar 17 18:19:05.973036 kernel: node 0: [mem 0x0000000040000000-0x000000007862ffff] Mar 17 18:19:05.973051 kernel: node 0: [mem 0x0000000078630000-0x000000007863ffff] Mar 17 18:19:05.973065 kernel: node 0: [mem 0x0000000078640000-0x00000000786effff] Mar 17 18:19:05.973080 kernel: node 0: [mem 0x00000000786f0000-0x000000007872ffff] Mar 17 18:19:05.973095 kernel: node 0: [mem 0x0000000078730000-0x000000007bbfffff] Mar 17 18:19:05.973123 kernel: node 0: [mem 0x000000007bc00000-0x000000007bfdffff] Mar 17 18:19:05.973140 kernel: node 0: [mem 0x000000007bfe0000-0x000000007fffffff] Mar 17 18:19:05.973155 kernel: node 0: [mem 0x0000000400000000-0x00000004b5ffffff] Mar 17 18:19:05.973170 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x00000004b5ffffff] Mar 17 18:19:05.973185 kernel: On node 0, zone Normal: 8192 pages in unavailable ranges Mar 17 
18:19:05.973205 kernel: psci: probing for conduit method from ACPI. Mar 17 18:19:05.973220 kernel: psci: PSCIv1.0 detected in firmware. Mar 17 18:19:05.973241 kernel: psci: Using standard PSCI v0.2 function IDs Mar 17 18:19:05.973257 kernel: psci: Trusted OS migration not required Mar 17 18:19:05.973272 kernel: psci: SMC Calling Convention v1.1 Mar 17 18:19:05.973307 kernel: ACPI: SRAT not present Mar 17 18:19:05.973325 kernel: percpu: Embedded 30 pages/cpu s83032 r8192 d31656 u122880 Mar 17 18:19:05.973374 kernel: pcpu-alloc: s83032 r8192 d31656 u122880 alloc=30*4096 Mar 17 18:19:05.973391 kernel: pcpu-alloc: [0] 0 [0] 1 Mar 17 18:19:05.973406 kernel: Detected PIPT I-cache on CPU0 Mar 17 18:19:05.973422 kernel: CPU features: detected: GIC system register CPU interface Mar 17 18:19:05.973437 kernel: CPU features: detected: Spectre-v2 Mar 17 18:19:05.973452 kernel: CPU features: detected: Spectre-v3a Mar 17 18:19:05.973467 kernel: CPU features: detected: Spectre-BHB Mar 17 18:19:05.973482 kernel: CPU features: kernel page table isolation forced ON by KASLR Mar 17 18:19:05.973497 kernel: CPU features: detected: Kernel page table isolation (KPTI) Mar 17 18:19:05.973518 kernel: CPU features: detected: ARM erratum 1742098 Mar 17 18:19:05.973533 kernel: CPU features: detected: ARM errata 1165522, 1319367, or 1530923 Mar 17 18:19:05.973548 kernel: Built 1 zonelists, mobility grouping on. Total pages: 991872 Mar 17 18:19:05.973564 kernel: Policy zone: Normal Mar 17 18:19:05.973581 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=e034db32d58fe7496a3db6ba3879dd9052cea2cf1597d65edfc7b26afc92530d Mar 17 18:19:05.973598 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 17 18:19:05.973613 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 17 18:19:05.973629 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 17 18:19:05.973644 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 18:19:05.973659 kernel: software IO TLB: mapped [mem 0x000000007c000000-0x0000000080000000] (64MB) Mar 17 18:19:05.973680 kernel: Memory: 3824524K/4030464K available (9792K kernel code, 2094K rwdata, 7584K rodata, 36416K init, 777K bss, 205940K reserved, 0K cma-reserved) Mar 17 18:19:05.973696 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 17 18:19:05.973711 kernel: trace event string verifier disabled Mar 17 18:19:05.973726 kernel: rcu: Preemptible hierarchical RCU implementation. Mar 17 18:19:05.973743 kernel: rcu: RCU event tracing is enabled. Mar 17 18:19:05.973759 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 17 18:19:05.973775 kernel: Trampoline variant of Tasks RCU enabled. Mar 17 18:19:05.973790 kernel: Tracing variant of Tasks RCU enabled. Mar 17 18:19:05.973806 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 17 18:19:05.973821 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 17 18:19:05.973836 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Mar 17 18:19:05.973851 kernel: GICv3: 96 SPIs implemented Mar 17 18:19:05.973871 kernel: GICv3: 0 Extended SPIs implemented Mar 17 18:19:05.973886 kernel: GICv3: Distributor has no Range Selector support Mar 17 18:19:05.973901 kernel: Root IRQ handler: gic_handle_irq Mar 17 18:19:05.973916 kernel: GICv3: 16 PPIs implemented Mar 17 18:19:05.973931 kernel: GICv3: CPU0: found redistributor 0 region 0:0x0000000010200000 Mar 17 18:19:05.973946 kernel: ACPI: SRAT not present Mar 17 18:19:05.973961 kernel: ITS [mem 0x10080000-0x1009ffff] Mar 17 18:19:05.973976 kernel: ITS@0x0000000010080000: allocated 8192 Devices @400090000 (indirect, esz 8, psz 64K, shr 1) Mar 17 18:19:05.973992 kernel: ITS@0x0000000010080000: allocated 8192 Interrupt Collections @4000a0000 (flat, esz 8, psz 64K, shr 1) Mar 17 18:19:05.974007 kernel: GICv3: using LPI property table @0x00000004000b0000 Mar 17 18:19:05.974022 kernel: ITS: Using hypervisor restricted LPI range [128] Mar 17 18:19:05.974042 kernel: GICv3: CPU0: using allocated LPI pending table @0x00000004000d0000 Mar 17 18:19:05.974058 kernel: arch_timer: cp15 timer(s) running at 83.33MHz (virt). Mar 17 18:19:05.974074 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x13381ebeec, max_idle_ns: 440795203145 ns Mar 17 18:19:05.974090 kernel: sched_clock: 56 bits at 83MHz, resolution 12ns, wraps every 4398046511100ns Mar 17 18:19:05.974105 kernel: Console: colour dummy device 80x25 Mar 17 18:19:05.974121 kernel: printk: console [tty1] enabled Mar 17 18:19:05.974137 kernel: ACPI: Core revision 20210730 Mar 17 18:19:05.974153 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 166.66 BogoMIPS (lpj=83333) Mar 17 18:19:05.974169 kernel: pid_max: default: 32768 minimum: 301 Mar 17 18:19:05.974185 kernel: LSM: Security Framework initializing Mar 17 18:19:05.974204 kernel: SELinux: Initializing. Mar 17 18:19:05.974221 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 18:19:05.974237 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 18:19:05.974252 kernel: rcu: Hierarchical SRCU implementation. Mar 17 18:19:05.974268 kernel: Platform MSI: ITS@0x10080000 domain created Mar 17 18:19:05.974284 kernel: PCI/MSI: ITS@0x10080000 domain created Mar 17 18:19:05.974300 kernel: Remapping and enabling EFI services. Mar 17 18:19:05.974316 kernel: smp: Bringing up secondary CPUs ... Mar 17 18:19:05.974354 kernel: Detected PIPT I-cache on CPU1 Mar 17 18:19:05.974376 kernel: GICv3: CPU1: found redistributor 1 region 0:0x0000000010220000 Mar 17 18:19:05.974401 kernel: GICv3: CPU1: using allocated LPI pending table @0x00000004000e0000 Mar 17 18:19:05.974417 kernel: CPU1: Booted secondary processor 0x0000000001 [0x410fd083] Mar 17 18:19:05.974433 kernel: smp: Brought up 1 node, 2 CPUs Mar 17 18:19:05.974448 kernel: SMP: Total of 2 processors activated. 
Mar 17 18:19:05.974464 kernel: CPU features: detected: 32-bit EL0 Support Mar 17 18:19:05.974480 kernel: CPU features: detected: 32-bit EL1 Support Mar 17 18:19:05.974496 kernel: CPU features: detected: CRC32 instructions Mar 17 18:19:05.974511 kernel: CPU: All CPU(s) started at EL1 Mar 17 18:19:05.974527 kernel: alternatives: patching kernel code Mar 17 18:19:05.974546 kernel: devtmpfs: initialized Mar 17 18:19:05.974563 kernel: KASLR disabled due to lack of seed Mar 17 18:19:05.974588 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 18:19:05.974608 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 17 18:19:05.974624 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 18:19:05.974640 kernel: SMBIOS 3.0.0 present. Mar 17 18:19:05.974656 kernel: DMI: Amazon EC2 a1.large/, BIOS 1.0 11/1/2018 Mar 17 18:19:05.974672 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 18:19:05.974688 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Mar 17 18:19:05.974705 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Mar 17 18:19:05.974722 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Mar 17 18:19:05.974741 kernel: audit: initializing netlink subsys (disabled) Mar 17 18:19:05.974758 kernel: audit: type=2000 audit(0.248:1): state=initialized audit_enabled=0 res=1 Mar 17 18:19:05.974774 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 18:19:05.974790 kernel: cpuidle: using governor menu Mar 17 18:19:05.974806 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. Mar 17 18:19:05.974826 kernel: ASID allocator initialised with 32768 entries Mar 17 18:19:05.974842 kernel: ACPI: bus type PCI registered Mar 17 18:19:05.974859 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 18:19:05.974875 kernel: Serial: AMBA PL011 UART driver Mar 17 18:19:05.974891 kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages Mar 17 18:19:05.974907 kernel: HugeTLB registered 32.0 MiB page size, pre-allocated 0 pages Mar 17 18:19:05.974923 kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 18:19:05.974939 kernel: HugeTLB registered 64.0 KiB page size, pre-allocated 0 pages Mar 17 18:19:05.974956 kernel: cryptd: max_cpu_qlen set to 1000 Mar 17 18:19:05.974975 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Mar 17 18:19:05.974992 kernel: ACPI: Added _OSI(Module Device) Mar 17 18:19:05.975008 kernel: ACPI: Added _OSI(Processor Device) Mar 17 18:19:05.975024 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 18:19:05.975040 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 18:19:05.975056 kernel: ACPI: Added _OSI(Linux-Dell-Video) Mar 17 18:19:05.975072 kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio) Mar 17 18:19:05.975088 kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics) Mar 17 18:19:05.975104 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 18:19:05.975123 kernel: ACPI: Interpreter enabled Mar 17 18:19:05.975140 kernel: ACPI: Using GIC for interrupt routing Mar 17 18:19:05.975156 kernel: ACPI: MCFG table detected, 1 entries Mar 17 18:19:05.975173 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-0f]) Mar 17 18:19:05.975471 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 17 18:19:05.975666 kernel: acpi PNP0A08:00: _OSC: platform does 
not support [LTR] Mar 17 18:19:05.975853 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Mar 17 18:19:05.976038 kernel: acpi PNP0A08:00: ECAM area [mem 0x20000000-0x20ffffff] reserved by PNP0C02:00 Mar 17 18:19:05.976257 kernel: acpi PNP0A08:00: ECAM at [mem 0x20000000-0x20ffffff] for [bus 00-0f] Mar 17 18:19:05.976282 kernel: ACPI: Remapped I/O 0x000000001fff0000 to [io 0x0000-0xffff window] Mar 17 18:19:05.976299 kernel: acpiphp: Slot [1] registered Mar 17 18:19:05.976315 kernel: acpiphp: Slot [2] registered Mar 17 18:19:05.976348 kernel: acpiphp: Slot [3] registered Mar 17 18:19:05.976368 kernel: acpiphp: Slot [4] registered Mar 17 18:19:05.976385 kernel: acpiphp: Slot [5] registered Mar 17 18:19:05.976401 kernel: acpiphp: Slot [6] registered Mar 17 18:19:05.976417 kernel: acpiphp: Slot [7] registered Mar 17 18:19:05.976438 kernel: acpiphp: Slot [8] registered Mar 17 18:19:05.976454 kernel: acpiphp: Slot [9] registered Mar 17 18:19:05.976470 kernel: acpiphp: Slot [10] registered Mar 17 18:19:05.976486 kernel: acpiphp: Slot [11] registered Mar 17 18:19:05.976502 kernel: acpiphp: Slot [12] registered Mar 17 18:19:05.976518 kernel: acpiphp: Slot [13] registered Mar 17 18:19:05.976534 kernel: acpiphp: Slot [14] registered Mar 17 18:19:05.976550 kernel: acpiphp: Slot [15] registered Mar 17 18:19:05.976566 kernel: acpiphp: Slot [16] registered Mar 17 18:19:05.976585 kernel: acpiphp: Slot [17] registered Mar 17 18:19:05.976602 kernel: acpiphp: Slot [18] registered Mar 17 18:19:05.976617 kernel: acpiphp: Slot [19] registered Mar 17 18:19:05.976633 kernel: acpiphp: Slot [20] registered Mar 17 18:19:05.976649 kernel: acpiphp: Slot [21] registered Mar 17 18:19:05.976665 kernel: acpiphp: Slot [22] registered Mar 17 18:19:05.976681 kernel: acpiphp: Slot [23] registered Mar 17 18:19:05.976697 kernel: acpiphp: Slot [24] registered Mar 17 18:19:05.976712 kernel: acpiphp: Slot [25] registered Mar 17 18:19:05.976728 kernel: acpiphp: Slot [26] registered Mar 17 18:19:05.976748 kernel: acpiphp: Slot [27] registered Mar 17 18:19:05.976764 kernel: acpiphp: Slot [28] registered Mar 17 18:19:05.976780 kernel: acpiphp: Slot [29] registered Mar 17 18:19:05.976795 kernel: acpiphp: Slot [30] registered Mar 17 18:19:05.976811 kernel: acpiphp: Slot [31] registered Mar 17 18:19:05.976827 kernel: PCI host bridge to bus 0000:00 Mar 17 18:19:05.977054 kernel: pci_bus 0000:00: root bus resource [mem 0x80000000-0xffffffff window] Mar 17 18:19:05.977233 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Mar 17 18:19:05.977526 kernel: pci_bus 0000:00: root bus resource [mem 0x400000000000-0x407fffffffff window] Mar 17 18:19:05.977731 kernel: pci_bus 0000:00: root bus resource [bus 00-0f] Mar 17 18:19:05.979299 kernel: pci 0000:00:00.0: [1d0f:0200] type 00 class 0x060000 Mar 17 18:19:05.982240 kernel: pci 0000:00:01.0: [1d0f:8250] type 00 class 0x070003 Mar 17 18:19:05.982490 kernel: pci 0000:00:01.0: reg 0x10: [mem 0x80118000-0x80118fff] Mar 17 18:19:05.982697 kernel: pci 0000:00:04.0: [1d0f:8061] type 00 class 0x010802 Mar 17 18:19:05.982898 kernel: pci 0000:00:04.0: reg 0x10: [mem 0x80114000-0x80117fff] Mar 17 18:19:05.983085 kernel: pci 0000:00:04.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 17 18:19:05.983321 kernel: pci 0000:00:05.0: [1d0f:ec20] type 00 class 0x020000 Mar 17 18:19:05.987749 kernel: pci 0000:00:05.0: reg 0x10: [mem 0x80110000-0x80113fff] Mar 17 18:19:05.987973 kernel: pci 0000:00:05.0: reg 0x18: [mem 0x80000000-0x800fffff pref] Mar 17 
18:19:05.988169 kernel: pci 0000:00:05.0: reg 0x20: [mem 0x80100000-0x8010ffff] Mar 17 18:19:05.988498 kernel: pci 0000:00:05.0: PME# supported from D0 D1 D2 D3hot D3cold Mar 17 18:19:05.988705 kernel: pci 0000:00:05.0: BAR 2: assigned [mem 0x80000000-0x800fffff pref] Mar 17 18:19:05.988901 kernel: pci 0000:00:05.0: BAR 4: assigned [mem 0x80100000-0x8010ffff] Mar 17 18:19:05.989099 kernel: pci 0000:00:04.0: BAR 0: assigned [mem 0x80110000-0x80113fff] Mar 17 18:19:05.989314 kernel: pci 0000:00:05.0: BAR 0: assigned [mem 0x80114000-0x80117fff] Mar 17 18:19:05.989594 kernel: pci 0000:00:01.0: BAR 0: assigned [mem 0x80118000-0x80118fff] Mar 17 18:19:05.997306 kernel: pci_bus 0000:00: resource 4 [mem 0x80000000-0xffffffff window] Mar 17 18:19:05.997544 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Mar 17 18:19:05.997728 kernel: pci_bus 0000:00: resource 6 [mem 0x400000000000-0x407fffffffff window] Mar 17 18:19:05.997752 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Mar 17 18:19:05.997770 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Mar 17 18:19:05.997788 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Mar 17 18:19:05.997804 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Mar 17 18:19:05.997821 kernel: iommu: Default domain type: Translated Mar 17 18:19:05.997837 kernel: iommu: DMA domain TLB invalidation policy: strict mode Mar 17 18:19:05.997854 kernel: vgaarb: loaded Mar 17 18:19:05.997870 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 17 18:19:05.997892 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 17 18:19:05.997908 kernel: PTP clock support registered Mar 17 18:19:05.997925 kernel: Registered efivars operations Mar 17 18:19:05.997941 kernel: clocksource: Switched to clocksource arch_sys_counter Mar 17 18:19:05.997957 kernel: VFS: Disk quotas dquot_6.6.0 Mar 17 18:19:05.997974 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 17 18:19:05.997990 kernel: pnp: PnP ACPI init Mar 17 18:19:05.998191 kernel: system 00:00: [mem 0x20000000-0x2fffffff] could not be reserved Mar 17 18:19:05.998221 kernel: pnp: PnP ACPI: found 1 devices Mar 17 18:19:05.998237 kernel: NET: Registered PF_INET protocol family Mar 17 18:19:05.998254 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 17 18:19:05.998270 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 17 18:19:05.998287 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 17 18:19:05.998304 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 17 18:19:05.998320 kernel: TCP bind hash table entries: 32768 (order: 7, 524288 bytes, linear) Mar 17 18:19:05.998372 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 17 18:19:05.998391 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 18:19:05.998414 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 18:19:05.998431 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 17 18:19:05.998447 kernel: PCI: CLS 0 bytes, default 64 Mar 17 18:19:05.998463 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 3 counters available Mar 17 18:19:05.998480 kernel: kvm [1]: HYP mode not available Mar 17 18:19:05.998496 kernel: Initialise system trusted keyrings Mar 17 18:19:05.998513 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 17 
18:19:05.998529 kernel: Key type asymmetric registered Mar 17 18:19:05.998545 kernel: Asymmetric key parser 'x509' registered Mar 17 18:19:05.998565 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Mar 17 18:19:05.998581 kernel: io scheduler mq-deadline registered Mar 17 18:19:05.998597 kernel: io scheduler kyber registered Mar 17 18:19:05.998613 kernel: io scheduler bfq registered Mar 17 18:19:05.998834 kernel: pl061_gpio ARMH0061:00: PL061 GPIO chip registered Mar 17 18:19:05.998859 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Mar 17 18:19:05.998876 kernel: ACPI: button: Power Button [PWRB] Mar 17 18:19:05.998892 kernel: input: Sleep Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0E:00/input/input1 Mar 17 18:19:05.998913 kernel: ACPI: button: Sleep Button [SLPB] Mar 17 18:19:05.998929 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 18:19:05.998946 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Mar 17 18:19:05.999144 kernel: serial 0000:00:01.0: enabling device (0010 -> 0012) Mar 17 18:19:05.999168 kernel: printk: console [ttyS0] disabled Mar 17 18:19:05.999185 kernel: 0000:00:01.0: ttyS0 at MMIO 0x80118000 (irq = 14, base_baud = 115200) is a 16550A Mar 17 18:19:05.999201 kernel: printk: console [ttyS0] enabled Mar 17 18:19:05.999217 kernel: printk: bootconsole [uart0] disabled Mar 17 18:19:05.999233 kernel: thunder_xcv, ver 1.0 Mar 17 18:19:05.999249 kernel: thunder_bgx, ver 1.0 Mar 17 18:19:05.999269 kernel: nicpf, ver 1.0 Mar 17 18:19:05.999285 kernel: nicvf, ver 1.0 Mar 17 18:19:06.004959 kernel: rtc-efi rtc-efi.0: registered as rtc0 Mar 17 18:19:06.007530 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-03-17T18:19:05 UTC (1742235545) Mar 17 18:19:06.007563 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 17 18:19:06.007581 kernel: NET: Registered PF_INET6 protocol family Mar 17 18:19:06.007598 kernel: Segment Routing with IPv6 Mar 17 18:19:06.007615 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 18:19:06.007639 kernel: NET: Registered PF_PACKET protocol family Mar 17 18:19:06.007656 kernel: Key type dns_resolver registered Mar 17 18:19:06.007672 kernel: registered taskstats version 1 Mar 17 18:19:06.007688 kernel: Loading compiled-in X.509 certificates Mar 17 18:19:06.007705 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 5.15.179-flatcar: c6f3fb83dc6bb7052b07ec5b1ef41d12f9b3f7e4' Mar 17 18:19:06.007721 kernel: Key type .fscrypt registered Mar 17 18:19:06.007737 kernel: Key type fscrypt-provisioning registered Mar 17 18:19:06.007753 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 17 18:19:06.007770 kernel: ima: Allocated hash algorithm: sha1 Mar 17 18:19:06.007790 kernel: ima: No architecture policies found Mar 17 18:19:06.007806 kernel: clk: Disabling unused clocks Mar 17 18:19:06.007822 kernel: Freeing unused kernel memory: 36416K Mar 17 18:19:06.007838 kernel: Run /init as init process Mar 17 18:19:06.007854 kernel: with arguments: Mar 17 18:19:06.007870 kernel: /init Mar 17 18:19:06.007885 kernel: with environment: Mar 17 18:19:06.007901 kernel: HOME=/ Mar 17 18:19:06.007917 kernel: TERM=linux Mar 17 18:19:06.007936 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 18:19:06.007958 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Mar 17 18:19:06.007978 systemd[1]: Detected virtualization amazon. Mar 17 18:19:06.007996 systemd[1]: Detected architecture arm64. Mar 17 18:19:06.008014 systemd[1]: Running in initrd. Mar 17 18:19:06.008031 systemd[1]: No hostname configured, using default hostname. Mar 17 18:19:06.008048 systemd[1]: Hostname set to . Mar 17 18:19:06.008070 systemd[1]: Initializing machine ID from VM UUID. Mar 17 18:19:06.008087 systemd[1]: Queued start job for default target initrd.target. Mar 17 18:19:06.008105 systemd[1]: Started systemd-ask-password-console.path. Mar 17 18:19:06.008123 systemd[1]: Reached target cryptsetup.target. Mar 17 18:19:06.008140 systemd[1]: Reached target paths.target. Mar 17 18:19:06.008157 systemd[1]: Reached target slices.target. Mar 17 18:19:06.008174 systemd[1]: Reached target swap.target. Mar 17 18:19:06.008191 systemd[1]: Reached target timers.target. Mar 17 18:19:06.008213 systemd[1]: Listening on iscsid.socket. Mar 17 18:19:06.008231 systemd[1]: Listening on iscsiuio.socket. Mar 17 18:19:06.008248 systemd[1]: Listening on systemd-journald-audit.socket. Mar 17 18:19:06.008266 systemd[1]: Listening on systemd-journald-dev-log.socket. Mar 17 18:19:06.008283 systemd[1]: Listening on systemd-journald.socket. Mar 17 18:19:06.008301 systemd[1]: Listening on systemd-networkd.socket. Mar 17 18:19:06.008318 systemd[1]: Listening on systemd-udevd-control.socket. Mar 17 18:19:06.008358 systemd[1]: Listening on systemd-udevd-kernel.socket. Mar 17 18:19:06.008384 systemd[1]: Reached target sockets.target. Mar 17 18:19:06.008403 systemd[1]: Starting kmod-static-nodes.service... Mar 17 18:19:06.008421 systemd[1]: Finished network-cleanup.service. Mar 17 18:19:06.008438 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 18:19:06.008456 systemd[1]: Starting systemd-journald.service... Mar 17 18:19:06.008473 systemd[1]: Starting systemd-modules-load.service... Mar 17 18:19:06.008490 systemd[1]: Starting systemd-resolved.service... Mar 17 18:19:06.008508 systemd[1]: Starting systemd-vconsole-setup.service... Mar 17 18:19:06.008525 systemd[1]: Finished kmod-static-nodes.service. Mar 17 18:19:06.008547 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 18:19:06.008564 systemd[1]: Finished systemd-vconsole-setup.service. Mar 17 18:19:06.008583 kernel: audit: type=1130 audit(1742235545.965:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.008601 systemd[1]: Starting dracut-cmdline-ask.service... 
Mar 17 18:19:06.008619 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Mar 17 18:19:06.008637 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Mar 17 18:19:06.008658 systemd-journald[309]: Journal started Mar 17 18:19:06.008746 systemd-journald[309]: Runtime Journal (/run/log/journal/ec2a7866800b2618146ee04a7ee39c76) is 8.0M, max 75.4M, 67.4M free. Mar 17 18:19:05.965000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:05.962183 systemd-modules-load[310]: Inserted module 'overlay' Mar 17 18:19:06.012000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.023551 kernel: audit: type=1130 audit(1742235546.012:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.023609 systemd[1]: Started systemd-journald.service. Mar 17 18:19:06.023638 kernel: audit: type=1130 audit(1742235546.021:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.021000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.037734 systemd[1]: Finished dracut-cmdline-ask.service. Mar 17 18:19:06.038000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.046798 systemd-resolved[311]: Positive Trust Anchors: Mar 17 18:19:06.047151 systemd-resolved[311]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 18:19:06.047205 systemd-resolved[311]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Mar 17 18:19:06.066996 kernel: audit: type=1130 audit(1742235546.038:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.065626 systemd[1]: Starting dracut-cmdline.service... Mar 17 18:19:06.081691 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Mar 17 18:19:06.094377 kernel: Bridge firewalling registered Mar 17 18:19:06.095489 systemd-modules-load[310]: Inserted module 'br_netfilter' Mar 17 18:19:06.099186 dracut-cmdline[326]: dracut-dracut-053 Mar 17 18:19:06.107649 dracut-cmdline[326]: Using kernel command line parameters: rd.driver.pre=btrfs BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlycon flatcar.first_boot=detected acpi=force flatcar.oem.id=ec2 modprobe.blacklist=xen_fbfront net.ifnames=0 nvme_core.io_timeout=4294967295 verity.usrhash=e034db32d58fe7496a3db6ba3879dd9052cea2cf1597d65edfc7b26afc92530d Mar 17 18:19:06.129797 kernel: SCSI subsystem initialized Mar 17 18:19:06.151552 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 17 18:19:06.151630 kernel: device-mapper: uevent: version 1.0.3 Mar 17 18:19:06.154446 kernel: device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com Mar 17 18:19:06.159955 systemd-modules-load[310]: Inserted module 'dm_multipath' Mar 17 18:19:06.161966 systemd[1]: Finished systemd-modules-load.service. Mar 17 18:19:06.167679 systemd[1]: Starting systemd-sysctl.service... Mar 17 18:19:06.165000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.179372 kernel: audit: type=1130 audit(1742235546.165:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.196858 systemd[1]: Finished systemd-sysctl.service. Mar 17 18:19:06.197000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.209365 kernel: audit: type=1130 audit(1742235546.197:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.281365 kernel: Loading iSCSI transport class v2.0-870. Mar 17 18:19:06.302376 kernel: iscsi: registered transport (tcp) Mar 17 18:19:06.328771 kernel: iscsi: registered transport (qla4xxx) Mar 17 18:19:06.328861 kernel: QLogic iSCSI HBA Driver Mar 17 18:19:06.515904 systemd-resolved[311]: Defaulting to hostname 'linux'. Mar 17 18:19:06.518211 kernel: random: crng init done Mar 17 18:19:06.519463 systemd[1]: Started systemd-resolved.service. Mar 17 18:19:06.529865 kernel: audit: type=1130 audit(1742235546.518:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.518000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.520475 systemd[1]: Reached target nss-lookup.target. Mar 17 18:19:06.548317 systemd[1]: Finished dracut-cmdline.service. 
Mar 17 18:19:06.549000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.557562 systemd[1]: Starting dracut-pre-udev.service... Mar 17 18:19:06.561379 kernel: audit: type=1130 audit(1742235546.549:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:06.623372 kernel: raid6: neonx8 gen() 6196 MB/s Mar 17 18:19:06.641367 kernel: raid6: neonx8 xor() 4579 MB/s Mar 17 18:19:06.659357 kernel: raid6: neonx4 gen() 6544 MB/s Mar 17 18:19:06.677368 kernel: raid6: neonx4 xor() 4729 MB/s Mar 17 18:19:06.695363 kernel: raid6: neonx2 gen() 5765 MB/s Mar 17 18:19:06.713365 kernel: raid6: neonx2 xor() 4376 MB/s Mar 17 18:19:06.731359 kernel: raid6: neonx1 gen() 4458 MB/s Mar 17 18:19:06.749362 kernel: raid6: neonx1 xor() 3564 MB/s Mar 17 18:19:06.767366 kernel: raid6: int64x8 gen() 3424 MB/s Mar 17 18:19:06.785365 kernel: raid6: int64x8 xor() 2055 MB/s Mar 17 18:19:06.803363 kernel: raid6: int64x4 gen() 3842 MB/s Mar 17 18:19:06.821364 kernel: raid6: int64x4 xor() 2154 MB/s Mar 17 18:19:06.839364 kernel: raid6: int64x2 gen() 3603 MB/s Mar 17 18:19:06.857364 kernel: raid6: int64x2 xor() 1918 MB/s Mar 17 18:19:06.875363 kernel: raid6: int64x1 gen() 2752 MB/s Mar 17 18:19:06.894489 kernel: raid6: int64x1 xor() 1431 MB/s Mar 17 18:19:06.894519 kernel: raid6: using algorithm neonx4 gen() 6544 MB/s Mar 17 18:19:06.894551 kernel: raid6: .... xor() 4729 MB/s, rmw enabled Mar 17 18:19:06.896092 kernel: raid6: using neon recovery algorithm Mar 17 18:19:06.915642 kernel: xor: measuring software checksum speed Mar 17 18:19:06.915706 kernel: 8regs : 9298 MB/sec Mar 17 18:19:06.917360 kernel: 32regs : 10851 MB/sec Mar 17 18:19:06.919131 kernel: arm64_neon : 9269 MB/sec Mar 17 18:19:06.919165 kernel: xor: using function: 32regs (10851 MB/sec) Mar 17 18:19:07.011388 kernel: Btrfs loaded, crc32c=crc32c-generic, zoned=no, fsverity=no Mar 17 18:19:07.028574 systemd[1]: Finished dracut-pre-udev.service. Mar 17 18:19:07.038238 kernel: audit: type=1130 audit(1742235547.028:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:07.028000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:07.036000 audit: BPF prog-id=7 op=LOAD Mar 17 18:19:07.036000 audit: BPF prog-id=8 op=LOAD Mar 17 18:19:07.039074 systemd[1]: Starting systemd-udevd.service... Mar 17 18:19:07.067308 systemd-udevd[509]: Using default interface naming scheme 'v252'. Mar 17 18:19:07.077661 systemd[1]: Started systemd-udevd.service. Mar 17 18:19:07.083000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:07.086180 systemd[1]: Starting dracut-pre-trigger.service... Mar 17 18:19:07.112842 dracut-pre-trigger[521]: rd.md=0: removing MD RAID activation Mar 17 18:19:07.172809 systemd[1]: Finished dracut-pre-trigger.service. 
Mar 17 18:19:07.174000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:07.177105 systemd[1]: Starting systemd-udev-trigger.service... Mar 17 18:19:07.275942 systemd[1]: Finished systemd-udev-trigger.service. Mar 17 18:19:07.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:07.391731 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Mar 17 18:19:07.391814 kernel: ena 0000:00:05.0: enabling device (0010 -> 0012) Mar 17 18:19:07.413590 kernel: ena 0000:00:05.0: ENA device version: 0.10 Mar 17 18:19:07.413911 kernel: ena 0000:00:05.0: ENA controller version: 0.0.1 implementation version 1 Mar 17 18:19:07.414118 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Mar 17 18:19:07.414144 kernel: nvme nvme0: pci function 0000:00:04.0 Mar 17 18:19:07.414431 kernel: ena 0000:00:05.0: Elastic Network Adapter (ENA) found at mem 80114000, mac addr 06:e8:8c:10:9b:2d Mar 17 18:19:07.417370 kernel: nvme nvme0: 2/0/0 default/read/poll queues Mar 17 18:19:07.425894 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 17 18:19:07.425948 kernel: GPT:9289727 != 16777215 Mar 17 18:19:07.425971 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 17 18:19:07.429083 kernel: GPT:9289727 != 16777215 Mar 17 18:19:07.429114 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 17 18:19:07.432180 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 17 18:19:07.438707 (udev-worker)[566]: Network interface NamePolicy= disabled on kernel command line. Mar 17 18:19:07.512381 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/nvme0n1p6 scanned by (udev-worker) (559) Mar 17 18:19:07.574492 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. Mar 17 18:19:07.622068 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device. Mar 17 18:19:07.626243 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device. Mar 17 18:19:07.642865 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device. Mar 17 18:19:07.656091 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Mar 17 18:19:07.660599 systemd[1]: Starting disk-uuid.service... Mar 17 18:19:07.677359 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 17 18:19:07.680466 disk-uuid[663]: Primary Header is updated. Mar 17 18:19:07.680466 disk-uuid[663]: Secondary Entries is updated. Mar 17 18:19:07.680466 disk-uuid[663]: Secondary Header is updated. Mar 17 18:19:07.693375 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 17 18:19:08.705362 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 Mar 17 18:19:08.706717 disk-uuid[664]: The operation has completed successfully. Mar 17 18:19:08.892646 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 17 18:19:08.892878 systemd[1]: Finished disk-uuid.service. Mar 17 18:19:08.894000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:08.894000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:08.914865 systemd[1]: Starting verity-setup.service... 
Mar 17 18:19:08.942368 kernel: device-mapper: verity: sha256 using implementation "sha256-ce" Mar 17 18:19:09.027099 systemd[1]: Found device dev-mapper-usr.device. Mar 17 18:19:09.031561 systemd[1]: Mounting sysusr-usr.mount... Mar 17 18:19:09.035815 systemd[1]: Finished verity-setup.service. Mar 17 18:19:09.037000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.130378 kernel: EXT4-fs (dm-0): mounted filesystem without journal. Opts: norecovery. Quota mode: none. Mar 17 18:19:09.131094 systemd[1]: Mounted sysusr-usr.mount. Mar 17 18:19:09.133806 systemd[1]: afterburn-network-kargs.service was skipped because no trigger condition checks were met. Mar 17 18:19:09.137123 systemd[1]: Starting ignition-setup.service... Mar 17 18:19:09.144844 systemd[1]: Starting parse-ip-for-networkd.service... Mar 17 18:19:09.177369 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 17 18:19:09.177438 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 17 18:19:09.180405 kernel: BTRFS info (device nvme0n1p6): has skinny extents Mar 17 18:19:09.194368 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 17 18:19:09.211982 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 17 18:19:09.226383 systemd[1]: Finished ignition-setup.service. Mar 17 18:19:09.227000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.230787 systemd[1]: Starting ignition-fetch-offline.service... Mar 17 18:19:09.294029 systemd[1]: Finished parse-ip-for-networkd.service. Mar 17 18:19:09.296000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.297000 audit: BPF prog-id=9 op=LOAD Mar 17 18:19:09.299882 systemd[1]: Starting systemd-networkd.service... Mar 17 18:19:09.348185 systemd-networkd[1176]: lo: Link UP Mar 17 18:19:09.348659 systemd-networkd[1176]: lo: Gained carrier Mar 17 18:19:09.350754 systemd-networkd[1176]: Enumeration completed Mar 17 18:19:09.358000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.351186 systemd[1]: Started systemd-networkd.service. Mar 17 18:19:09.357944 systemd-networkd[1176]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 18:19:09.360378 systemd[1]: Reached target network.target. Mar 17 18:19:09.366838 systemd-networkd[1176]: eth0: Link UP Mar 17 18:19:09.366846 systemd-networkd[1176]: eth0: Gained carrier Mar 17 18:19:09.383089 systemd[1]: Starting iscsiuio.service... Mar 17 18:19:09.395279 systemd[1]: Started iscsiuio.service. Mar 17 18:19:09.395000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.398917 systemd-networkd[1176]: eth0: DHCPv4 address 172.31.21.220/20, gateway 172.31.16.1 acquired from 172.31.16.1 Mar 17 18:19:09.402430 systemd[1]: Starting iscsid.service... 
Mar 17 18:19:09.410850 iscsid[1181]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Mar 17 18:19:09.410850 iscsid[1181]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Mar 17 18:19:09.410850 iscsid[1181]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Mar 17 18:19:09.410850 iscsid[1181]: If using hardware iscsi like qla4xxx this message can be ignored. Mar 17 18:19:09.410850 iscsid[1181]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Mar 17 18:19:09.429620 iscsid[1181]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Mar 17 18:19:09.423009 systemd[1]: Started iscsid.service. Mar 17 18:19:09.433000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.436028 systemd[1]: Starting dracut-initqueue.service... Mar 17 18:19:09.457000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.456682 systemd[1]: Finished dracut-initqueue.service. Mar 17 18:19:09.458401 systemd[1]: Reached target remote-fs-pre.target. Mar 17 18:19:09.459926 systemd[1]: Reached target remote-cryptsetup.target. Mar 17 18:19:09.463492 systemd[1]: Reached target remote-fs.target. Mar 17 18:19:09.466276 systemd[1]: Starting dracut-pre-mount.service... Mar 17 18:19:09.495000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.495462 systemd[1]: Finished dracut-pre-mount.service. Mar 17 18:19:09.603994 ignition[1122]: Ignition 2.14.0 Mar 17 18:19:09.604023 ignition[1122]: Stage: fetch-offline Mar 17 18:19:09.604374 ignition[1122]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:19:09.604437 ignition[1122]: parsing config with SHA512: 6629d8e825d60c9c9d4629d8547ef9a0b839d6b01b7f61a481a1f23308c924b8b0bbf10cae7f7fe3bcaf88b23d1a81baa7771c3670728d4d2a1e665216a1de7b Mar 17 18:19:09.627382 ignition[1122]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 18:19:09.629552 ignition[1122]: Ignition finished successfully Mar 17 18:19:09.632959 systemd[1]: Finished ignition-fetch-offline.service. Mar 17 18:19:09.633000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.636105 systemd[1]: Starting ignition-fetch.service...
Mar 17 18:19:09.652919 ignition[1200]: Ignition 2.14.0 Mar 17 18:19:09.654547 ignition[1200]: Stage: fetch Mar 17 18:19:09.655983 ignition[1200]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:19:09.658123 ignition[1200]: parsing config with SHA512: 6629d8e825d60c9c9d4629d8547ef9a0b839d6b01b7f61a481a1f23308c924b8b0bbf10cae7f7fe3bcaf88b23d1a81baa7771c3670728d4d2a1e665216a1de7b Mar 17 18:19:09.668822 ignition[1200]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 18:19:09.671999 ignition[1200]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 18:19:09.678020 ignition[1200]: INFO : PUT result: OK Mar 17 18:19:09.681084 ignition[1200]: DEBUG : parsed url from cmdline: "" Mar 17 18:19:09.682824 ignition[1200]: INFO : no config URL provided Mar 17 18:19:09.684404 ignition[1200]: INFO : reading system config file "/usr/lib/ignition/user.ign" Mar 17 18:19:09.684404 ignition[1200]: INFO : no config at "/usr/lib/ignition/user.ign" Mar 17 18:19:09.688438 ignition[1200]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 18:19:09.691296 ignition[1200]: INFO : PUT result: OK Mar 17 18:19:09.692909 ignition[1200]: INFO : GET http://169.254.169.254/2019-10-01/user-data: attempt #1 Mar 17 18:19:09.695840 ignition[1200]: INFO : GET result: OK Mar 17 18:19:09.697393 ignition[1200]: DEBUG : parsing config with SHA512: f261bbddd16fbd9e783f2683f9ea27c7527e92adfbdcb3e0f958c024cbcab7af7a25b0b5a45308cbbe59230c862bc6f8db4bdc78dfb2a10d6ecf39093c21e3f0 Mar 17 18:19:09.710726 unknown[1200]: fetched base config from "system" Mar 17 18:19:09.712536 unknown[1200]: fetched base config from "system" Mar 17 18:19:09.714166 unknown[1200]: fetched user config from "aws" Mar 17 18:19:09.716866 ignition[1200]: fetch: fetch complete Mar 17 18:19:09.717978 ignition[1200]: fetch: fetch passed Mar 17 18:19:09.718069 ignition[1200]: Ignition finished successfully Mar 17 18:19:09.723456 systemd[1]: Finished ignition-fetch.service. Mar 17 18:19:09.726739 systemd[1]: Starting ignition-kargs.service... Mar 17 18:19:09.723000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.731482 kernel: kauditd_printk_skb: 17 callbacks suppressed Mar 17 18:19:09.731521 kernel: audit: type=1130 audit(1742235549.723:28): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.753306 ignition[1206]: Ignition 2.14.0 Mar 17 18:19:09.753353 ignition[1206]: Stage: kargs Mar 17 18:19:09.753655 ignition[1206]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:19:09.753713 ignition[1206]: parsing config with SHA512: 6629d8e825d60c9c9d4629d8547ef9a0b839d6b01b7f61a481a1f23308c924b8b0bbf10cae7f7fe3bcaf88b23d1a81baa7771c3670728d4d2a1e665216a1de7b Mar 17 18:19:09.767845 ignition[1206]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 18:19:09.770021 ignition[1206]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 18:19:09.773098 ignition[1206]: INFO : PUT result: OK Mar 17 18:19:09.778057 ignition[1206]: kargs: kargs passed Mar 17 18:19:09.778164 ignition[1206]: Ignition finished successfully Mar 17 18:19:09.782204 systemd[1]: Finished ignition-kargs.service. Mar 17 18:19:09.785405 systemd[1]: Starting ignition-disks.service... 
Mar 17 18:19:09.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.797361 kernel: audit: type=1130 audit(1742235549.782:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.801595 ignition[1212]: Ignition 2.14.0 Mar 17 18:19:09.801645 ignition[1212]: Stage: disks Mar 17 18:19:09.802156 ignition[1212]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:19:09.803146 ignition[1212]: parsing config with SHA512: 6629d8e825d60c9c9d4629d8547ef9a0b839d6b01b7f61a481a1f23308c924b8b0bbf10cae7f7fe3bcaf88b23d1a81baa7771c3670728d4d2a1e665216a1de7b Mar 17 18:19:09.818814 ignition[1212]: no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 18:19:09.820967 ignition[1212]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 18:19:09.823907 ignition[1212]: INFO : PUT result: OK Mar 17 18:19:09.828785 ignition[1212]: disks: disks passed Mar 17 18:19:09.828928 ignition[1212]: Ignition finished successfully Mar 17 18:19:09.833137 systemd[1]: Finished ignition-disks.service. Mar 17 18:19:09.845673 kernel: audit: type=1130 audit(1742235549.833:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.833000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.834972 systemd[1]: Reached target initrd-root-device.target. Mar 17 18:19:09.842913 systemd[1]: Reached target local-fs-pre.target. Mar 17 18:19:09.845729 systemd[1]: Reached target local-fs.target. Mar 17 18:19:09.847281 systemd[1]: Reached target sysinit.target. Mar 17 18:19:09.850097 systemd[1]: Reached target basic.target. Mar 17 18:19:09.852989 systemd[1]: Starting systemd-fsck-root.service... Mar 17 18:19:09.899828 systemd-fsck[1220]: ROOT: clean, 623/553520 files, 56021/553472 blocks Mar 17 18:19:09.905486 systemd[1]: Finished systemd-fsck-root.service. Mar 17 18:19:09.906000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.909812 systemd[1]: Mounting sysroot.mount... Mar 17 18:19:09.917006 kernel: audit: type=1130 audit(1742235549.906:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:09.937364 kernel: EXT4-fs (nvme0n1p9): mounted filesystem with ordered data mode. Opts: (null). Quota mode: none. Mar 17 18:19:09.938104 systemd[1]: Mounted sysroot.mount. Mar 17 18:19:09.940718 systemd[1]: Reached target initrd-root-fs.target. Mar 17 18:19:09.951952 systemd[1]: Mounting sysroot-usr.mount... Mar 17 18:19:09.955301 systemd[1]: flatcar-metadata-hostname.service was skipped because no trigger condition checks were met. Mar 17 18:19:09.955439 systemd[1]: ignition-remount-sysroot.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). 
Mar 17 18:19:09.957967 systemd[1]: Reached target ignition-diskful.target. Mar 17 18:19:09.974404 systemd[1]: Mounted sysroot-usr.mount. Mar 17 18:19:09.978625 systemd[1]: Starting initrd-setup-root.service... Mar 17 18:19:09.998245 initrd-setup-root[1241]: cut: /sysroot/etc/passwd: No such file or directory Mar 17 18:19:10.028372 initrd-setup-root[1249]: cut: /sysroot/etc/group: No such file or directory Mar 17 18:19:10.037395 initrd-setup-root[1257]: cut: /sysroot/etc/shadow: No such file or directory Mar 17 18:19:10.046278 initrd-setup-root[1265]: cut: /sysroot/etc/gshadow: No such file or directory Mar 17 18:19:10.065182 systemd[1]: Mounting sysroot-usr-share-oem.mount... Mar 17 18:19:10.089378 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/nvme0n1p6 scanned by mount (1273) Mar 17 18:19:10.095094 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 17 18:19:10.095214 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 17 18:19:10.097439 kernel: BTRFS info (device nvme0n1p6): has skinny extents Mar 17 18:19:10.105354 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 17 18:19:10.115814 systemd[1]: Mounted sysroot-usr-share-oem.mount. Mar 17 18:19:10.233204 systemd[1]: Finished initrd-setup-root.service. Mar 17 18:19:10.233000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:10.237454 systemd[1]: Starting ignition-mount.service... Mar 17 18:19:10.252424 kernel: audit: type=1130 audit(1742235550.233:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:10.248511 systemd[1]: Starting sysroot-boot.service... Mar 17 18:19:10.260216 systemd[1]: sysusr-usr-share-oem.mount: Deactivated successfully. Mar 17 18:19:10.262474 systemd[1]: sysroot-usr-share-oem.mount: Deactivated successfully. Mar 17 18:19:10.284121 ignition[1302]: INFO : Ignition 2.14.0 Mar 17 18:19:10.284121 ignition[1302]: INFO : Stage: mount Mar 17 18:19:10.287279 ignition[1302]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:19:10.287279 ignition[1302]: DEBUG : parsing config with SHA512: 6629d8e825d60c9c9d4629d8547ef9a0b839d6b01b7f61a481a1f23308c924b8b0bbf10cae7f7fe3bcaf88b23d1a81baa7771c3670728d4d2a1e665216a1de7b Mar 17 18:19:10.315230 systemd[1]: Finished sysroot-boot.service. Mar 17 18:19:10.317000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:10.323474 ignition[1302]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 18:19:10.323474 ignition[1302]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 18:19:10.329188 kernel: audit: type=1130 audit(1742235550.317:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:10.329779 ignition[1302]: INFO : PUT result: OK Mar 17 18:19:10.334592 ignition[1302]: INFO : mount: mount passed Mar 17 18:19:10.336150 ignition[1302]: INFO : Ignition finished successfully Mar 17 18:19:10.338414 systemd[1]: Finished ignition-mount.service. 
Mar 17 18:19:10.349941 kernel: audit: type=1130 audit(1742235550.338:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:10.338000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:10.341493 systemd[1]: Starting ignition-files.service... Mar 17 18:19:10.359469 systemd[1]: Mounting sysroot-usr-share-oem.mount... Mar 17 18:19:10.382379 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/nvme0n1p6 scanned by mount (1312) Mar 17 18:19:10.388136 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-generic) checksum algorithm Mar 17 18:19:10.388198 kernel: BTRFS info (device nvme0n1p6): using free space tree Mar 17 18:19:10.388223 kernel: BTRFS info (device nvme0n1p6): has skinny extents Mar 17 18:19:10.403369 kernel: BTRFS info (device nvme0n1p6): enabling ssd optimizations Mar 17 18:19:10.409947 systemd[1]: Mounted sysroot-usr-share-oem.mount. Mar 17 18:19:10.428801 ignition[1331]: INFO : Ignition 2.14.0 Mar 17 18:19:10.428801 ignition[1331]: INFO : Stage: files Mar 17 18:19:10.431902 ignition[1331]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:19:10.431902 ignition[1331]: DEBUG : parsing config with SHA512: 6629d8e825d60c9c9d4629d8547ef9a0b839d6b01b7f61a481a1f23308c924b8b0bbf10cae7f7fe3bcaf88b23d1a81baa7771c3670728d4d2a1e665216a1de7b Mar 17 18:19:10.448528 ignition[1331]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 18:19:10.450883 ignition[1331]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 18:19:10.454075 ignition[1331]: INFO : PUT result: OK Mar 17 18:19:10.458780 ignition[1331]: DEBUG : files: compiled without relabeling support, skipping Mar 17 18:19:10.463455 ignition[1331]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 17 18:19:10.463455 ignition[1331]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 17 18:19:10.503558 ignition[1331]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 17 18:19:10.506243 ignition[1331]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 17 18:19:10.510429 unknown[1331]: wrote ssh authorized keys file for user: core Mar 17 18:19:10.512716 ignition[1331]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 17 18:19:10.516100 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1" Mar 17 18:19:10.519370 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1" Mar 17 18:19:10.519370 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 17 18:19:10.526058 ignition[1331]: INFO : GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1 Mar 17 18:19:10.623479 ignition[1331]: INFO : GET result: OK Mar 17 18:19:10.826537 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz" Mar 17 18:19:10.830214 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing 
file "/sysroot/home/core/nfs-pod.yaml" Mar 17 18:19:10.833576 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 18:19:10.836847 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 17 18:19:10.842874 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 17 18:19:10.847488 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/etc/eks/bootstrap.sh" Mar 17 18:19:10.850848 ignition[1331]: INFO : oem config not found in "/usr/share/oem", looking on oem partition Mar 17 18:19:10.861526 ignition[1331]: INFO : op(1): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem449212384" Mar 17 18:19:10.864284 ignition[1331]: CRITICAL : op(1): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem449212384": device or resource busy Mar 17 18:19:10.864284 ignition[1331]: ERROR : failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem449212384", trying btrfs: device or resource busy Mar 17 18:19:10.864284 ignition[1331]: INFO : op(2): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem449212384" Mar 17 18:19:10.864284 ignition[1331]: INFO : op(2): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem449212384" Mar 17 18:19:10.889056 ignition[1331]: INFO : op(3): [started] unmounting "/mnt/oem449212384" Mar 17 18:19:10.891161 ignition[1331]: INFO : op(3): [finished] unmounting "/mnt/oem449212384" Mar 17 18:19:10.895369 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/etc/eks/bootstrap.sh" Mar 17 18:19:10.900150 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 18:19:10.900150 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 18:19:10.900150 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 18:19:10.900150 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 18:19:10.900150 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/home/core/install.sh" Mar 17 18:19:10.900150 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/home/core/install.sh" Mar 17 18:19:10.900150 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 17 18:19:10.900150 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 17 18:19:10.933583 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(c): [started] writing file "/sysroot/etc/systemd/system/nvidia.service" Mar 17 18:19:10.933583 ignition[1331]: INFO : oem config not found in "/usr/share/oem", looking on oem partition Mar 17 18:19:10.933583 ignition[1331]: INFO : op(4): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem146164068" Mar 17 
18:19:10.933583 ignition[1331]: CRITICAL : op(4): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem146164068": device or resource busy Mar 17 18:19:10.933583 ignition[1331]: ERROR : failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem146164068", trying btrfs: device or resource busy Mar 17 18:19:10.933583 ignition[1331]: INFO : op(5): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem146164068" Mar 17 18:19:10.933583 ignition[1331]: INFO : op(5): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem146164068" Mar 17 18:19:10.933583 ignition[1331]: INFO : op(6): [started] unmounting "/mnt/oem146164068" Mar 17 18:19:10.933583 ignition[1331]: INFO : op(6): [finished] unmounting "/mnt/oem146164068" Mar 17 18:19:10.933583 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(c): [finished] writing file "/sysroot/etc/systemd/system/nvidia.service" Mar 17 18:19:10.933583 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(d): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 17 18:19:10.933583 ignition[1331]: INFO : GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-arm64.raw: attempt #1 Mar 17 18:19:11.280519 systemd-networkd[1176]: eth0: Gained IPv6LL Mar 17 18:19:11.424244 ignition[1331]: INFO : GET result: OK Mar 17 18:19:11.922315 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(d): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-arm64.raw" Mar 17 18:19:11.926607 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(e): [started] writing file "/sysroot/etc/amazon/ssm/amazon-ssm-agent.json" Mar 17 18:19:11.926607 ignition[1331]: INFO : oem config not found in "/usr/share/oem", looking on oem partition Mar 17 18:19:11.944432 ignition[1331]: INFO : op(7): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3395670047" Mar 17 18:19:11.947080 ignition[1331]: CRITICAL : op(7): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3395670047": device or resource busy Mar 17 18:19:11.947080 ignition[1331]: ERROR : failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem3395670047", trying btrfs: device or resource busy Mar 17 18:19:11.947080 ignition[1331]: INFO : op(8): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3395670047" Mar 17 18:19:11.966687 ignition[1331]: INFO : op(8): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3395670047" Mar 17 18:19:11.966687 ignition[1331]: INFO : op(9): [started] unmounting "/mnt/oem3395670047" Mar 17 18:19:11.966687 ignition[1331]: INFO : op(9): [finished] unmounting "/mnt/oem3395670047" Mar 17 18:19:11.966687 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(e): [finished] writing file "/sysroot/etc/amazon/ssm/amazon-ssm-agent.json" Mar 17 18:19:11.966687 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(f): [started] writing file "/sysroot/etc/amazon/ssm/seelog.xml" Mar 17 18:19:11.966687 ignition[1331]: INFO : oem config not found in "/usr/share/oem", looking on oem partition Mar 17 18:19:11.956935 systemd[1]: mnt-oem3395670047.mount: Deactivated successfully. 
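
The op(1)/op(2), op(4)/op(5) and op(7)/op(8) pairs above all show the same pattern: Ignition tries to mount the OEM partition as ext4, falls back to btrfs when that fails, writes the file, then unmounts. Ignition itself is a Go binary doing this through mount syscalls; the Python sketch below only illustrates that retry pattern with the mount(8) command, using the device label seen in the log.

import subprocess
import tempfile

def mount_oem(device: str = "/dev/disk/by-label/OEM") -> str:
    # Try ext4 first, then btrfs, mirroring the op(...) retries in the log.
    mountpoint = tempfile.mkdtemp(prefix="oem")
    for fstype in ("ext4", "btrfs"):
        try:
            subprocess.run(
                ["mount", "-t", fstype, device, mountpoint],
                check=True, capture_output=True,
            )
            return mountpoint  # mounted; caller is responsible for umount
        except subprocess.CalledProcessError as err:
            print(f"{fstype} mount failed: {err.stderr.decode().strip()}")
    raise RuntimeError(f"could not mount {device}")

# Usage (as root):
#   mp = mount_oem()
#   ... copy files ...
#   subprocess.run(["umount", mp], check=True)
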
Mar 17 18:19:12.007448 ignition[1331]: INFO : op(a): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3188599130" Mar 17 18:19:12.007448 ignition[1331]: CRITICAL : op(a): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3188599130": device or resource busy Mar 17 18:19:12.007448 ignition[1331]: ERROR : failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem3188599130", trying btrfs: device or resource busy Mar 17 18:19:12.007448 ignition[1331]: INFO : op(b): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3188599130" Mar 17 18:19:12.007448 ignition[1331]: INFO : op(b): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3188599130" Mar 17 18:19:12.007448 ignition[1331]: INFO : op(c): [started] unmounting "/mnt/oem3188599130" Mar 17 18:19:12.007448 ignition[1331]: INFO : op(c): [finished] unmounting "/mnt/oem3188599130" Mar 17 18:19:12.006053 systemd[1]: mnt-oem3188599130.mount: Deactivated successfully. Mar 17 18:19:12.036419 ignition[1331]: INFO : files: createFilesystemsFiles: createFiles: op(f): [finished] writing file "/sysroot/etc/amazon/ssm/seelog.xml" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(10): [started] processing unit "coreos-metadata-sshkeys@.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(10): [finished] processing unit "coreos-metadata-sshkeys@.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(11): [started] processing unit "amazon-ssm-agent.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(11): op(12): [started] writing unit "amazon-ssm-agent.service" at "/sysroot/etc/systemd/system/amazon-ssm-agent.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(11): op(12): [finished] writing unit "amazon-ssm-agent.service" at "/sysroot/etc/systemd/system/amazon-ssm-agent.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(11): [finished] processing unit "amazon-ssm-agent.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(13): [started] processing unit "nvidia.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(13): [finished] processing unit "nvidia.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(14): [started] processing unit "containerd.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(14): op(15): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(14): op(15): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(14): [finished] processing unit "containerd.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(16): [started] processing unit "prepare-helm.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(16): op(17): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(16): op(17): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(16): [finished] processing unit "prepare-helm.service" Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(18): [started] setting preset to enabled for "coreos-metadata-sshkeys@.service " Mar 17 18:19:12.036419 ignition[1331]: INFO : files: op(18): [finished] setting preset 
to enabled for "coreos-metadata-sshkeys@.service " Mar 17 18:19:12.102157 ignition[1331]: INFO : files: op(19): [started] setting preset to enabled for "amazon-ssm-agent.service" Mar 17 18:19:12.102157 ignition[1331]: INFO : files: op(19): [finished] setting preset to enabled for "amazon-ssm-agent.service" Mar 17 18:19:12.102157 ignition[1331]: INFO : files: op(1a): [started] setting preset to enabled for "nvidia.service" Mar 17 18:19:12.102157 ignition[1331]: INFO : files: op(1a): [finished] setting preset to enabled for "nvidia.service" Mar 17 18:19:12.102157 ignition[1331]: INFO : files: op(1b): [started] setting preset to enabled for "prepare-helm.service" Mar 17 18:19:12.102157 ignition[1331]: INFO : files: op(1b): [finished] setting preset to enabled for "prepare-helm.service" Mar 17 18:19:12.102157 ignition[1331]: INFO : files: createResultFile: createFiles: op(1c): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 17 18:19:12.102157 ignition[1331]: INFO : files: createResultFile: createFiles: op(1c): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 17 18:19:12.102157 ignition[1331]: INFO : files: files passed Mar 17 18:19:12.102157 ignition[1331]: INFO : Ignition finished successfully Mar 17 18:19:12.128795 systemd[1]: Finished ignition-files.service. Mar 17 18:19:12.127000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.148167 kernel: audit: type=1130 audit(1742235552.127:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.146016 systemd[1]: Starting initrd-setup-root-after-ignition.service... Mar 17 18:19:12.147854 systemd[1]: torcx-profile-populate.service was skipped because of an unmet condition check (ConditionPathExists=/sysroot/etc/torcx/next-profile). Mar 17 18:19:12.155645 systemd[1]: Starting ignition-quench.service... Mar 17 18:19:12.163262 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 17 18:19:12.164144 systemd[1]: Finished ignition-quench.service. Mar 17 18:19:12.165000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.165000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.181565 kernel: audit: type=1130 audit(1742235552.165:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.181635 kernel: audit: type=1131 audit(1742235552.165:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.184862 initrd-setup-root-after-ignition[1356]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 18:19:12.189111 systemd[1]: Finished initrd-setup-root-after-ignition.service. Mar 17 18:19:12.193143 systemd[1]: Reached target ignition-complete.target. 
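
The files stage ends by writing its outcome to /sysroot/etc/.ignition-result.json (op(1c) above), which appears as /etc/.ignition-result.json once the real root is running. A tiny sketch that loads and pretty-prints that file without assuming any particular schema.

import json

with open("/etc/.ignition-result.json") as f:
    print(json.dumps(json.load(f), indent=2))
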
Mar 17 18:19:12.191000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.197702 systemd[1]: Starting initrd-parse-etc.service... Mar 17 18:19:12.227109 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 17 18:19:12.227507 systemd[1]: Finished initrd-parse-etc.service. Mar 17 18:19:12.229000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.231000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.232470 systemd[1]: Reached target initrd-fs.target. Mar 17 18:19:12.235541 systemd[1]: Reached target initrd.target. Mar 17 18:19:12.238367 systemd[1]: dracut-mount.service was skipped because no trigger condition checks were met. Mar 17 18:19:12.242155 systemd[1]: Starting dracut-pre-pivot.service... Mar 17 18:19:12.266371 systemd[1]: Finished dracut-pre-pivot.service. Mar 17 18:19:12.268000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.270766 systemd[1]: Starting initrd-cleanup.service... Mar 17 18:19:12.291526 systemd[1]: Stopped target nss-lookup.target. Mar 17 18:19:12.294781 systemd[1]: Stopped target remote-cryptsetup.target. Mar 17 18:19:12.298262 systemd[1]: Stopped target timers.target. Mar 17 18:19:12.301233 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 17 18:19:12.303261 systemd[1]: Stopped dracut-pre-pivot.service. Mar 17 18:19:12.305000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.306748 systemd[1]: Stopped target initrd.target. Mar 17 18:19:12.309747 systemd[1]: Stopped target basic.target. Mar 17 18:19:12.312662 systemd[1]: Stopped target ignition-complete.target. Mar 17 18:19:12.316064 systemd[1]: Stopped target ignition-diskful.target. Mar 17 18:19:12.319433 systemd[1]: Stopped target initrd-root-device.target. Mar 17 18:19:12.322865 systemd[1]: Stopped target remote-fs.target. Mar 17 18:19:12.326006 systemd[1]: Stopped target remote-fs-pre.target. Mar 17 18:19:12.329292 systemd[1]: Stopped target sysinit.target. Mar 17 18:19:12.332323 systemd[1]: Stopped target local-fs.target. Mar 17 18:19:12.335436 systemd[1]: Stopped target local-fs-pre.target. Mar 17 18:19:12.338667 systemd[1]: Stopped target swap.target. Mar 17 18:19:12.341421 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 17 18:19:12.343444 systemd[1]: Stopped dracut-pre-mount.service. Mar 17 18:19:12.345000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.346660 systemd[1]: Stopped target cryptsetup.target. Mar 17 18:19:12.349646 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 17 18:19:12.351599 systemd[1]: Stopped dracut-initqueue.service. 
Mar 17 18:19:12.353000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.354814 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 17 18:19:12.357100 systemd[1]: Stopped initrd-setup-root-after-ignition.service. Mar 17 18:19:12.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.360844 systemd[1]: ignition-files.service: Deactivated successfully. Mar 17 18:19:12.362756 systemd[1]: Stopped ignition-files.service. Mar 17 18:19:12.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.367250 systemd[1]: Stopping ignition-mount.service... Mar 17 18:19:12.391110 ignition[1369]: INFO : Ignition 2.14.0 Mar 17 18:19:12.391110 ignition[1369]: INFO : Stage: umount Mar 17 18:19:12.391110 ignition[1369]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:19:12.391110 ignition[1369]: DEBUG : parsing config with SHA512: 6629d8e825d60c9c9d4629d8547ef9a0b839d6b01b7f61a481a1f23308c924b8b0bbf10cae7f7fe3bcaf88b23d1a81baa7771c3670728d4d2a1e665216a1de7b Mar 17 18:19:12.397590 systemd[1]: Stopping iscsiuio.service... Mar 17 18:19:12.413820 ignition[1369]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/aws" Mar 17 18:19:12.413820 ignition[1369]: INFO : PUT http://169.254.169.254/latest/api/token: attempt #1 Mar 17 18:19:12.419621 ignition[1369]: INFO : PUT result: OK Mar 17 18:19:12.432000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.434000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.416023 systemd[1]: Stopping sysroot-boot.service... Mar 17 18:19:12.431413 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 17 18:19:12.431848 systemd[1]: Stopped systemd-udev-trigger.service. Mar 17 18:19:12.446117 ignition[1369]: INFO : umount: umount passed Mar 17 18:19:12.446117 ignition[1369]: INFO : Ignition finished successfully Mar 17 18:19:12.433984 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 17 18:19:12.434311 systemd[1]: Stopped dracut-pre-trigger.service. Mar 17 18:19:12.461111 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 17 18:19:12.464480 systemd[1]: iscsiuio.service: Deactivated successfully. Mar 17 18:19:12.466511 systemd[1]: Stopped iscsiuio.service. Mar 17 18:19:12.468000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.470548 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 17 18:19:12.472638 systemd[1]: Stopped ignition-mount.service. 
Mar 17 18:19:12.474000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.478051 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 17 18:19:12.480139 systemd[1]: Finished initrd-cleanup.service. Mar 17 18:19:12.482000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.482000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.484912 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 17 18:19:12.485030 systemd[1]: Stopped ignition-disks.service. Mar 17 18:19:12.487000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.489710 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 17 18:19:12.489817 systemd[1]: Stopped ignition-kargs.service. Mar 17 18:19:12.491000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.494382 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 17 18:19:12.494472 systemd[1]: Stopped ignition-fetch.service. Mar 17 18:19:12.497000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.499074 systemd[1]: Stopped target network.target. Mar 17 18:19:12.501770 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 17 18:19:12.501874 systemd[1]: Stopped ignition-fetch-offline.service. Mar 17 18:19:12.506000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.507812 systemd[1]: Stopped target paths.target. Mar 17 18:19:12.509257 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 17 18:19:12.513419 systemd[1]: Stopped systemd-ask-password-console.path. Mar 17 18:19:12.516934 systemd[1]: Stopped target slices.target. Mar 17 18:19:12.519691 systemd[1]: Stopped target sockets.target. Mar 17 18:19:12.523846 systemd[1]: iscsid.socket: Deactivated successfully. Mar 17 18:19:12.523905 systemd[1]: Closed iscsid.socket. Mar 17 18:19:12.527830 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 17 18:19:12.527919 systemd[1]: Closed iscsiuio.socket. Mar 17 18:19:12.531994 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 17 18:19:12.532098 systemd[1]: Stopped ignition-setup.service. Mar 17 18:19:12.536000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.537782 systemd[1]: Stopping systemd-networkd.service... Mar 17 18:19:12.543464 systemd[1]: Stopping systemd-resolved.service... 
Mar 17 18:19:12.547391 systemd-networkd[1176]: eth0: DHCPv6 lease lost Mar 17 18:19:12.550443 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 18:19:12.551884 systemd[1]: Stopped systemd-networkd.service. Mar 17 18:19:12.552000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.554000 audit: BPF prog-id=9 op=UNLOAD Mar 17 18:19:12.555712 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 17 18:19:12.555783 systemd[1]: Closed systemd-networkd.socket. Mar 17 18:19:12.560873 systemd[1]: Stopping network-cleanup.service... Mar 17 18:19:12.571648 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 17 18:19:12.572829 systemd[1]: Stopped parse-ip-for-networkd.service. Mar 17 18:19:12.574000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.577169 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 17 18:19:12.577409 systemd[1]: Stopped systemd-sysctl.service. Mar 17 18:19:12.579000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.582199 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 17 18:19:12.582434 systemd[1]: Stopped systemd-modules-load.service. Mar 17 18:19:12.586000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.593095 systemd[1]: Stopping systemd-udevd.service... Mar 17 18:19:12.598373 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 17 18:19:12.601995 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 17 18:19:12.604151 systemd[1]: Stopped systemd-resolved.service. Mar 17 18:19:12.606000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.608000 audit: BPF prog-id=6 op=UNLOAD Mar 17 18:19:12.614670 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 17 18:19:12.616764 systemd[1]: Stopped systemd-udevd.service. Mar 17 18:19:12.618000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.620657 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 17 18:19:12.622576 systemd[1]: Stopped network-cleanup.service. Mar 17 18:19:12.621000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.625940 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 17 18:19:12.626154 systemd[1]: Closed systemd-udevd-control.socket. Mar 17 18:19:12.631042 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 17 18:19:12.631220 systemd[1]: Closed systemd-udevd-kernel.socket. 
Mar 17 18:19:12.635931 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 17 18:19:12.636116 systemd[1]: Stopped dracut-pre-udev.service. Mar 17 18:19:12.638000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.641012 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 17 18:19:12.641199 systemd[1]: Stopped dracut-cmdline.service. Mar 17 18:19:12.643000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.645862 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 18:19:12.646046 systemd[1]: Stopped dracut-cmdline-ask.service. Mar 17 18:19:12.648000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.652297 systemd[1]: Starting initrd-udevadm-cleanup-db.service... Mar 17 18:19:12.664487 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 17 18:19:12.664761 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service. Mar 17 18:19:12.668000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.670312 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 17 18:19:12.670000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.670450 systemd[1]: Stopped kmod-static-nodes.service. Mar 17 18:19:12.672320 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 18:19:12.672000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.672436 systemd[1]: Stopped systemd-vconsole-setup.service. Mar 17 18:19:12.675917 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 17 18:19:12.676843 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 17 18:19:12.677154 systemd[1]: Finished initrd-udevadm-cleanup-db.service. Mar 17 18:19:12.692000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.692000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.724855 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 17 18:19:12.725495 systemd[1]: Stopped sysroot-boot.service. Mar 17 18:19:12.728000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.730534 systemd[1]: Reached target initrd-switch-root.target. 
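
The initrd teardown above is dominated by audit SERVICE_START/SERVICE_STOP records in the same key=value layout. A sketch that pulls the unit name and outcome out of one such record; the sample line is copied from this log, and the regex assumes only the fields visible in it.

import re

AUDIT_RE = re.compile(
    r"audit\[\d+\]: (SERVICE_START|SERVICE_STOP) .*?unit=(\S+).*?res=(\w+)"
)

sample = (
    "Mar 17 18:19:12.474000 audit[1]: SERVICE_STOP pid=1 uid=0 "
    "auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount "
    "comm=\"systemd\" exe=\"/usr/lib/systemd/systemd\" hostname=? addr=? "
    "terminal=? res=success'"
)

m = AUDIT_RE.search(sample)
if m:
    event, unit, result = m.groups()
    print(f"{unit}: {event.lower()} ({result})")  # ignition-mount: service_stop (success)
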
Mar 17 18:19:12.732000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:12.732395 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 17 18:19:12.732489 systemd[1]: Stopped initrd-setup-root.service. Mar 17 18:19:12.735762 systemd[1]: Starting initrd-switch-root.service... Mar 17 18:19:12.764081 systemd[1]: Switching root. Mar 17 18:19:12.771000 audit: BPF prog-id=5 op=UNLOAD Mar 17 18:19:12.772000 audit: BPF prog-id=4 op=UNLOAD Mar 17 18:19:12.772000 audit: BPF prog-id=3 op=UNLOAD Mar 17 18:19:12.772000 audit: BPF prog-id=8 op=UNLOAD Mar 17 18:19:12.772000 audit: BPF prog-id=7 op=UNLOAD Mar 17 18:19:12.800433 iscsid[1181]: iscsid shutting down. Mar 17 18:19:12.801931 systemd-journald[309]: Received SIGTERM from PID 1 (n/a). Mar 17 18:19:12.802014 systemd-journald[309]: Journal stopped Mar 17 18:19:18.817238 kernel: SELinux: Class mctp_socket not defined in policy. Mar 17 18:19:18.817448 kernel: SELinux: Class anon_inode not defined in policy. Mar 17 18:19:18.817485 kernel: SELinux: the above unknown classes and permissions will be allowed Mar 17 18:19:18.817517 kernel: SELinux: policy capability network_peer_controls=1 Mar 17 18:19:18.817548 kernel: SELinux: policy capability open_perms=1 Mar 17 18:19:18.817579 kernel: SELinux: policy capability extended_socket_class=1 Mar 17 18:19:18.817610 kernel: SELinux: policy capability always_check_network=0 Mar 17 18:19:18.817641 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 17 18:19:18.817670 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 17 18:19:18.817707 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 17 18:19:18.817742 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 17 18:19:18.817777 systemd[1]: Successfully loaded SELinux policy in 121.554ms. Mar 17 18:19:18.817839 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 19.457ms. Mar 17 18:19:18.817873 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Mar 17 18:19:18.817906 systemd[1]: Detected virtualization amazon. Mar 17 18:19:18.817938 systemd[1]: Detected architecture arm64. Mar 17 18:19:18.817971 systemd[1]: Detected first boot. Mar 17 18:19:18.818001 systemd[1]: Initializing machine ID from VM UUID. Mar 17 18:19:18.818033 kernel: SELinux: Context system_u:object_r:container_file_t:s0:c1022,c1023 is not valid (left unmapped). Mar 17 18:19:18.818068 systemd[1]: Populated /etc with preset unit settings. Mar 17 18:19:18.818105 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:19:18.818142 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:19:18.818178 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
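
The systemd 252 banner above packs its compile-time options into a single +/- feature string. A short sketch that splits it into enabled and disabled sets, using the flags exactly as logged.

features = (
    "+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS "
    "+OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD "
    "+LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 "
    "+BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT"
)
flags = features.split()
enabled = sorted(f[1:] for f in flags if f.startswith("+"))
disabled = sorted(f[1:] for f in flags if f.startswith("-"))
print("enabled: ", " ".join(enabled))
print("disabled:", " ".join(disabled))
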
Mar 17 18:19:18.818211 systemd[1]: Queued start job for default target multi-user.target. Mar 17 18:19:18.818246 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device. Mar 17 18:19:18.818283 systemd[1]: Created slice system-addon\x2dconfig.slice. Mar 17 18:19:18.818320 systemd[1]: Created slice system-addon\x2drun.slice. Mar 17 18:19:18.819447 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice. Mar 17 18:19:18.819491 systemd[1]: Created slice system-getty.slice. Mar 17 18:19:18.819521 systemd[1]: Created slice system-modprobe.slice. Mar 17 18:19:18.819553 systemd[1]: Created slice system-serial\x2dgetty.slice. Mar 17 18:19:18.819587 systemd[1]: Created slice system-system\x2dcloudinit.slice. Mar 17 18:19:18.819627 systemd[1]: Created slice system-systemd\x2dfsck.slice. Mar 17 18:19:18.819663 systemd[1]: Created slice user.slice. Mar 17 18:19:18.819694 systemd[1]: Started systemd-ask-password-console.path. Mar 17 18:19:18.819724 systemd[1]: Started systemd-ask-password-wall.path. Mar 17 18:19:18.819757 systemd[1]: Set up automount boot.automount. Mar 17 18:19:18.819788 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount. Mar 17 18:19:18.819817 systemd[1]: Reached target integritysetup.target. Mar 17 18:19:18.819848 systemd[1]: Reached target remote-cryptsetup.target. Mar 17 18:19:18.819887 systemd[1]: Reached target remote-fs.target. Mar 17 18:19:18.819922 systemd[1]: Reached target slices.target. Mar 17 18:19:18.819961 systemd[1]: Reached target swap.target. Mar 17 18:19:18.819991 systemd[1]: Reached target torcx.target. Mar 17 18:19:18.820023 systemd[1]: Reached target veritysetup.target. Mar 17 18:19:18.820057 systemd[1]: Listening on systemd-coredump.socket. Mar 17 18:19:18.820088 systemd[1]: Listening on systemd-initctl.socket. Mar 17 18:19:18.820121 systemd[1]: Listening on systemd-journald-audit.socket. Mar 17 18:19:18.820151 kernel: kauditd_printk_skb: 55 callbacks suppressed Mar 17 18:19:18.820182 kernel: audit: type=1400 audit(1742235558.496:86): avc: denied { audit_read } for pid=1 comm="systemd" capability=37 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Mar 17 18:19:18.820216 kernel: audit: type=1335 audit(1742235558.496:87): pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Mar 17 18:19:18.820245 systemd[1]: Listening on systemd-journald-dev-log.socket. Mar 17 18:19:18.820274 systemd[1]: Listening on systemd-journald.socket. Mar 17 18:19:18.820304 systemd[1]: Listening on systemd-networkd.socket. Mar 17 18:19:18.820384 systemd[1]: Listening on systemd-udevd-control.socket. Mar 17 18:19:18.820420 systemd[1]: Listening on systemd-udevd-kernel.socket. Mar 17 18:19:18.820452 systemd[1]: Listening on systemd-userdbd.socket. Mar 17 18:19:18.820482 systemd[1]: Mounting dev-hugepages.mount... Mar 17 18:19:18.820889 systemd[1]: Mounting dev-mqueue.mount... Mar 17 18:19:18.821207 systemd[1]: Mounting media.mount... Mar 17 18:19:18.831636 systemd[1]: Mounting sys-kernel-debug.mount... Mar 17 18:19:18.834859 systemd[1]: Mounting sys-kernel-tracing.mount... Mar 17 18:19:18.834911 systemd[1]: Mounting tmp.mount... Mar 17 18:19:18.834942 systemd[1]: Starting flatcar-tmpfiles.service... Mar 17 18:19:18.834975 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:19:18.835006 systemd[1]: Starting kmod-static-nodes.service... 
Mar 17 18:19:18.835038 systemd[1]: Starting modprobe@configfs.service... Mar 17 18:19:18.835075 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:19:18.835104 systemd[1]: Starting modprobe@drm.service... Mar 17 18:19:18.835133 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:19:18.835164 systemd[1]: Starting modprobe@fuse.service... Mar 17 18:19:18.835195 systemd[1]: Starting modprobe@loop.service... Mar 17 18:19:18.835226 systemd[1]: setup-nsswitch.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 17 18:19:18.835258 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling. Mar 17 18:19:18.835288 systemd[1]: (This warning is only shown for the first unit using IP firewalling.) Mar 17 18:19:18.835324 systemd[1]: Starting systemd-journald.service... Mar 17 18:19:18.835384 systemd[1]: Starting systemd-modules-load.service... Mar 17 18:19:18.835417 systemd[1]: Starting systemd-network-generator.service... Mar 17 18:19:18.835451 systemd[1]: Starting systemd-remount-fs.service... Mar 17 18:19:18.835481 systemd[1]: Starting systemd-udev-trigger.service... Mar 17 18:19:18.835512 systemd[1]: Mounted dev-hugepages.mount. Mar 17 18:19:18.835542 systemd[1]: Mounted dev-mqueue.mount. Mar 17 18:19:18.835573 systemd[1]: Mounted media.mount. Mar 17 18:19:18.835604 systemd[1]: Mounted sys-kernel-debug.mount. Mar 17 18:19:18.835633 systemd[1]: Mounted sys-kernel-tracing.mount. Mar 17 18:19:18.835667 systemd[1]: Mounted tmp.mount. Mar 17 18:19:18.835697 systemd[1]: Finished kmod-static-nodes.service. Mar 17 18:19:18.835729 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 17 18:19:18.850381 kernel: audit: type=1130 audit(1742235558.755:88): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.850446 systemd[1]: Finished modprobe@configfs.service. Mar 17 18:19:18.850481 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:19:18.850608 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:19:18.850655 kernel: audit: type=1130 audit(1742235558.770:89): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.850696 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 18:19:18.850728 systemd[1]: Finished modprobe@drm.service. Mar 17 18:19:18.850759 kernel: audit: type=1131 audit(1742235558.770:90): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.850790 kernel: fuse: init (API version 7.34) Mar 17 18:19:18.850822 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:19:18.850852 kernel: audit: type=1130 audit(1742235558.788:91): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.850881 systemd[1]: Finished modprobe@efi_pstore.service. 
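
Once systemd-journald is running (it is started a few entries below), boot messages like the ones in this log can be pulled back out of the journal programmatically. A sketch using journalctl's -b, -u and -o json flags; the unit queried here is just an example.

import json
import subprocess

def boot_messages(unit: str) -> list:
    # journalctl -o json prints one JSON object per line.
    out = subprocess.run(
        ["journalctl", "-b", "-u", unit, "-o", "json", "--no-pager"],
        check=True, capture_output=True, text=True,
    ).stdout
    return [json.loads(line) for line in out.splitlines() if line]

for entry in boot_messages("systemd-journald.service"):
    print(entry.get("MESSAGE"))
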
Mar 17 18:19:18.850912 kernel: audit: type=1131 audit(1742235558.788:92): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.850947 systemd[1]: Finished systemd-modules-load.service. Mar 17 18:19:18.862320 kernel: loop: module loaded Mar 17 18:19:18.862427 systemd[1]: Finished systemd-network-generator.service. Mar 17 18:19:18.862458 kernel: audit: type=1305 audit(1742235558.796:93): op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Mar 17 18:19:18.862488 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 17 18:19:18.862529 kernel: audit: type=1300 audit(1742235558.796:93): arch=c00000b7 syscall=211 success=yes exit=60 a0=6 a1=ffffd97c3330 a2=4000 a3=1 items=0 ppid=1 pid=1518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:18.862561 systemd[1]: Finished modprobe@fuse.service. Mar 17 18:19:18.862595 systemd-journald[1518]: Journal started Mar 17 18:19:18.862706 systemd-journald[1518]: Runtime Journal (/run/log/journal/ec2a7866800b2618146ee04a7ee39c76) is 8.0M, max 75.4M, 67.4M free. Mar 17 18:19:18.496000 audit[1]: AVC avc: denied { audit_read } for pid=1 comm="systemd" capability=37 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Mar 17 18:19:18.496000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Mar 17 18:19:18.755000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.770000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.770000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.788000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.788000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:19:18.796000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Mar 17 18:19:18.796000 audit[1518]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=6 a1=ffffd97c3330 a2=4000 a3=1 items=0 ppid=1 pid=1518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:18.880462 kernel: audit: type=1327 audit(1742235558.796:93): proctitle="/usr/lib/systemd/systemd-journald" Mar 17 18:19:18.796000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Mar 17 18:19:18.803000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.803000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.820000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.820000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.834000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.845000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.879000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.879000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.881366 systemd[1]: Started systemd-journald.service. Mar 17 18:19:18.883000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.886794 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:19:18.887220 systemd[1]: Finished modprobe@loop.service. Mar 17 18:19:18.888000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.888000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Mar 17 18:19:18.890275 systemd[1]: Finished systemd-remount-fs.service. Mar 17 18:19:18.891000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.893481 systemd[1]: Reached target network-pre.target. Mar 17 18:19:18.897622 systemd[1]: Mounting sys-fs-fuse-connections.mount... Mar 17 18:19:18.903972 systemd[1]: Mounting sys-kernel-config.mount... Mar 17 18:19:18.906833 systemd[1]: remount-root.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 17 18:19:18.919672 systemd[1]: Starting systemd-hwdb-update.service... Mar 17 18:19:18.923848 systemd[1]: Starting systemd-journal-flush.service... Mar 17 18:19:18.932537 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:19:18.935243 systemd[1]: Starting systemd-random-seed.service... Mar 17 18:19:18.937629 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:19:18.941461 systemd[1]: Starting systemd-sysctl.service... Mar 17 18:19:18.946127 systemd[1]: Mounted sys-fs-fuse-connections.mount. Mar 17 18:19:18.951759 systemd[1]: Mounted sys-kernel-config.mount. Mar 17 18:19:18.964914 systemd-journald[1518]: Time spent on flushing to /var/log/journal/ec2a7866800b2618146ee04a7ee39c76 is 38.850ms for 1072 entries. Mar 17 18:19:18.964914 systemd-journald[1518]: System Journal (/var/log/journal/ec2a7866800b2618146ee04a7ee39c76) is 8.0M, max 195.6M, 187.6M free. Mar 17 18:19:19.025534 systemd-journald[1518]: Received client request to flush runtime journal. Mar 17 18:19:18.985000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:19.023000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:19.027000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:19.029000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:18.984625 systemd[1]: Finished systemd-random-seed.service. Mar 17 18:19:18.986935 systemd[1]: Reached target first-boot-complete.target. Mar 17 18:19:19.023086 systemd[1]: Finished systemd-sysctl.service. Mar 17 18:19:19.026616 systemd[1]: Finished flatcar-tmpfiles.service. Mar 17 18:19:19.029311 systemd[1]: Finished systemd-journal-flush.service. Mar 17 18:19:19.033872 systemd[1]: Starting systemd-sysusers.service... Mar 17 18:19:19.141750 systemd[1]: Finished systemd-udev-trigger.service. Mar 17 18:19:19.142000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:19:19.146003 systemd[1]: Starting systemd-udev-settle.service... Mar 17 18:19:19.168808 udevadm[1573]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Mar 17 18:19:19.254089 systemd[1]: Finished systemd-sysusers.service. Mar 17 18:19:19.254000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:19.258217 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Mar 17 18:19:19.403685 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Mar 17 18:19:19.404000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:19.826702 systemd[1]: Finished systemd-hwdb-update.service. Mar 17 18:19:19.827000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:19.830990 systemd[1]: Starting systemd-udevd.service... Mar 17 18:19:19.872472 systemd-udevd[1579]: Using default interface naming scheme 'v252'. Mar 17 18:19:19.934857 systemd[1]: Started systemd-udevd.service. Mar 17 18:19:19.935000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:19.939601 systemd[1]: Starting systemd-networkd.service... Mar 17 18:19:19.949274 systemd[1]: Starting systemd-userdbd.service... Mar 17 18:19:20.030299 systemd[1]: Found device dev-ttyS0.device. Mar 17 18:19:20.033409 (udev-worker)[1580]: Network interface NamePolicy= disabled on kernel command line. Mar 17 18:19:20.052600 systemd[1]: Started systemd-userdbd.service. Mar 17 18:19:20.053000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.251240 systemd-networkd[1583]: lo: Link UP Mar 17 18:19:20.251262 systemd-networkd[1583]: lo: Gained carrier Mar 17 18:19:20.252200 systemd-networkd[1583]: Enumeration completed Mar 17 18:19:20.252432 systemd[1]: Started systemd-networkd.service. Mar 17 18:19:20.253000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.256769 systemd[1]: Starting systemd-networkd-wait-online.service... Mar 17 18:19:20.260511 systemd-networkd[1583]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 18:19:20.275084 systemd-networkd[1583]: eth0: Link UP Mar 17 18:19:20.275361 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 18:19:20.275462 systemd-networkd[1583]: eth0: Gained carrier Mar 17 18:19:20.290679 systemd-networkd[1583]: eth0: DHCPv4 address 172.31.21.220/20, gateway 172.31.16.1 acquired from 172.31.16.1 Mar 17 18:19:20.433096 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. 
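
The DHCPv4 lease above (172.31.21.220/20 with gateway 172.31.16.1) can be sanity-checked with the standard library; a short sketch using the values exactly as logged.

import ipaddress

iface = ipaddress.ip_interface("172.31.21.220/20")
gateway = ipaddress.ip_address("172.31.16.1")
print(iface.network)                 # 172.31.16.0/20
print(gateway in iface.network)      # True: the gateway is on-link
print(iface.network.num_addresses)   # 4096 addresses in the /20
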
Mar 17 18:19:20.435765 systemd[1]: Finished systemd-udev-settle.service. Mar 17 18:19:20.436000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-settle comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.448884 systemd[1]: Starting lvm2-activation-early.service... Mar 17 18:19:20.514003 lvm[1699]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 18:19:20.553000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.553201 systemd[1]: Finished lvm2-activation-early.service. Mar 17 18:19:20.555121 systemd[1]: Reached target cryptsetup.target. Mar 17 18:19:20.559453 systemd[1]: Starting lvm2-activation.service... Mar 17 18:19:20.568129 lvm[1701]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 18:19:20.605128 systemd[1]: Finished lvm2-activation.service. Mar 17 18:19:20.605000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.606997 systemd[1]: Reached target local-fs-pre.target. Mar 17 18:19:20.608665 systemd[1]: var-lib-machines.mount was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 17 18:19:20.608717 systemd[1]: Reached target local-fs.target. Mar 17 18:19:20.610288 systemd[1]: Reached target machines.target. Mar 17 18:19:20.614626 systemd[1]: Starting ldconfig.service... Mar 17 18:19:20.618469 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:19:20.618630 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:19:20.620999 systemd[1]: Starting systemd-boot-update.service... Mar 17 18:19:20.625111 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service... Mar 17 18:19:20.631151 systemd[1]: Starting systemd-machine-id-commit.service... Mar 17 18:19:20.639015 systemd[1]: Starting systemd-sysext.service... Mar 17 18:19:20.644220 systemd[1]: boot.automount: Got automount request for /boot, triggered by 1704 (bootctl) Mar 17 18:19:20.647035 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service... Mar 17 18:19:20.671328 systemd[1]: Unmounting usr-share-oem.mount... Mar 17 18:19:20.682829 systemd[1]: usr-share-oem.mount: Deactivated successfully. Mar 17 18:19:20.683421 systemd[1]: Unmounted usr-share-oem.mount. Mar 17 18:19:20.715377 kernel: loop0: detected capacity change from 0 to 194096 Mar 17 18:19:20.732962 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service. Mar 17 18:19:20.731000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.800320 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 17 18:19:20.801611 systemd[1]: Finished systemd-machine-id-commit.service. 
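The fsck instance names above, such as systemd-fsck@dev-disk-by\x2dlabel-OEM.service, embed a systemd-escaped device path. The escaping can be reproduced with systemd-escape, for example:

    # turn a device path into the template instance name used in the log
    systemd-escape -p /dev/disk/by-label/OEM
    # prints: dev-disk-by\x2dlabel-OEM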
Mar 17 18:19:20.800000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.836594 systemd-fsck[1716]: fsck.fat 4.2 (2021-01-31) Mar 17 18:19:20.836594 systemd-fsck[1716]: /dev/nvme0n1p1: 236 files, 117179/258078 clusters Mar 17 18:19:20.843000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.842555 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service. Mar 17 18:19:20.848216 systemd[1]: Mounting boot.mount... Mar 17 18:19:20.858955 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 17 18:19:20.879654 systemd[1]: Mounted boot.mount. Mar 17 18:19:20.896372 kernel: loop1: detected capacity change from 0 to 194096 Mar 17 18:19:20.905911 systemd[1]: Finished systemd-boot-update.service. Mar 17 18:19:20.906000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-boot-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.917562 (sd-sysext)[1730]: Using extensions 'kubernetes'. Mar 17 18:19:20.918383 (sd-sysext)[1730]: Merged extensions into '/usr'. Mar 17 18:19:20.965717 systemd[1]: Mounting usr-share-oem.mount... Mar 17 18:19:20.967525 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:19:20.970253 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:19:20.974400 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:19:20.979493 systemd[1]: Starting modprobe@loop.service... Mar 17 18:19:20.982803 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:19:20.983133 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:19:20.996140 systemd[1]: Mounted usr-share-oem.mount. Mar 17 18:19:20.998699 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:19:20.999212 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:19:20.999000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:20.999000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.001880 systemd[1]: Finished systemd-sysext.service. Mar 17 18:19:21.002000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.011503 systemd[1]: Starting ensure-sysext.service... Mar 17 18:19:21.013528 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
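The (sd-sysext) lines above record a system extension image named 'kubernetes' being overlaid onto /usr. As a hedged example (standard systemd-sysext locations, not confirmed by this log), the merge can be inspected after boot with:

    # show which extension images are merged and onto which hierarchies
    systemd-sysext status
    # extension images are searched for in /etc/extensions, /run/extensions
    # and /var/lib/extensions, among other standard directories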
Mar 17 18:19:21.018183 systemd[1]: Starting systemd-tmpfiles-setup.service... Mar 17 18:19:21.027779 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:19:21.028187 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:19:21.029000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.029000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.034000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.034000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.031770 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:19:21.033777 systemd[1]: Finished modprobe@loop.service. Mar 17 18:19:21.035831 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:19:21.053464 systemd[1]: Reloading. Mar 17 18:19:21.067997 systemd-tmpfiles[1750]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Mar 17 18:19:21.078445 systemd-tmpfiles[1750]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 17 18:19:21.092228 systemd-tmpfiles[1750]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 17 18:19:21.163495 /usr/lib/systemd/system-generators/torcx-generator[1773]: time="2025-03-17T18:19:21Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 18:19:21.163557 /usr/lib/systemd/system-generators/torcx-generator[1773]: time="2025-03-17T18:19:21Z" level=info msg="torcx already run" Mar 17 18:19:21.409168 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:19:21.409719 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:19:21.459831 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:19:21.619959 systemd[1]: Finished systemd-tmpfiles-setup.service. Mar 17 18:19:21.621000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.630381 systemd[1]: Starting audit-rules.service... Mar 17 18:19:21.635457 systemd[1]: Starting clean-ca-certificates.service... Mar 17 18:19:21.641299 systemd[1]: Starting systemd-journal-catalog-update.service... 
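The reload above also warns that locksmithd.service still uses the deprecated CPUShares= and MemoryLimit= directives. A hypothetical fix (not performed during this boot, and with illustrative values since the originals are not shown) is a drop-in that moves to the cgroup-v2 style settings:

    # hypothetical /etc/systemd/system/locksmithd.service.d/override.conf
    [Service]
    CPUWeight=100
    MemoryMax=128M

    # then: systemctl daemon-reload && systemctl restart locksmithd.service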
Mar 17 18:19:21.647246 systemd[1]: Starting systemd-resolved.service... Mar 17 18:19:21.653922 systemd[1]: Starting systemd-timesyncd.service... Mar 17 18:19:21.658794 systemd[1]: Starting systemd-update-utmp.service... Mar 17 18:19:21.680000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.679081 systemd[1]: Finished clean-ca-certificates.service. Mar 17 18:19:21.694476 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:19:21.697147 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:19:21.701529 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:19:21.706461 systemd[1]: Starting modprobe@loop.service... Mar 17 18:19:21.708182 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:19:21.708523 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:19:21.708824 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 18:19:21.711023 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:19:21.716778 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:19:21.719000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.720000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.727000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.730000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.733000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.733000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.722557 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:19:21.724535 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:19:21.732273 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:19:21.732708 systemd[1]: Finished modprobe@loop.service. Mar 17 18:19:21.735581 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). 
Mar 17 18:19:21.735787 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:19:21.742279 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:19:21.747028 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:19:21.751265 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:19:21.759696 systemd[1]: Starting modprobe@loop.service... Mar 17 18:19:21.773000 audit[1839]: SYSTEM_BOOT pid=1839 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.765115 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:19:21.765553 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:19:21.765872 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 18:19:21.780786 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:19:21.789041 systemd[1]: Starting modprobe@drm.service... Mar 17 18:19:21.794000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.799000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.799000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.803000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.806000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.790882 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:19:21.791205 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:19:21.791530 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 18:19:21.793709 systemd[1]: Finished ensure-sysext.service. Mar 17 18:19:21.796818 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:19:21.797183 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:19:21.801576 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:19:21.801956 systemd[1]: Finished modprobe@loop.service. 
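The modprobe@dm_mod, modprobe@efi_pstore, modprobe@loop and modprobe@drm units started and stopped above are instances of the modprobe@.service template, which simply loads the kernel module named by the instance. Simplified, each start amounts to:

    # roughly what modprobe@dm_mod.service does when started
    modprobe dm_mod
    # the unit form, as seen in the log:
    systemctl start modprobe@dm_mod.service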
Mar 17 18:19:21.809656 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:19:21.819000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.819098 systemd[1]: Finished systemd-update-utmp.service. Mar 17 18:19:21.826000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.826000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.826022 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:19:21.826425 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:19:21.828381 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:19:21.843000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.846000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.846000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:21.843246 systemd[1]: Finished systemd-journal-catalog-update.service. Mar 17 18:19:21.845975 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 18:19:21.846420 systemd[1]: Finished modprobe@drm.service. Mar 17 18:19:21.950000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Mar 17 18:19:21.950000 audit[1875]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffeea5c960 a2=420 a3=0 items=0 ppid=1834 pid=1875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:21.950000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Mar 17 18:19:21.951874 augenrules[1875]: No rules Mar 17 18:19:21.953062 systemd[1]: Finished audit-rules.service. Mar 17 18:19:21.984649 systemd[1]: Started systemd-timesyncd.service. Mar 17 18:19:21.986499 systemd[1]: Reached target time-set.target. Mar 17 18:19:22.009047 systemd-resolved[1837]: Positive Trust Anchors: Mar 17 18:19:22.009076 systemd-resolved[1837]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 18:19:22.009128 systemd-resolved[1837]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Mar 17 18:19:22.032494 systemd-networkd[1583]: eth0: Gained IPv6LL Mar 17 18:19:22.035457 systemd[1]: Finished systemd-networkd-wait-online.service. Mar 17 18:19:22.063325 systemd-resolved[1837]: Defaulting to hostname 'linux'. Mar 17 18:19:22.063700 systemd-timesyncd[1838]: Contacted time server 206.210.192.32:123 (0.flatcar.pool.ntp.org). Mar 17 18:19:22.064410 systemd-timesyncd[1838]: Initial clock synchronization to Mon 2025-03-17 18:19:21.925617 UTC. Mar 17 18:19:22.066352 systemd[1]: Started systemd-resolved.service. Mar 17 18:19:22.068135 systemd[1]: Reached target network.target. Mar 17 18:19:22.069683 systemd[1]: Reached target network-online.target. Mar 17 18:19:22.071351 systemd[1]: Reached target nss-lookup.target. Mar 17 18:19:22.234419 ldconfig[1703]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 17 18:19:22.241184 systemd[1]: Finished ldconfig.service. Mar 17 18:19:22.245496 systemd[1]: Starting systemd-update-done.service... Mar 17 18:19:22.262209 systemd[1]: Finished systemd-update-done.service. Mar 17 18:19:22.264144 systemd[1]: Reached target sysinit.target. Mar 17 18:19:22.265844 systemd[1]: Started motdgen.path. Mar 17 18:19:22.267282 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path. Mar 17 18:19:22.269848 systemd[1]: Started logrotate.timer. Mar 17 18:19:22.271570 systemd[1]: Started mdadm.timer. Mar 17 18:19:22.273425 systemd[1]: Started systemd-tmpfiles-clean.timer. Mar 17 18:19:22.275161 systemd[1]: update-engine-stub.timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 17 18:19:22.275226 systemd[1]: Reached target paths.target. Mar 17 18:19:22.276742 systemd[1]: Reached target timers.target. Mar 17 18:19:22.278732 systemd[1]: Listening on dbus.socket. Mar 17 18:19:22.282559 systemd[1]: Starting docker.socket... Mar 17 18:19:22.287916 systemd[1]: Listening on sshd.socket. Mar 17 18:19:22.289952 systemd[1]: systemd-pcrphase-sysinit.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:19:22.291061 systemd[1]: Listening on docker.socket. Mar 17 18:19:22.292833 systemd[1]: Reached target sockets.target. Mar 17 18:19:22.294733 systemd[1]: Reached target basic.target. Mar 17 18:19:22.296795 systemd[1]: System is tainted: cgroupsv1 Mar 17 18:19:22.297067 systemd[1]: addon-config@usr-share-oem.service was skipped because no trigger condition checks were met. Mar 17 18:19:22.297257 systemd[1]: addon-run@usr-share-oem.service was skipped because no trigger condition checks were met. Mar 17 18:19:22.299905 systemd[1]: Started amazon-ssm-agent.service. Mar 17 18:19:22.304698 systemd[1]: Starting containerd.service... Mar 17 18:19:22.308654 systemd[1]: Starting coreos-metadata-sshkeys@core.service... Mar 17 18:19:22.313142 systemd[1]: Starting dbus.service... 
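With systemd-resolved and systemd-timesyncd now running (the DS record above is the DNSSEC trust anchor for the root zone, and the time server was picked from 0.flatcar.pool.ntp.org), their runtime state can be checked with the usual status commands; output naturally differs per host:

    resolvectl status            # per-link DNS servers and resolver settings
    timedatectl timesync-status  # NTP server in use and current sync state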
Mar 17 18:19:22.317379 systemd[1]: Starting enable-oem-cloudinit.service... Mar 17 18:19:22.322947 systemd[1]: Starting extend-filesystems.service... Mar 17 18:19:22.326199 systemd[1]: flatcar-setup-environment.service was skipped because of an unmet condition check (ConditionPathExists=/usr/share/oem/bin/flatcar-setup-environment). Mar 17 18:19:22.333166 systemd[1]: Starting kubelet.service... Mar 17 18:19:22.341207 jq[1893]: false Mar 17 18:19:22.341090 systemd[1]: Starting motdgen.service... Mar 17 18:19:22.345261 systemd[1]: Started nvidia.service. Mar 17 18:19:22.363132 systemd[1]: Starting prepare-helm.service... Mar 17 18:19:22.377674 systemd[1]: Starting ssh-key-proc-cmdline.service... Mar 17 18:19:22.382319 systemd[1]: Starting sshd-keygen.service... Mar 17 18:19:22.404499 systemd[1]: Starting systemd-logind.service... Mar 17 18:19:22.406061 systemd[1]: systemd-pcrphase.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:19:22.406219 systemd[1]: tcsd.service was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 17 18:19:22.423822 systemd[1]: Starting update-engine.service... Mar 17 18:19:22.428352 systemd[1]: Starting update-ssh-keys-after-ignition.service... Mar 17 18:19:22.474859 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 17 18:19:22.475463 systemd[1]: Condition check resulted in enable-oem-cloudinit.service being skipped. Mar 17 18:19:22.478215 jq[1911]: true Mar 17 18:19:22.547180 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 17 18:19:22.547816 systemd[1]: Finished ssh-key-proc-cmdline.service. Mar 17 18:19:22.572856 jq[1920]: true Mar 17 18:19:22.591309 tar[1915]: linux-arm64/helm Mar 17 18:19:22.617997 extend-filesystems[1894]: Found loop1 Mar 17 18:19:22.620643 extend-filesystems[1894]: Found nvme0n1 Mar 17 18:19:22.620643 extend-filesystems[1894]: Found nvme0n1p1 Mar 17 18:19:22.620643 extend-filesystems[1894]: Found nvme0n1p2 Mar 17 18:19:22.620643 extend-filesystems[1894]: Found nvme0n1p3 Mar 17 18:19:22.620643 extend-filesystems[1894]: Found usr Mar 17 18:19:22.620643 extend-filesystems[1894]: Found nvme0n1p4 Mar 17 18:19:22.620643 extend-filesystems[1894]: Found nvme0n1p6 Mar 17 18:19:22.620643 extend-filesystems[1894]: Found nvme0n1p7 Mar 17 18:19:22.620643 extend-filesystems[1894]: Found nvme0n1p9 Mar 17 18:19:22.620643 extend-filesystems[1894]: Checking size of /dev/nvme0n1p9 Mar 17 18:19:22.626503 dbus-daemon[1892]: [system] SELinux support is enabled Mar 17 18:19:22.628063 systemd[1]: Started dbus.service. Mar 17 18:19:22.648885 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 17 18:19:22.648941 systemd[1]: Reached target system-config.target. Mar 17 18:19:22.663477 systemd[1]: user-cloudinit-proc-cmdline.service was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 17 18:19:22.663518 systemd[1]: Reached target user-config.target. 
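extend-filesystems is enumerating the disk and checking the size of /dev/nvme0n1p9 here; a little later in the log the root filesystem is grown online from 553472 to 1489915 4 KiB blocks (roughly 2.1 GiB to 5.7 GiB). Done by hand, and assuming the partition itself has already been enlarged, the equivalent step is:

    # grow a mounted ext4 filesystem to fill its (already enlarged) partition
    resize2fs /dev/nvme0n1p9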
Mar 17 18:19:22.686601 dbus-daemon[1892]: [system] Activating systemd to hand-off: service name='org.freedesktop.hostname1' unit='dbus-org.freedesktop.hostname1.service' requested by ':1.1' (uid=244 pid=1583 comm="/usr/lib/systemd/systemd-networkd" label="system_u:system_r:kernel_t:s0") Mar 17 18:19:22.690280 dbus-daemon[1892]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 17 18:19:22.691978 systemd[1]: motdgen.service: Deactivated successfully. Mar 17 18:19:22.692546 systemd[1]: Finished motdgen.service. Mar 17 18:19:22.713838 systemd[1]: Starting systemd-hostnamed.service... Mar 17 18:19:22.780002 extend-filesystems[1894]: Resized partition /dev/nvme0n1p9 Mar 17 18:19:22.807270 extend-filesystems[1957]: resize2fs 1.46.5 (30-Dec-2021) Mar 17 18:19:22.810607 amazon-ssm-agent[1888]: 2025/03/17 18:19:22 Failed to load instance info from vault. RegistrationKey does not exist. Mar 17 18:19:22.824074 amazon-ssm-agent[1888]: Initializing new seelog logger Mar 17 18:19:22.824296 amazon-ssm-agent[1888]: New Seelog Logger Creation Complete Mar 17 18:19:22.824429 amazon-ssm-agent[1888]: 2025/03/17 18:19:22 Found config file at /etc/amazon/ssm/amazon-ssm-agent.json. Mar 17 18:19:22.824429 amazon-ssm-agent[1888]: Applying config override from /etc/amazon/ssm/amazon-ssm-agent.json. Mar 17 18:19:22.826371 kernel: EXT4-fs (nvme0n1p9): resizing filesystem from 553472 to 1489915 blocks Mar 17 18:19:22.830486 amazon-ssm-agent[1888]: 2025/03/17 18:19:22 processing appconfig overrides Mar 17 18:19:22.866727 update_engine[1909]: I0317 18:19:22.866321 1909 main.cc:92] Flatcar Update Engine starting Mar 17 18:19:22.880852 systemd[1]: Started update-engine.service. Mar 17 18:19:22.882466 update_engine[1909]: I0317 18:19:22.881906 1909 update_check_scheduler.cc:74] Next update check in 8m32s Mar 17 18:19:22.885787 systemd[1]: Started locksmithd.service. Mar 17 18:19:22.915387 kernel: EXT4-fs (nvme0n1p9): resized filesystem to 1489915 Mar 17 18:19:22.929422 extend-filesystems[1957]: Filesystem at /dev/nvme0n1p9 is mounted on /; on-line resizing required Mar 17 18:19:22.929422 extend-filesystems[1957]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 17 18:19:22.929422 extend-filesystems[1957]: The filesystem on /dev/nvme0n1p9 is now 1489915 (4k) blocks long. Mar 17 18:19:22.954281 extend-filesystems[1894]: Resized filesystem in /dev/nvme0n1p9 Mar 17 18:19:22.956094 bash[1970]: Updated "/home/core/.ssh/authorized_keys" Mar 17 18:19:22.932909 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 17 18:19:22.933940 systemd[1]: Finished extend-filesystems.service. Mar 17 18:19:22.951289 systemd[1]: Finished update-ssh-keys-after-ignition.service. Mar 17 18:19:22.969430 env[1923]: time="2025-03-17T18:19:22.969352094Z" level=info msg="starting containerd" revision=92b3a9d6f1b3bcc6dc74875cfdea653fe39f09c2 version=1.6.16 Mar 17 18:19:22.984870 systemd[1]: nvidia.service: Deactivated successfully. Mar 17 18:19:23.105243 systemd-logind[1906]: Watching system buttons on /dev/input/event0 (Power Button) Mar 17 18:19:23.105302 systemd-logind[1906]: Watching system buttons on /dev/input/event1 (Sleep Button) Mar 17 18:19:23.110420 systemd-logind[1906]: New seat seat0. Mar 17 18:19:23.117814 systemd[1]: Started systemd-logind.service. Mar 17 18:19:23.192575 env[1923]: time="2025-03-17T18:19:23.192443030Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." 
type=io.containerd.content.v1 Mar 17 18:19:23.197670 env[1923]: time="2025-03-17T18:19:23.197617720Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:19:23.214516 env[1923]: time="2025-03-17T18:19:23.214449210Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.15.179-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:19:23.217759 env[1923]: time="2025-03-17T18:19:23.217703427Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:19:23.219344 env[1923]: time="2025-03-17T18:19:23.219260283Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:19:23.221991 env[1923]: time="2025-03-17T18:19:23.221933948Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 17 18:19:23.222255 env[1923]: time="2025-03-17T18:19:23.222192488Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Mar 17 18:19:23.222420 env[1923]: time="2025-03-17T18:19:23.222386249Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 17 18:19:23.223990 env[1923]: time="2025-03-17T18:19:23.223941183Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:19:23.225256 env[1923]: time="2025-03-17T18:19:23.225201018Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:19:23.229718 env[1923]: time="2025-03-17T18:19:23.229654713Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:19:23.233961 env[1923]: time="2025-03-17T18:19:23.233904056Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 17 18:19:23.234305 env[1923]: time="2025-03-17T18:19:23.234267956Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Mar 17 18:19:23.234870 env[1923]: time="2025-03-17T18:19:23.234828035Z" level=info msg="metadata content store policy set" policy=shared Mar 17 18:19:23.246002 env[1923]: time="2025-03-17T18:19:23.245790503Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 17 18:19:23.246002 env[1923]: time="2025-03-17T18:19:23.245871133Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 17 18:19:23.246002 env[1923]: time="2025-03-17T18:19:23.245925900Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 17 18:19:23.246452 env[1923]: time="2025-03-17T18:19:23.246302325Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." 
type=io.containerd.service.v1 Mar 17 18:19:23.246592 env[1923]: time="2025-03-17T18:19:23.246552185Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 17 18:19:23.246758 env[1923]: time="2025-03-17T18:19:23.246727231Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 17 18:19:23.246884 env[1923]: time="2025-03-17T18:19:23.246850009Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 17 18:19:23.247516 env[1923]: time="2025-03-17T18:19:23.247471235Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 17 18:19:23.247790 env[1923]: time="2025-03-17T18:19:23.247758444Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1 Mar 17 18:19:23.248057 env[1923]: time="2025-03-17T18:19:23.248026702Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 17 18:19:23.248191 env[1923]: time="2025-03-17T18:19:23.248161722Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 17 18:19:23.248482 env[1923]: time="2025-03-17T18:19:23.248450641Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 17 18:19:23.248960 env[1923]: time="2025-03-17T18:19:23.248927260Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 17 18:19:23.250063 env[1923]: time="2025-03-17T18:19:23.250015470Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 17 18:19:23.251268 env[1923]: time="2025-03-17T18:19:23.251217649Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 17 18:19:23.252874 env[1923]: time="2025-03-17T18:19:23.252801205Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.253123 env[1923]: time="2025-03-17T18:19:23.253075678Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 17 18:19:23.255285 env[1923]: time="2025-03-17T18:19:23.255233629Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.255855 env[1923]: time="2025-03-17T18:19:23.255807966Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.256081 env[1923]: time="2025-03-17T18:19:23.256049737Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.256244 env[1923]: time="2025-03-17T18:19:23.256187185Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.256842 env[1923]: time="2025-03-17T18:19:23.256795934Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.259680 env[1923]: time="2025-03-17T18:19:23.259626848Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.259878 env[1923]: time="2025-03-17T18:19:23.259847462Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." 
type=io.containerd.grpc.v1 Mar 17 18:19:23.260325 env[1923]: time="2025-03-17T18:19:23.260266849Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.260691 env[1923]: time="2025-03-17T18:19:23.260631940Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 17 18:19:23.261100 env[1923]: time="2025-03-17T18:19:23.261063769Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.261638 env[1923]: time="2025-03-17T18:19:23.261596264Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.261943 env[1923]: time="2025-03-17T18:19:23.261910196Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 17 18:19:23.262148 env[1923]: time="2025-03-17T18:19:23.262116918Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 17 18:19:23.262632 env[1923]: time="2025-03-17T18:19:23.262589893Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1 Mar 17 18:19:23.263913 env[1923]: time="2025-03-17T18:19:23.263870307Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 17 18:19:23.264204 env[1923]: time="2025-03-17T18:19:23.264168118Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin" Mar 17 18:19:23.264409 env[1923]: time="2025-03-17T18:19:23.264378555Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Mar 17 18:19:23.264973 env[1923]: time="2025-03-17T18:19:23.264860657Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.6 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 18:19:23.270763 env[1923]: time="2025-03-17T18:19:23.270709536Z" level=info msg="Connect containerd service" Mar 17 18:19:23.271036 env[1923]: time="2025-03-17T18:19:23.270999883Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 18:19:23.276732 env[1923]: time="2025-03-17T18:19:23.276606472Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 18:19:23.277988 env[1923]: time="2025-03-17T18:19:23.277910225Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 18:19:23.278110 env[1923]: time="2025-03-17T18:19:23.278078607Z" level=info msg=serving... address=/run/containerd/containerd.sock Mar 17 18:19:23.278392 systemd[1]: Started containerd.service. 
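The CRI plugin configuration dumped above (overlayfs snapshotter, runc via io.containerd.runc.v2 with SystemdCgroup:false, sandbox image registry.k8s.io/pause:3.6) corresponds to a containerd 1.6 config.toml along these lines; this is a hedged reconstruction from the logged values, not the file actually on disk:

    version = 2

    [plugins."io.containerd.grpc.v1.cri"]
      sandbox_image = "registry.k8s.io/pause:3.6"

    [plugins."io.containerd.grpc.v1.cri".containerd]
      snapshotter = "overlayfs"
      default_runtime_name = "runc"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc]
      runtime_type = "io.containerd.runc.v2"

    [plugins."io.containerd.grpc.v1.cri".containerd.runtimes.runc.options]
      SystemdCgroup = false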
Mar 17 18:19:23.282309 env[1923]: time="2025-03-17T18:19:23.282241565Z" level=info msg="Start subscribing containerd event" Mar 17 18:19:23.307681 env[1923]: time="2025-03-17T18:19:23.307615989Z" level=info msg="Start recovering state" Mar 17 18:19:23.308122 env[1923]: time="2025-03-17T18:19:23.308075921Z" level=info msg="Start event monitor" Mar 17 18:19:23.308345 env[1923]: time="2025-03-17T18:19:23.308269481Z" level=info msg="Start snapshots syncer" Mar 17 18:19:23.308477 env[1923]: time="2025-03-17T18:19:23.308438689Z" level=info msg="Start cni network conf syncer for default" Mar 17 18:19:23.308477 env[1923]: time="2025-03-17T18:19:23.308474917Z" level=info msg="Start streaming server" Mar 17 18:19:23.309490 env[1923]: time="2025-03-17T18:19:23.285072597Z" level=info msg="containerd successfully booted in 0.324619s" Mar 17 18:19:23.357241 dbus-daemon[1892]: [system] Successfully activated service 'org.freedesktop.hostname1' Mar 17 18:19:23.357527 systemd[1]: Started systemd-hostnamed.service. Mar 17 18:19:23.360570 dbus-daemon[1892]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' requested by ':1.6' (uid=0 pid=1948 comm="/usr/lib/systemd/systemd-hostnamed" label="system_u:system_r:kernel_t:s0") Mar 17 18:19:23.365777 systemd[1]: Starting polkit.service... Mar 17 18:19:23.407888 polkitd[2006]: Started polkitd version 121 Mar 17 18:19:23.435415 polkitd[2006]: Loading rules from directory /etc/polkit-1/rules.d Mar 17 18:19:23.435966 polkitd[2006]: Loading rules from directory /usr/share/polkit-1/rules.d Mar 17 18:19:23.444177 polkitd[2006]: Finished loading, compiling and executing 2 rules Mar 17 18:19:23.445164 dbus-daemon[1892]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' Mar 17 18:19:23.445472 systemd[1]: Started polkit.service. Mar 17 18:19:23.448410 polkitd[2006]: Acquired the name org.freedesktop.PolicyKit1 on the system bus Mar 17 18:19:23.484350 systemd-resolved[1837]: System hostname changed to 'ip-172-31-21-220'. Mar 17 18:19:23.484500 systemd-hostnamed[1948]: Hostname set to (transient) Mar 17 18:19:23.620631 coreos-metadata[1890]: Mar 17 18:19:23.620 INFO Putting http://169.254.169.254/latest/api/token: Attempt #1 Mar 17 18:19:23.624186 coreos-metadata[1890]: Mar 17 18:19:23.624 INFO Fetching http://169.254.169.254/2019-10-01/meta-data/public-keys: Attempt #1 Mar 17 18:19:23.626920 coreos-metadata[1890]: Mar 17 18:19:23.626 INFO Fetch successful Mar 17 18:19:23.626920 coreos-metadata[1890]: Mar 17 18:19:23.626 INFO Fetching http://169.254.169.254/2019-10-01/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 17 18:19:23.629309 coreos-metadata[1890]: Mar 17 18:19:23.629 INFO Fetch successful Mar 17 18:19:23.633387 unknown[1890]: wrote ssh authorized keys file for user: core Mar 17 18:19:23.661394 update-ssh-keys[2038]: Updated "/home/core/.ssh/authorized_keys" Mar 17 18:19:23.662738 systemd[1]: Finished coreos-metadata-sshkeys@core.service. 
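coreos-metadata fetches the instance's SSH public key from the EC2 instance metadata service using exactly the endpoints logged above. Reproduced by hand (hedged sketch; the token step mirrors the logged 'Putting .../latest/api/token' attempt):

    # IMDSv2: obtain a session token, then fetch the key written to authorized_keys
    TOKEN=$(curl -s -X PUT "http://169.254.169.254/latest/api/token" \
      -H "X-aws-ec2-metadata-token-ttl-seconds: 21600")
    curl -s -H "X-aws-ec2-metadata-token: $TOKEN" \
      "http://169.254.169.254/2019-10-01/meta-data/public-keys/0/openssh-key"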
Mar 17 18:19:23.856449 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Create new startup processor Mar 17 18:19:23.858742 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [LongRunningPluginsManager] registered plugins: {} Mar 17 18:19:23.858742 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Initializing bookkeeping folders Mar 17 18:19:23.858916 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO removing the completed state files Mar 17 18:19:23.858916 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Initializing bookkeeping folders for long running plugins Mar 17 18:19:23.858916 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Initializing replies folder for MDS reply requests that couldn't reach the service Mar 17 18:19:23.858916 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Initializing healthcheck folders for long running plugins Mar 17 18:19:23.858916 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Initializing locations for inventory plugin Mar 17 18:19:23.858916 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Initializing default location for custom inventory Mar 17 18:19:23.858916 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Initializing default location for file inventory Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Initializing default location for role inventory Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Init the cloudwatchlogs publisher Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform independent plugin aws:softwareInventory Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform independent plugin aws:runDockerAction Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform independent plugin aws:refreshAssociation Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform independent plugin aws:configurePackage Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform independent plugin aws:runPowerShellScript Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform independent plugin aws:updateSsmAgent Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform independent plugin aws:configureDocker Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform independent plugin aws:downloadContent Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform independent plugin aws:runDocument Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Successfully loaded platform dependent plugin aws:runShellScript Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO Starting Agent: amazon-ssm-agent - v2.3.1319.0 Mar 17 18:19:23.859256 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO OS: linux, Arch: arm64 Mar 17 18:19:23.869974 amazon-ssm-agent[1888]: datastore file /var/lib/amazon/ssm/i-04493a1077e76a73b/longrunningplugins/datastore/store doesn't exist - no long running 
plugins to execute Mar 17 18:19:23.959980 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] Starting document processing engine... Mar 17 18:19:24.054680 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] [EngineProcessor] Starting Mar 17 18:19:24.149948 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] [EngineProcessor] Initial processing Mar 17 18:19:24.244354 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] Starting message polling Mar 17 18:19:24.338805 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] Starting send replies to MDS Mar 17 18:19:24.353835 tar[1915]: linux-arm64/LICENSE Mar 17 18:19:24.354447 tar[1915]: linux-arm64/README.md Mar 17 18:19:24.369378 systemd[1]: Finished prepare-helm.service. Mar 17 18:19:24.433954 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [instanceID=i-04493a1077e76a73b] Starting association polling Mar 17 18:19:24.529548 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] [Association] [EngineProcessor] Starting Mar 17 18:19:24.570471 locksmithd[1975]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 17 18:19:24.624607 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] [Association] Launching response handler Mar 17 18:19:24.719902 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] [Association] [EngineProcessor] Initial processing Mar 17 18:19:24.815467 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] [Association] Initializing association scheduling service Mar 17 18:19:24.911103 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessagingDeliveryService] [Association] Association scheduling service initialized Mar 17 18:19:25.006984 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [HealthCheck] HealthCheck reporting agent health. Mar 17 18:19:25.103139 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessageGatewayService] Starting session document processing engine... Mar 17 18:19:25.199368 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessageGatewayService] [EngineProcessor] Starting Mar 17 18:19:25.263654 sshd_keygen[1940]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 18:19:25.295836 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessageGatewayService] SSM Agent is trying to setup control channel for Session Manager module. Mar 17 18:19:25.304823 systemd[1]: Started kubelet.service. Mar 17 18:19:25.317605 systemd[1]: Finished sshd-keygen.service. Mar 17 18:19:25.322868 systemd[1]: Starting issuegen.service... Mar 17 18:19:25.338641 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 18:19:25.339163 systemd[1]: Finished issuegen.service. Mar 17 18:19:25.344454 systemd[1]: Starting systemd-user-sessions.service... Mar 17 18:19:25.359502 systemd[1]: Finished systemd-user-sessions.service. Mar 17 18:19:25.366113 systemd[1]: Started getty@tty1.service. Mar 17 18:19:25.372208 systemd[1]: Started serial-getty@ttyS0.service. Mar 17 18:19:25.374199 systemd[1]: Reached target getty.target. Mar 17 18:19:25.376637 systemd[1]: Reached target multi-user.target. Mar 17 18:19:25.382000 systemd[1]: Starting systemd-update-utmp-runlevel.service... 
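sshd_keygen above generates the host's RSA, ECDSA and ED25519 keys before sshd accepts connections. The equivalent manual step (a hedged example; the service may invoke it differently) is ssh-keygen's "create all missing host keys" mode:

    # create any missing host keys under /etc/ssh with default paths
    ssh-keygen -A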
Mar 17 18:19:25.392523 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessageGatewayService] Setting up websocket for controlchannel for instance: i-04493a1077e76a73b, requestId: e23a7fcb-b455-45e5-89df-a0d28bd6e17a Mar 17 18:19:25.399235 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Mar 17 18:19:25.399817 systemd[1]: Finished systemd-update-utmp-runlevel.service. Mar 17 18:19:25.406281 systemd[1]: Startup finished in 9.283s (kernel) + 11.657s (userspace) = 20.940s. Mar 17 18:19:25.489397 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [OfflineService] Starting document processing engine... Mar 17 18:19:25.586535 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [OfflineService] [EngineProcessor] Starting Mar 17 18:19:25.683915 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [OfflineService] [EngineProcessor] Initial processing Mar 17 18:19:25.781552 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [OfflineService] Starting message polling Mar 17 18:19:25.879453 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [OfflineService] Starting send replies to MDS Mar 17 18:19:25.977397 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [LongRunningPluginsManager] starting long running plugin manager Mar 17 18:19:26.075378 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [LongRunningPluginsManager] there aren't any long running plugin to execute Mar 17 18:19:26.173646 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessageGatewayService] listening reply. Mar 17 18:19:26.272180 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [HealthCheck] HealthCheck reporting agent health. Mar 17 18:19:26.370723 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [LongRunningPluginsManager] There are no long running plugins currently getting executed - skipping their healthcheck Mar 17 18:19:26.469582 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [StartupProcessor] Executing startup processor tasks Mar 17 18:19:26.568676 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [StartupProcessor] Write to serial port: Amazon SSM Agent v2.3.1319.0 is running Mar 17 18:19:26.667869 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [StartupProcessor] Write to serial port: OsProductName: Flatcar Container Linux by Kinvolk Mar 17 18:19:26.678603 kubelet[2131]: E0317 18:19:26.678545 2131 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:19:26.682645 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:19:26.683046 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
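The kubelet failure above is expected on a node that has not yet joined a cluster: /var/lib/kubelet/config.yaml is normally written by kubeadm during 'kubeadm init' or 'kubeadm join', after which the kubelet restarts cleanly. For illustration only, a minimal hand-written file of that kind (not what kubeadm generates verbatim) starts like this:

    # /var/lib/kubelet/config.yaml (minimal illustration)
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    cgroupDriver: cgroupfs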
Mar 17 18:19:26.767361 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [StartupProcessor] Write to serial port: OsVersion: 3510.3.7 Mar 17 18:19:26.866987 amazon-ssm-agent[1888]: 2025-03-17 18:19:23 INFO [MessageGatewayService] Opening websocket connection to: wss://ssmmessages.us-west-2.amazonaws.com/v1/control-channel/i-04493a1077e76a73b?role=subscribe&stream=input Mar 17 18:19:26.966747 amazon-ssm-agent[1888]: 2025-03-17 18:19:24 INFO [MessageGatewayService] Successfully opened websocket connection to: wss://ssmmessages.us-west-2.amazonaws.com/v1/control-channel/i-04493a1077e76a73b?role=subscribe&stream=input Mar 17 18:19:27.066747 amazon-ssm-agent[1888]: 2025-03-17 18:19:24 INFO [MessageGatewayService] Starting receiving message from control channel Mar 17 18:19:27.167019 amazon-ssm-agent[1888]: 2025-03-17 18:19:24 INFO [MessageGatewayService] [EngineProcessor] Initial processing Mar 17 18:19:30.414924 amazon-ssm-agent[1888]: 2025-03-17 18:19:30 INFO [MessagingDeliveryService] [Association] No associations on boot. Requerying for associations after 30 seconds. Mar 17 18:19:30.843621 systemd[1]: Created slice system-sshd.slice. Mar 17 18:19:30.846003 systemd[1]: Started sshd@0-172.31.21.220:22-139.178.89.65:44208.service. Mar 17 18:19:31.110198 sshd[2151]: Accepted publickey for core from 139.178.89.65 port 44208 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:19:31.115345 sshd[2151]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:19:31.134376 systemd[1]: Created slice user-500.slice. Mar 17 18:19:31.136801 systemd[1]: Starting user-runtime-dir@500.service... Mar 17 18:19:31.147469 systemd-logind[1906]: New session 1 of user core. Mar 17 18:19:31.163426 systemd[1]: Finished user-runtime-dir@500.service. Mar 17 18:19:31.167702 systemd[1]: Starting user@500.service... Mar 17 18:19:31.174878 (systemd)[2156]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:19:31.363040 systemd[2156]: Queued start job for default target default.target. Mar 17 18:19:31.363515 systemd[2156]: Reached target paths.target. Mar 17 18:19:31.363554 systemd[2156]: Reached target sockets.target. Mar 17 18:19:31.363588 systemd[2156]: Reached target timers.target. Mar 17 18:19:31.363617 systemd[2156]: Reached target basic.target. Mar 17 18:19:31.363804 systemd[1]: Started user@500.service. Mar 17 18:19:31.365582 systemd[1]: Started session-1.scope. Mar 17 18:19:31.366387 systemd[2156]: Reached target default.target. Mar 17 18:19:31.366776 systemd[2156]: Startup finished in 179ms. Mar 17 18:19:31.508040 systemd[1]: Started sshd@1-172.31.21.220:22-139.178.89.65:37406.service. Mar 17 18:19:31.679929 sshd[2165]: Accepted publickey for core from 139.178.89.65 port 37406 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:19:31.682437 sshd[2165]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:19:31.689712 systemd-logind[1906]: New session 2 of user core. Mar 17 18:19:31.691640 systemd[1]: Started session-2.scope. Mar 17 18:19:31.822074 sshd[2165]: pam_unix(sshd:session): session closed for user core Mar 17 18:19:31.827667 systemd-logind[1906]: Session 2 logged out. Waiting for processes to exit. Mar 17 18:19:31.829017 systemd[1]: sshd@1-172.31.21.220:22-139.178.89.65:37406.service: Deactivated successfully. Mar 17 18:19:31.830555 systemd[1]: session-2.scope: Deactivated successfully. Mar 17 18:19:31.832588 systemd-logind[1906]: Removed session 2. 
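Each accepted SSH login above produces a per-user slice (user-500.slice), a user manager (user@500.service) and a session scope (session-1.scope, session-2.scope, ...). The resulting hierarchy can be inspected with, for example:

    loginctl list-sessions           # one row per session-N.scope in the log
    systemctl status user-500.slice  # the slice holding user core's sessions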
Mar 17 18:19:31.847428 systemd[1]: Started sshd@2-172.31.21.220:22-139.178.89.65:37420.service. Mar 17 18:19:32.014135 sshd[2172]: Accepted publickey for core from 139.178.89.65 port 37420 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:19:32.016889 sshd[2172]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:19:32.025744 systemd[1]: Started session-3.scope. Mar 17 18:19:32.028170 systemd-logind[1906]: New session 3 of user core. Mar 17 18:19:32.150425 sshd[2172]: pam_unix(sshd:session): session closed for user core Mar 17 18:19:32.155795 systemd-logind[1906]: Session 3 logged out. Waiting for processes to exit. Mar 17 18:19:32.156962 systemd[1]: sshd@2-172.31.21.220:22-139.178.89.65:37420.service: Deactivated successfully. Mar 17 18:19:32.158477 systemd[1]: session-3.scope: Deactivated successfully. Mar 17 18:19:32.159856 systemd-logind[1906]: Removed session 3. Mar 17 18:19:32.177127 systemd[1]: Started sshd@3-172.31.21.220:22-139.178.89.65:37434.service. Mar 17 18:19:32.350448 sshd[2179]: Accepted publickey for core from 139.178.89.65 port 37434 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:19:32.353569 sshd[2179]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:19:32.362308 systemd[1]: Started session-4.scope. Mar 17 18:19:32.364791 systemd-logind[1906]: New session 4 of user core. Mar 17 18:19:32.498474 sshd[2179]: pam_unix(sshd:session): session closed for user core Mar 17 18:19:32.503827 systemd-logind[1906]: Session 4 logged out. Waiting for processes to exit. Mar 17 18:19:32.504446 systemd[1]: sshd@3-172.31.21.220:22-139.178.89.65:37434.service: Deactivated successfully. Mar 17 18:19:32.505944 systemd[1]: session-4.scope: Deactivated successfully. Mar 17 18:19:32.506932 systemd-logind[1906]: Removed session 4. Mar 17 18:19:32.523173 systemd[1]: Started sshd@4-172.31.21.220:22-139.178.89.65:37440.service. Mar 17 18:19:32.689888 sshd[2186]: Accepted publickey for core from 139.178.89.65 port 37440 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:19:32.691703 sshd[2186]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:19:32.700506 systemd[1]: Started session-5.scope. Mar 17 18:19:32.701441 systemd-logind[1906]: New session 5 of user core. Mar 17 18:19:32.838202 sudo[2190]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 17 18:19:32.838786 sudo[2190]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:19:32.867646 dbus-daemon[1892]: avc: received setenforce notice (enforcing=1) Mar 17 18:19:32.871091 sudo[2190]: pam_unix(sudo:session): session closed for user root Mar 17 18:19:32.897306 sshd[2186]: pam_unix(sshd:session): session closed for user core Mar 17 18:19:32.902977 systemd[1]: sshd@4-172.31.21.220:22-139.178.89.65:37440.service: Deactivated successfully. Mar 17 18:19:32.905414 systemd-logind[1906]: Session 5 logged out. Waiting for processes to exit. Mar 17 18:19:32.906684 systemd[1]: session-5.scope: Deactivated successfully. Mar 17 18:19:32.908301 systemd-logind[1906]: Removed session 5. Mar 17 18:19:32.921471 systemd[1]: Started sshd@5-172.31.21.220:22-139.178.89.65:37446.service. 
Mar 17 18:19:33.088861 sshd[2194]: Accepted publickey for core from 139.178.89.65 port 37446 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:19:33.091974 sshd[2194]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:19:33.100512 systemd[1]: Started session-6.scope. Mar 17 18:19:33.101112 systemd-logind[1906]: New session 6 of user core. Mar 17 18:19:33.208600 sudo[2199]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 17 18:19:33.209142 sudo[2199]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:19:33.214757 sudo[2199]: pam_unix(sudo:session): session closed for user root Mar 17 18:19:33.224194 sudo[2198]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Mar 17 18:19:33.225268 sudo[2198]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:19:33.243231 systemd[1]: Stopping audit-rules.service... Mar 17 18:19:33.244000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Mar 17 18:19:33.246211 auditctl[2202]: No rules Mar 17 18:19:33.247524 kernel: kauditd_printk_skb: 60 callbacks suppressed Mar 17 18:19:33.247586 kernel: audit: type=1305 audit(1742235573.244:152): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Mar 17 18:19:33.248172 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 18:19:33.248742 systemd[1]: Stopped audit-rules.service. Mar 17 18:19:33.244000 audit[2202]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffcd716eb0 a2=420 a3=0 items=0 ppid=1 pid=2202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:33.253411 systemd[1]: Starting audit-rules.service... Mar 17 18:19:33.263100 kernel: audit: type=1300 audit(1742235573.244:152): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffcd716eb0 a2=420 a3=0 items=0 ppid=1 pid=2202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:33.244000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D44 Mar 17 18:19:33.270378 kernel: audit: type=1327 audit(1742235573.244:152): proctitle=2F7362696E2F617564697463746C002D44 Mar 17 18:19:33.247000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.279387 kernel: audit: type=1131 audit(1742235573.247:153): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.297035 augenrules[2220]: No rules Mar 17 18:19:33.297000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.298545 systemd[1]: Finished audit-rules.service. 
Mar 17 18:19:33.300001 sudo[2198]: pam_unix(sudo:session): session closed for user root Mar 17 18:19:33.298000 audit[2198]: USER_END pid=2198 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.317288 kernel: audit: type=1130 audit(1742235573.297:154): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.317400 kernel: audit: type=1106 audit(1742235573.298:155): pid=2198 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.317446 kernel: audit: type=1104 audit(1742235573.298:156): pid=2198 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.298000 audit[2198]: CRED_DISP pid=2198 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.324843 sshd[2194]: pam_unix(sshd:session): session closed for user core Mar 17 18:19:33.329850 systemd[1]: sshd@5-172.31.21.220:22-139.178.89.65:37446.service: Deactivated successfully. Mar 17 18:19:33.331237 systemd[1]: session-6.scope: Deactivated successfully. Mar 17 18:19:33.331662 systemd-logind[1906]: Session 6 logged out. Waiting for processes to exit. Mar 17 18:19:33.325000 audit[2194]: USER_END pid=2194 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:19:33.334206 systemd-logind[1906]: Removed session 6. Mar 17 18:19:33.325000 audit[2194]: CRED_DISP pid=2194 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:19:33.351166 systemd[1]: Started sshd@6-172.31.21.220:22-139.178.89.65:37448.service. Mar 17 18:19:33.358446 kernel: audit: type=1106 audit(1742235573.325:157): pid=2194 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:19:33.358591 kernel: audit: type=1104 audit(1742235573.325:158): pid=2194 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:19:33.358635 kernel: audit: type=1131 audit(1742235573.328:159): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-172.31.21.220:22-139.178.89.65:37446 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:19:33.328000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-172.31.21.220:22-139.178.89.65:37446 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.348000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-172.31.21.220:22-139.178.89.65:37448 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.524000 audit[2227]: USER_ACCT pid=2227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:19:33.525991 sshd[2227]: Accepted publickey for core from 139.178.89.65 port 37448 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:19:33.526000 audit[2227]: CRED_ACQ pid=2227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:19:33.526000 audit[2227]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffedff5960 a2=3 a3=1 items=0 ppid=1 pid=2227 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=7 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:33.526000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:19:33.528977 sshd[2227]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:19:33.537183 systemd-logind[1906]: New session 7 of user core. Mar 17 18:19:33.538222 systemd[1]: Started session-7.scope. Mar 17 18:19:33.546000 audit[2227]: USER_START pid=2227 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:19:33.550000 audit[2230]: CRED_ACQ pid=2230 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:19:33.646000 audit[2231]: USER_ACCT pid=2231 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.647093 sudo[2231]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 17 18:19:33.647000 audit[2231]: CRED_REFR pid=2231 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:19:33.648494 sudo[2231]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:19:33.651000 audit[2231]: USER_START pid=2231 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Mar 17 18:19:33.695693 systemd[1]: Starting docker.service... Mar 17 18:19:33.769261 env[2241]: time="2025-03-17T18:19:33.769170025Z" level=info msg="Starting up" Mar 17 18:19:33.771633 env[2241]: time="2025-03-17T18:19:33.771586806Z" level=info msg="parsed scheme: \"unix\"" module=grpc Mar 17 18:19:33.771840 env[2241]: time="2025-03-17T18:19:33.771812327Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Mar 17 18:19:33.771969 env[2241]: time="2025-03-17T18:19:33.771936381Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Mar 17 18:19:33.772076 env[2241]: time="2025-03-17T18:19:33.772050103Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Mar 17 18:19:33.776010 env[2241]: time="2025-03-17T18:19:33.775964083Z" level=info msg="parsed scheme: \"unix\"" module=grpc Mar 17 18:19:33.776211 env[2241]: time="2025-03-17T18:19:33.776184753Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Mar 17 18:19:33.776376 env[2241]: time="2025-03-17T18:19:33.776345063Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Mar 17 18:19:33.776483 env[2241]: time="2025-03-17T18:19:33.776457076Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Mar 17 18:19:33.795238 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport716042335-merged.mount: Deactivated successfully. Mar 17 18:19:34.115629 env[2241]: time="2025-03-17T18:19:34.115484623Z" level=warning msg="Your kernel does not support cgroup blkio weight" Mar 17 18:19:34.115629 env[2241]: time="2025-03-17T18:19:34.115531930Z" level=warning msg="Your kernel does not support cgroup blkio weight_device" Mar 17 18:19:34.115934 env[2241]: time="2025-03-17T18:19:34.115840708Z" level=info msg="Loading containers: start." 
Mar 17 18:19:34.285000 audit[2271]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2271 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.285000 audit[2271]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffff4c0e640 a2=0 a3=1 items=0 ppid=2241 pid=2271 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.285000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Mar 17 18:19:34.289000 audit[2273]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2273 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.289000 audit[2273]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=fffff4aeed60 a2=0 a3=1 items=0 ppid=2241 pid=2273 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.289000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Mar 17 18:19:34.293000 audit[2275]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2275 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.293000 audit[2275]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc315f090 a2=0 a3=1 items=0 ppid=2241 pid=2275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.293000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Mar 17 18:19:34.297000 audit[2277]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2277 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.297000 audit[2277]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffff0eb2d50 a2=0 a3=1 items=0 ppid=2241 pid=2277 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.297000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Mar 17 18:19:34.310000 audit[2279]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_rule pid=2279 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.310000 audit[2279]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffca9b5250 a2=0 a3=1 items=0 ppid=2241 pid=2279 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.310000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6A0052455455524E Mar 17 18:19:34.340000 audit[2284]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_rule pid=2284 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.340000 audit[2284]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=fffff2e30210 a2=0 a3=1 items=0 ppid=2241 pid=2284 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.340000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D32002D6A0052455455524E Mar 17 18:19:34.352000 audit[2286]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2286 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.352000 audit[2286]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffffce69390 a2=0 a3=1 items=0 ppid=2241 pid=2286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.352000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Mar 17 18:19:34.356000 audit[2288]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_rule pid=2288 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.356000 audit[2288]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffde7cbd30 a2=0 a3=1 items=0 ppid=2241 pid=2288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.356000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Mar 17 18:19:34.359000 audit[2290]: NETFILTER_CFG table=filter:10 family=2 entries=2 op=nft_register_chain pid=2290 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.359000 audit[2290]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=308 a0=3 a1=fffff6b19a20 a2=0 a3=1 items=0 ppid=2241 pid=2290 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.359000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:19:34.371000 audit[2294]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_unregister_rule pid=2294 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.371000 audit[2294]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=216 a0=3 a1=ffffd1fe8570 a2=0 a3=1 items=0 ppid=2241 pid=2294 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.371000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:19:34.377000 audit[2295]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2295 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.377000 audit[2295]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffce685690 a2=0 a3=1 items=0 ppid=2241 pid=2295 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.377000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:19:34.405377 kernel: Initializing XFRM netlink socket Mar 17 18:19:34.484157 env[2241]: time="2025-03-17T18:19:34.484088615Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address" Mar 17 18:19:34.486629 (udev-worker)[2251]: Network interface NamePolicy= disabled on kernel command line. Mar 17 18:19:34.521000 audit[2303]: NETFILTER_CFG table=nat:13 family=2 entries=2 op=nft_register_chain pid=2303 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.521000 audit[2303]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=492 a0=3 a1=ffffe4d36e80 a2=0 a3=1 items=0 ppid=2241 pid=2303 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.521000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Mar 17 18:19:34.544000 audit[2306]: NETFILTER_CFG table=nat:14 family=2 entries=1 op=nft_register_rule pid=2306 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.544000 audit[2306]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffd8134160 a2=0 a3=1 items=0 ppid=2241 pid=2306 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.544000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Mar 17 18:19:34.550000 audit[2309]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=2309 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.550000 audit[2309]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffeebe0940 a2=0 a3=1 items=0 ppid=2241 pid=2309 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.550000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B657230002D6F00646F636B657230002D6A00414343455054 Mar 17 18:19:34.554000 audit[2311]: NETFILTER_CFG table=filter:16 family=2 entries=1 op=nft_register_rule pid=2311 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.554000 audit[2311]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffce723520 a2=0 a3=1 items=0 ppid=2241 pid=2311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.554000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B6572300000002D6F00646F636B657230002D6A00414343455054 Mar 17 18:19:34.558000 audit[2313]: NETFILTER_CFG table=nat:17 family=2 entries=2 
op=nft_register_chain pid=2313 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.558000 audit[2313]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=356 a0=3 a1=ffffd3e93140 a2=0 a3=1 items=0 ppid=2241 pid=2313 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.558000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Mar 17 18:19:34.562000 audit[2315]: NETFILTER_CFG table=nat:18 family=2 entries=2 op=nft_register_chain pid=2315 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.562000 audit[2315]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=444 a0=3 a1=ffffd6428d90 a2=0 a3=1 items=0 ppid=2241 pid=2315 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.562000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Mar 17 18:19:34.566000 audit[2317]: NETFILTER_CFG table=filter:19 family=2 entries=1 op=nft_register_rule pid=2317 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.566000 audit[2317]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=304 a0=3 a1=ffffc46655b0 a2=0 a3=1 items=0 ppid=2241 pid=2317 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.566000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6A00444F434B4552 Mar 17 18:19:34.586000 audit[2320]: NETFILTER_CFG table=filter:20 family=2 entries=1 op=nft_register_rule pid=2320 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.586000 audit[2320]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=508 a0=3 a1=ffffffdcac20 a2=0 a3=1 items=0 ppid=2241 pid=2320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.586000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Mar 17 18:19:34.591000 audit[2322]: NETFILTER_CFG table=filter:21 family=2 entries=1 op=nft_register_rule pid=2322 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.591000 audit[2322]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=240 a0=3 a1=ffffdee77550 a2=0 a3=1 items=0 ppid=2241 pid=2322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.591000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Mar 17 18:19:34.595000 
audit[2324]: NETFILTER_CFG table=filter:22 family=2 entries=1 op=nft_register_rule pid=2324 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.595000 audit[2324]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffd73c4700 a2=0 a3=1 items=0 ppid=2241 pid=2324 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.595000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Mar 17 18:19:34.599000 audit[2326]: NETFILTER_CFG table=filter:23 family=2 entries=1 op=nft_register_rule pid=2326 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.599000 audit[2326]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe688de70 a2=0 a3=1 items=0 ppid=2241 pid=2326 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.599000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Mar 17 18:19:34.600700 systemd-networkd[1583]: docker0: Link UP Mar 17 18:19:34.614000 audit[2330]: NETFILTER_CFG table=filter:24 family=2 entries=1 op=nft_unregister_rule pid=2330 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.614000 audit[2330]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=fffffda585b0 a2=0 a3=1 items=0 ppid=2241 pid=2330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.614000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:19:34.622000 audit[2331]: NETFILTER_CFG table=filter:25 family=2 entries=1 op=nft_register_rule pid=2331 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:19:34.622000 audit[2331]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffd1c72630 a2=0 a3=1 items=0 ppid=2241 pid=2331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:19:34.622000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:19:34.626811 env[2241]: time="2025-03-17T18:19:34.626748426Z" level=info msg="Loading containers: done." 
Mar 17 18:19:34.657559 env[2241]: time="2025-03-17T18:19:34.657482949Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 17 18:19:34.658298 env[2241]: time="2025-03-17T18:19:34.658266630Z" level=info msg="Docker daemon" commit=112bdf3343 graphdriver(s)=overlay2 version=20.10.23 Mar 17 18:19:34.658687 env[2241]: time="2025-03-17T18:19:34.658647230Z" level=info msg="Daemon has completed initialization" Mar 17 18:19:34.682000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:34.683527 systemd[1]: Started docker.service. Mar 17 18:19:34.698046 env[2241]: time="2025-03-17T18:19:34.697948619Z" level=info msg="API listen on /run/docker.sock" Mar 17 18:19:36.934000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:36.934560 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 17 18:19:36.934866 systemd[1]: Stopped kubelet.service. Mar 17 18:19:36.934000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:36.937925 systemd[1]: Starting kubelet.service... Mar 17 18:19:37.234050 env[1923]: time="2025-03-17T18:19:37.232439069Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\"" Mar 17 18:19:37.298000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:37.298200 systemd[1]: Started kubelet.service. Mar 17 18:19:37.411925 kubelet[2376]: E0317 18:19:37.411871 2376 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:19:37.419000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 18:19:37.420035 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:19:37.420448 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:19:37.854159 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1122367455.mount: Deactivated successfully. 
Mar 17 18:19:39.765466 env[1923]: time="2025-03-17T18:19:39.765396570Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:39.768794 env[1923]: time="2025-03-17T18:19:39.768744174Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:39.771980 env[1923]: time="2025-03-17T18:19:39.771915653Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-apiserver:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:39.775444 env[1923]: time="2025-03-17T18:19:39.775380558Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:39.777257 env[1923]: time="2025-03-17T18:19:39.777210953Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:fcbef283ab16167d1ca4acb66836af518e9fe445111fbc618fdbe196858f9530\"" Mar 17 18:19:39.794704 env[1923]: time="2025-03-17T18:19:39.794649736Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\"" Mar 17 18:19:42.195041 env[1923]: time="2025-03-17T18:19:42.194982299Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:42.198181 env[1923]: time="2025-03-17T18:19:42.198131851Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:42.202570 env[1923]: time="2025-03-17T18:19:42.202518880Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-controller-manager:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:42.205890 env[1923]: time="2025-03-17T18:19:42.205838899Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:42.207589 env[1923]: time="2025-03-17T18:19:42.207525644Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:9469d949b9e8c03b6cb06af513f683dd2975b57092f3deb2a9e125e0d05188d3\"" Mar 17 18:19:42.226446 env[1923]: time="2025-03-17T18:19:42.226398613Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\"" Mar 17 18:19:43.626251 env[1923]: time="2025-03-17T18:19:43.626185340Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:43.630588 env[1923]: time="2025-03-17T18:19:43.630530054Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:43.639627 env[1923]: 
time="2025-03-17T18:19:43.639552228Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-scheduler:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:43.643105 env[1923]: time="2025-03-17T18:19:43.643054373Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:43.644804 env[1923]: time="2025-03-17T18:19:43.644759480Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:3540cd10f52fac0a58ba43c004c6d3941e2a9f53e06440b982b9c130a72c0213\"" Mar 17 18:19:43.663169 env[1923]: time="2025-03-17T18:19:43.663096698Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\"" Mar 17 18:19:44.991548 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1657616483.mount: Deactivated successfully. Mar 17 18:19:45.795504 env[1923]: time="2025-03-17T18:19:45.795438391Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:45.805765 env[1923]: time="2025-03-17T18:19:45.805689390Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:45.808181 env[1923]: time="2025-03-17T18:19:45.808116199Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-proxy:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:45.811769 env[1923]: time="2025-03-17T18:19:45.811696425Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:45.812219 env[1923]: time="2025-03-17T18:19:45.812167211Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:fe83790bf8a35411788b67fe5f0ce35309056c40530484d516af2ca01375220c\"" Mar 17 18:19:45.828902 env[1923]: time="2025-03-17T18:19:45.828854009Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 17 18:19:46.355062 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3242666979.mount: Deactivated successfully. Mar 17 18:19:47.662387 env[1923]: time="2025-03-17T18:19:47.662308504Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns:v1.11.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:47.666151 env[1923]: time="2025-03-17T18:19:47.666078845Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:47.670294 env[1923]: time="2025-03-17T18:19:47.670229476Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/coredns/coredns:v1.11.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:47.671937 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Mar 17 18:19:47.682427 kernel: kauditd_printk_skb: 88 callbacks suppressed Mar 17 18:19:47.682517 kernel: audit: type=1130 audit(1742235587.670:198): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:47.670000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:47.672265 systemd[1]: Stopped kubelet.service. Mar 17 18:19:47.675067 systemd[1]: Starting kubelet.service... Mar 17 18:19:47.690801 kernel: audit: type=1131 audit(1742235587.671:199): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:47.671000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:47.695627 env[1923]: time="2025-03-17T18:19:47.695573078Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:47.700149 env[1923]: time="2025-03-17T18:19:47.697917102Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:2437cf762177702dec2dfe99a09c37427a15af6d9a57c456b65352667c223d93\"" Mar 17 18:19:47.718383 env[1923]: time="2025-03-17T18:19:47.718298828Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 17 18:19:47.986143 systemd[1]: Started kubelet.service. Mar 17 18:19:47.985000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:48.001381 kernel: audit: type=1130 audit(1742235587.985:200): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:48.081447 kubelet[2421]: E0317 18:19:48.081390 2421 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:19:48.085000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 18:19:48.085356 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:19:48.085745 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:19:48.095378 kernel: audit: type=1131 audit(1742235588.085:201): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Mar 17 18:19:48.219020 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1353291505.mount: Deactivated successfully. Mar 17 18:19:48.227744 env[1923]: time="2025-03-17T18:19:48.227692862Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:48.231711 env[1923]: time="2025-03-17T18:19:48.231663654Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:48.236095 env[1923]: time="2025-03-17T18:19:48.236018127Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:48.239320 env[1923]: time="2025-03-17T18:19:48.237982764Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:48.240004 env[1923]: time="2025-03-17T18:19:48.239158237Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:829e9de338bd5fdd3f16f68f83a9fb288fbc8453e881e5d5cfd0f6f2ff72b43e\"" Mar 17 18:19:48.255691 env[1923]: time="2025-03-17T18:19:48.255622391Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 17 18:19:48.831003 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2688266311.mount: Deactivated successfully. Mar 17 18:19:51.632296 env[1923]: time="2025-03-17T18:19:51.632188398Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd:3.5.12-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:51.653409 env[1923]: time="2025-03-17T18:19:51.653323981Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:51.660696 env[1923]: time="2025-03-17T18:19:51.660642132Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/etcd:3.5.12-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:51.670920 env[1923]: time="2025-03-17T18:19:51.670849321Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:19:51.673667 env[1923]: time="2025-03-17T18:19:51.673595595Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:014faa467e29798aeef733fe6d1a3b5e382688217b053ad23410e6cccd5d22fd\"" Mar 17 18:19:53.517861 systemd[1]: systemd-hostnamed.service: Deactivated successfully. Mar 17 18:19:53.517000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hostnamed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:53.527366 kernel: audit: type=1131 audit(1742235593.517:202): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hostnamed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:19:58.213780 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 17 18:19:58.214125 systemd[1]: Stopped kubelet.service. Mar 17 18:19:58.212000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:58.216920 systemd[1]: Starting kubelet.service... Mar 17 18:19:58.230568 kernel: audit: type=1130 audit(1742235598.212:203): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:58.230701 kernel: audit: type=1131 audit(1742235598.212:204): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:58.212000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:58.501143 systemd[1]: Started kubelet.service. Mar 17 18:19:58.501000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:58.512383 kernel: audit: type=1130 audit(1742235598.501:205): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:19:58.608036 kubelet[2509]: E0317 18:19:58.607979 2509 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:19:58.611644 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:19:58.612057 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:19:58.612000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 18:19:58.627362 kernel: audit: type=1131 audit(1742235598.612:206): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 18:20:00.444887 amazon-ssm-agent[1888]: 2025-03-17 18:20:00 INFO [MessagingDeliveryService] [Association] Schedule manager refreshed with 0 associations, 0 new associations associated Mar 17 18:20:01.020577 systemd[1]: Stopped kubelet.service. Mar 17 18:20:01.020000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:01.024779 systemd[1]: Starting kubelet.service... 
Mar 17 18:20:01.020000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:01.038188 kernel: audit: type=1130 audit(1742235601.020:207): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:01.038305 kernel: audit: type=1131 audit(1742235601.020:208): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:01.077236 systemd[1]: Reloading. Mar 17 18:20:01.288975 /usr/lib/systemd/system-generators/torcx-generator[2542]: time="2025-03-17T18:20:01Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 18:20:01.300495 /usr/lib/systemd/system-generators/torcx-generator[2542]: time="2025-03-17T18:20:01Z" level=info msg="torcx already run" Mar 17 18:20:01.492222 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:20:01.492483 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:20:01.535174 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:20:01.748599 systemd[1]: Started kubelet.service. Mar 17 18:20:01.749000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:01.754894 systemd[1]: Stopping kubelet.service... Mar 17 18:20:01.759000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:01.759646 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 18:20:01.760169 systemd[1]: Stopped kubelet.service. Mar 17 18:20:01.764598 systemd[1]: Starting kubelet.service... Mar 17 18:20:01.767482 kernel: audit: type=1130 audit(1742235601.749:209): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:01.767629 kernel: audit: type=1131 audit(1742235601.759:210): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:02.065000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:02.065030 systemd[1]: Started kubelet.service. 
Mar 17 18:20:02.081478 kernel: audit: type=1130 audit(1742235602.065:211): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:02.168017 kubelet[2619]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:20:02.168017 kubelet[2619]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 18:20:02.168668 kubelet[2619]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:20:02.168668 kubelet[2619]: I0317 18:20:02.168148 2619 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 18:20:02.645885 kubelet[2619]: I0317 18:20:02.645839 2619 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 18:20:02.652211 kubelet[2619]: I0317 18:20:02.652172 2619 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 18:20:02.653015 kubelet[2619]: I0317 18:20:02.652985 2619 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 18:20:02.690230 kubelet[2619]: E0317 18:20:02.690192 2619 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.31.21.220:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:02.690811 kubelet[2619]: I0317 18:20:02.690759 2619 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 18:20:02.705498 kubelet[2619]: I0317 18:20:02.705463 2619 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 17 18:20:02.706529 kubelet[2619]: I0317 18:20:02.706481 2619 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 18:20:02.706935 kubelet[2619]: I0317 18:20:02.706662 2619 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-21-220","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 18:20:02.707184 kubelet[2619]: I0317 18:20:02.707161 2619 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 18:20:02.707302 kubelet[2619]: I0317 18:20:02.707283 2619 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 18:20:02.707666 kubelet[2619]: I0317 18:20:02.707645 2619 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:20:02.709639 kubelet[2619]: I0317 18:20:02.709612 2619 kubelet.go:400] "Attempting to sync node with API server" Mar 17 18:20:02.709820 kubelet[2619]: I0317 18:20:02.709798 2619 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 18:20:02.710040 kubelet[2619]: I0317 18:20:02.710019 2619 kubelet.go:312] "Adding apiserver pod source" Mar 17 18:20:02.710212 kubelet[2619]: I0317 18:20:02.710191 2619 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 18:20:02.712454 kubelet[2619]: I0317 18:20:02.712403 2619 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Mar 17 18:20:02.712810 kubelet[2619]: I0317 18:20:02.712773 2619 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 18:20:02.712885 kubelet[2619]: W0317 18:20:02.712858 2619 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
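The nodeConfig dump above records the kubelet's effective container-manager settings, including the default hard-eviction thresholds (memory.available<100Mi, nodefs.available<10%, nodefs.inodesFree<5%, imagefs.available<15%, imagefs.inodesFree<5%). Once the API server this kubelet is trying to reach becomes available, the same effective configuration can be read back through the node's configz endpoint; a sketch, assuming kubectl access and jq on the workstation:

    NODE=ip-172-31-21-220    # node name taken from this log
    kubectl get --raw "/api/v1/nodes/${NODE}/proxy/configz" \
      | jq '.kubeletconfig.evictionHard'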
Mar 17 18:20:02.713938 kubelet[2619]: I0317 18:20:02.713893 2619 server.go:1264] "Started kubelet" Mar 17 18:20:02.714198 kubelet[2619]: W0317 18:20:02.714126 2619 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.21.220:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-21-220&limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:02.714297 kubelet[2619]: E0317 18:20:02.714223 2619 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.31.21.220:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-21-220&limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:02.739260 kubelet[2619]: E0317 18:20:02.739047 2619 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.31.21.220:6443/api/v1/namespaces/default/events\": dial tcp 172.31.21.220:6443: connect: connection refused" event="&Event{ObjectMeta:{ip-172-31-21-220.182daa0cc6cc4b60 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-21-220,UID:ip-172-31-21-220,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-21-220,},FirstTimestamp:2025-03-17 18:20:02.713856864 +0000 UTC m=+0.630429516,LastTimestamp:2025-03-17 18:20:02.713856864 +0000 UTC m=+0.630429516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-21-220,}" Mar 17 18:20:02.740370 kubelet[2619]: W0317 18:20:02.740284 2619 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.21.220:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:02.740565 kubelet[2619]: E0317 18:20:02.740541 2619 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.31.21.220:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:02.741000 audit[2619]: AVC avc: denied { mac_admin } for pid=2619 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:20:02.743466 kubelet[2619]: I0317 18:20:02.743424 2619 kubelet.go:1419] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Mar 17 18:20:02.743657 kubelet[2619]: I0317 18:20:02.743630 2619 kubelet.go:1423] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Mar 17 18:20:02.743915 kubelet[2619]: I0317 18:20:02.743895 2619 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 18:20:02.741000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:20:02.752894 kernel: audit: type=1400 audit(1742235602.741:212): avc: denied { mac_admin } for pid=2619 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:20:02.753037 kernel: audit: type=1401 audit(1742235602.741:212): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:20:02.753086 kernel: audit: type=1300 audit(1742235602.741:212): arch=c00000b7 syscall=5 success=no exit=-22 a0=4000489bf0 a1=4000c3a720 a2=4000489bc0 a3=25 items=0 ppid=1 pid=2619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.741000 audit[2619]: SYSCALL arch=c00000b7 syscall=5 success=no exit=-22 a0=4000489bf0 a1=4000c3a720 a2=4000489bc0 a3=25 items=0 ppid=1 pid=2619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.754315 kubelet[2619]: I0317 18:20:02.754287 2619 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 18:20:02.755127 kubelet[2619]: I0317 18:20:02.755092 2619 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 18:20:02.755427 kubelet[2619]: I0317 18:20:02.755408 2619 reconciler.go:26] "Reconciler: start to sync state" Mar 17 18:20:02.741000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:20:02.773311 kernel: audit: type=1327 audit(1742235602.741:212): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:20:02.741000 audit[2619]: AVC avc: denied { mac_admin } for pid=2619 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:20:02.741000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:20:02.741000 audit[2619]: SYSCALL arch=c00000b7 syscall=5 success=no exit=-22 a0=400096e960 a1=4000c3a738 a2=4000489c80 a3=25 items=0 ppid=1 pid=2619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.741000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:20:02.746000 audit[2630]: NETFILTER_CFG table=mangle:26 family=2 entries=2 op=nft_register_chain pid=2630 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:02.746000 audit[2630]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffe3f6e2a0 a2=0 a3=1 items=0 ppid=2619 pid=2630 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.746000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Mar 17 18:20:02.752000 audit[2631]: NETFILTER_CFG table=filter:27 family=2 entries=1 op=nft_register_chain pid=2631 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:02.752000 audit[2631]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffc82a030 a2=0 a3=1 items=0 ppid=2619 pid=2631 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.752000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Mar 17 18:20:02.758000 audit[2633]: NETFILTER_CFG table=filter:28 family=2 entries=2 op=nft_register_chain pid=2633 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:02.758000 audit[2633]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffd5541f60 a2=0 a3=1 items=0 ppid=2619 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.758000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:20:02.763000 audit[2635]: NETFILTER_CFG table=filter:29 family=2 entries=2 op=nft_register_chain pid=2635 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:02.763000 audit[2635]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffca2bf280 a2=0 a3=1 items=0 ppid=2619 pid=2635 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.763000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:20:02.775059 kubelet[2619]: W0317 18:20:02.774029 2619 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.21.220:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:02.775059 kubelet[2619]: E0317 18:20:02.774110 2619 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.31.21.220:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:02.775059 kubelet[2619]: E0317 18:20:02.774219 2619 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.21.220:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-21-220?timeout=10s\": dial tcp 172.31.21.220:6443: connect: connection refused" interval="200ms" Mar 17 18:20:02.775849 kubelet[2619]: I0317 18:20:02.775737 2619 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 18:20:02.776260 kubelet[2619]: I0317 18:20:02.776199 2619 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 18:20:02.778241 kubelet[2619]: I0317 18:20:02.778196 2619 server.go:455] "Adding debug handlers to kubelet server" Mar 17 
18:20:02.780702 kubelet[2619]: I0317 18:20:02.776253 2619 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 18:20:02.781231 kubelet[2619]: I0317 18:20:02.781199 2619 factory.go:221] Registration of the systemd container factory successfully Mar 17 18:20:02.783065 kubelet[2619]: I0317 18:20:02.782991 2619 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 18:20:02.785171 kubelet[2619]: E0317 18:20:02.785129 2619 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 18:20:02.789206 kubelet[2619]: I0317 18:20:02.786749 2619 factory.go:221] Registration of the containerd container factory successfully Mar 17 18:20:02.813000 audit[2640]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2640 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:02.813000 audit[2640]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffcfc09a10 a2=0 a3=1 items=0 ppid=2619 pid=2640 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.813000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Mar 17 18:20:02.815425 kubelet[2619]: I0317 18:20:02.815373 2619 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 18:20:02.817000 audit[2642]: NETFILTER_CFG table=mangle:31 family=10 entries=2 op=nft_register_chain pid=2642 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:02.817000 audit[2642]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffed429420 a2=0 a3=1 items=0 ppid=2619 pid=2642 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.817000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Mar 17 18:20:02.819506 kubelet[2619]: I0317 18:20:02.819463 2619 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 18:20:02.819812 kubelet[2619]: I0317 18:20:02.819789 2619 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 18:20:02.819955 kubelet[2619]: I0317 18:20:02.819934 2619 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 18:20:02.820170 kubelet[2619]: E0317 18:20:02.820135 2619 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 18:20:02.821171 kubelet[2619]: W0317 18:20:02.821042 2619 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.21.220:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:02.821499 kubelet[2619]: E0317 18:20:02.821456 2619 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.31.21.220:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:02.824000 audit[2643]: NETFILTER_CFG table=mangle:32 family=2 entries=1 op=nft_register_chain pid=2643 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:02.824000 audit[2643]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc722e2a0 a2=0 a3=1 items=0 ppid=2619 pid=2643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.824000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Mar 17 18:20:02.826000 audit[2644]: NETFILTER_CFG table=mangle:33 family=10 entries=1 op=nft_register_chain pid=2644 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:02.826000 audit[2644]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd31e1900 a2=0 a3=1 items=0 ppid=2619 pid=2644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.826000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Mar 17 18:20:02.828000 audit[2646]: NETFILTER_CFG table=nat:34 family=2 entries=1 op=nft_register_chain pid=2646 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:02.828000 audit[2646]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff03645e0 a2=0 a3=1 items=0 ppid=2619 pid=2646 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.828000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Mar 17 18:20:02.832000 audit[2647]: NETFILTER_CFG table=nat:35 family=10 entries=2 op=nft_register_chain pid=2647 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:02.832000 audit[2647]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=128 a0=3 a1=ffffc993d2d0 a2=0 a3=1 items=0 ppid=2619 pid=2647 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.832000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Mar 17 18:20:02.834000 audit[2648]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_chain pid=2648 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:02.834000 audit[2648]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd730d8a0 a2=0 a3=1 items=0 ppid=2619 pid=2648 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.834000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Mar 17 18:20:02.838000 audit[2649]: NETFILTER_CFG table=filter:37 family=10 entries=2 op=nft_register_chain pid=2649 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:02.838000 audit[2649]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffda246690 a2=0 a3=1 items=0 ppid=2619 pid=2649 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.838000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Mar 17 18:20:02.852882 kubelet[2619]: I0317 18:20:02.852825 2619 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 18:20:02.852882 kubelet[2619]: I0317 18:20:02.852864 2619 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 18:20:02.853080 kubelet[2619]: I0317 18:20:02.852899 2619 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:20:02.855375 kubelet[2619]: I0317 18:20:02.855317 2619 policy_none.go:49] "None policy: Start" Mar 17 18:20:02.857642 kubelet[2619]: I0317 18:20:02.857603 2619 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 18:20:02.857642 kubelet[2619]: I0317 18:20:02.857647 2619 state_mem.go:35] "Initializing new in-memory state store" Mar 17 18:20:02.858990 kubelet[2619]: I0317 18:20:02.858934 2619 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-21-220" Mar 17 18:20:02.859682 kubelet[2619]: E0317 18:20:02.859635 2619 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.31.21.220:6443/api/v1/nodes\": dial tcp 172.31.21.220:6443: connect: connection refused" node="ip-172-31-21-220" Mar 17 18:20:02.866583 kubelet[2619]: I0317 18:20:02.866536 2619 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 18:20:02.866000 audit[2619]: AVC avc: denied { mac_admin } for pid=2619 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:20:02.866000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:20:02.866000 audit[2619]: SYSCALL arch=c00000b7 syscall=5 success=no exit=-22 a0=4000e752f0 a1=4000e5f188 a2=4000e752c0 a3=25 items=0 ppid=1 pid=2619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:02.866000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:20:02.867220 kubelet[2619]: I0317 18:20:02.867186 2619 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Mar 17 18:20:02.867626 kubelet[2619]: I0317 18:20:02.867546 2619 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 18:20:02.867872 kubelet[2619]: I0317 18:20:02.867853 2619 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 18:20:02.875157 kubelet[2619]: E0317 18:20:02.875115 2619 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-172-31-21-220\" not found" Mar 17 18:20:02.926188 kubelet[2619]: I0317 18:20:02.921019 2619 topology_manager.go:215] "Topology Admit Handler" podUID="ed8ccfaf6ebf40080f411f94f1d968fa" podNamespace="kube-system" podName="kube-apiserver-ip-172-31-21-220" Mar 17 18:20:02.927806 kubelet[2619]: I0317 18:20:02.927767 2619 topology_manager.go:215] "Topology Admit Handler" podUID="df23926ade792ba2387023434175b398" podNamespace="kube-system" podName="kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:02.929876 kubelet[2619]: I0317 18:20:02.929837 2619 topology_manager.go:215] "Topology Admit Handler" podUID="beec6036bc55beb6629ef7debd0a7202" podNamespace="kube-system" podName="kube-scheduler-ip-172-31-21-220" Mar 17 18:20:02.975639 kubelet[2619]: E0317 18:20:02.975580 2619 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.21.220:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-21-220?timeout=10s\": dial tcp 172.31.21.220:6443: connect: connection refused" interval="400ms" Mar 17 18:20:03.057140 kubelet[2619]: I0317 18:20:03.057079 2619 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:03.057293 kubelet[2619]: I0317 18:20:03.057149 2619 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/beec6036bc55beb6629ef7debd0a7202-kubeconfig\") pod \"kube-scheduler-ip-172-31-21-220\" (UID: \"beec6036bc55beb6629ef7debd0a7202\") " pod="kube-system/kube-scheduler-ip-172-31-21-220" Mar 17 18:20:03.057293 kubelet[2619]: I0317 18:20:03.057191 2619 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ed8ccfaf6ebf40080f411f94f1d968fa-ca-certs\") pod \"kube-apiserver-ip-172-31-21-220\" (UID: \"ed8ccfaf6ebf40080f411f94f1d968fa\") " pod="kube-system/kube-apiserver-ip-172-31-21-220" Mar 17 18:20:03.057293 kubelet[2619]: I0317 18:20:03.057229 2619 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ed8ccfaf6ebf40080f411f94f1d968fa-k8s-certs\") pod \"kube-apiserver-ip-172-31-21-220\" (UID: \"ed8ccfaf6ebf40080f411f94f1d968fa\") " pod="kube-system/kube-apiserver-ip-172-31-21-220" Mar 17 18:20:03.057293 kubelet[2619]: I0317 18:20:03.057265 2619 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-ca-certs\") pod \"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:03.057597 kubelet[2619]: I0317 18:20:03.057300 2619 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:03.057597 kubelet[2619]: I0317 18:20:03.057352 2619 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-k8s-certs\") pod \"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:03.057597 kubelet[2619]: I0317 18:20:03.057393 2619 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-kubeconfig\") pod \"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:03.057597 kubelet[2619]: I0317 18:20:03.057433 2619 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ed8ccfaf6ebf40080f411f94f1d968fa-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-21-220\" (UID: \"ed8ccfaf6ebf40080f411f94f1d968fa\") " pod="kube-system/kube-apiserver-ip-172-31-21-220" Mar 17 18:20:03.062015 kubelet[2619]: I0317 18:20:03.061962 2619 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-21-220" Mar 17 18:20:03.062635 kubelet[2619]: E0317 18:20:03.062568 2619 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.31.21.220:6443/api/v1/nodes\": dial tcp 172.31.21.220:6443: connect: connection refused" node="ip-172-31-21-220" Mar 17 18:20:03.242474 env[1923]: time="2025-03-17T18:20:03.242420078Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-21-220,Uid:beec6036bc55beb6629ef7debd0a7202,Namespace:kube-system,Attempt:0,}" Mar 17 18:20:03.244087 env[1923]: time="2025-03-17T18:20:03.243994373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-21-220,Uid:ed8ccfaf6ebf40080f411f94f1d968fa,Namespace:kube-system,Attempt:0,}" Mar 17 18:20:03.249673 env[1923]: time="2025-03-17T18:20:03.249577523Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-21-220,Uid:df23926ade792ba2387023434175b398,Namespace:kube-system,Attempt:0,}" Mar 17 18:20:03.377155 
kubelet[2619]: E0317 18:20:03.377025 2619 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.21.220:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-21-220?timeout=10s\": dial tcp 172.31.21.220:6443: connect: connection refused" interval="800ms" Mar 17 18:20:03.465814 kubelet[2619]: I0317 18:20:03.465164 2619 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-21-220" Mar 17 18:20:03.465814 kubelet[2619]: E0317 18:20:03.465755 2619 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.31.21.220:6443/api/v1/nodes\": dial tcp 172.31.21.220:6443: connect: connection refused" node="ip-172-31-21-220" Mar 17 18:20:03.702506 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2652437889.mount: Deactivated successfully. Mar 17 18:20:03.715015 env[1923]: time="2025-03-17T18:20:03.714943623Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.716858 env[1923]: time="2025-03-17T18:20:03.716812117Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.719697 env[1923]: time="2025-03-17T18:20:03.719650911Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.721831 env[1923]: time="2025-03-17T18:20:03.721601975Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7d46a07936af93fcce097459055f93ab07331509aa55f4a2a90d95a3ace1850e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.730841 env[1923]: time="2025-03-17T18:20:03.730763572Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:7d46a07936af93fcce097459055f93ab07331509aa55f4a2a90d95a3ace1850e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.733877 env[1923]: time="2025-03-17T18:20:03.733809086Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:7d46a07936af93fcce097459055f93ab07331509aa55f4a2a90d95a3ace1850e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.739451 env[1923]: time="2025-03-17T18:20:03.739398403Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.745935 env[1923]: time="2025-03-17T18:20:03.745858315Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.750049 env[1923]: time="2025-03-17T18:20:03.749996143Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.751702 env[1923]: time="2025-03-17T18:20:03.751656897Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.753289 env[1923]: time="2025-03-17T18:20:03.753246216Z" level=info msg="ImageUpdate event 
&ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.754928 env[1923]: time="2025-03-17T18:20:03.754880679Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:03.795489 env[1923]: time="2025-03-17T18:20:03.788243069Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:20:03.795489 env[1923]: time="2025-03-17T18:20:03.788309704Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:20:03.795489 env[1923]: time="2025-03-17T18:20:03.788361831Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:20:03.795489 env[1923]: time="2025-03-17T18:20:03.788614893Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/9ea0a60e45ff5907486d6a75552b273e0de7d3c94e8cc1ec58c7d39a50869d78 pid=2656 runtime=io.containerd.runc.v2 Mar 17 18:20:03.832415 env[1923]: time="2025-03-17T18:20:03.832221990Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:20:03.832415 env[1923]: time="2025-03-17T18:20:03.832298873Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:20:03.832679 env[1923]: time="2025-03-17T18:20:03.832375167Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:20:03.832679 env[1923]: time="2025-03-17T18:20:03.832428446Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:20:03.832679 env[1923]: time="2025-03-17T18:20:03.832453777Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:20:03.833142 env[1923]: time="2025-03-17T18:20:03.832324948Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:20:03.833411 env[1923]: time="2025-03-17T18:20:03.833293376Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/d2dca5d5e7d790c5f54625c7d367d13e92016bcd55c69591ceffafe4906e4fd7 pid=2687 runtime=io.containerd.runc.v2 Mar 17 18:20:03.834092 env[1923]: time="2025-03-17T18:20:03.833957491Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/534c095e71a74cb9d27a359da503cfe2c786957a17113766ee63748d12f1b270 pid=2686 runtime=io.containerd.runc.v2 Mar 17 18:20:03.944800 kubelet[2619]: W0317 18:20:03.944590 2619 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.31.21.220:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:03.944800 kubelet[2619]: E0317 18:20:03.944749 2619 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.31.21.220:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:04.000364 env[1923]: time="2025-03-17T18:20:03.998724815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ip-172-31-21-220,Uid:df23926ade792ba2387023434175b398,Namespace:kube-system,Attempt:0,} returns sandbox id \"d2dca5d5e7d790c5f54625c7d367d13e92016bcd55c69591ceffafe4906e4fd7\"" Mar 17 18:20:04.012090 env[1923]: time="2025-03-17T18:20:04.012027803Z" level=info msg="CreateContainer within sandbox \"d2dca5d5e7d790c5f54625c7d367d13e92016bcd55c69591ceffafe4906e4fd7\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 17 18:20:04.046211 env[1923]: time="2025-03-17T18:20:04.046151650Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ip-172-31-21-220,Uid:beec6036bc55beb6629ef7debd0a7202,Namespace:kube-system,Attempt:0,} returns sandbox id \"9ea0a60e45ff5907486d6a75552b273e0de7d3c94e8cc1ec58c7d39a50869d78\"" Mar 17 18:20:04.051116 env[1923]: time="2025-03-17T18:20:04.051058331Z" level=info msg="CreateContainer within sandbox \"9ea0a60e45ff5907486d6a75552b273e0de7d3c94e8cc1ec58c7d39a50869d78\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 17 18:20:04.062780 env[1923]: time="2025-03-17T18:20:04.062697757Z" level=info msg="CreateContainer within sandbox \"d2dca5d5e7d790c5f54625c7d367d13e92016bcd55c69591ceffafe4906e4fd7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"b4ca2a1c7d6e8f740802cb2c038e0e6dd5979b4c91a1408b885791cab295bdb2\"" Mar 17 18:20:04.063957 env[1923]: time="2025-03-17T18:20:04.063559364Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ip-172-31-21-220,Uid:ed8ccfaf6ebf40080f411f94f1d968fa,Namespace:kube-system,Attempt:0,} returns sandbox id \"534c095e71a74cb9d27a359da503cfe2c786957a17113766ee63748d12f1b270\"" Mar 17 18:20:04.063957 env[1923]: time="2025-03-17T18:20:04.063895154Z" level=info msg="StartContainer for \"b4ca2a1c7d6e8f740802cb2c038e0e6dd5979b4c91a1408b885791cab295bdb2\"" Mar 17 18:20:04.072791 env[1923]: time="2025-03-17T18:20:04.072667164Z" level=info msg="CreateContainer within sandbox \"534c095e71a74cb9d27a359da503cfe2c786957a17113766ee63748d12f1b270\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 17 18:20:04.087859 env[1923]: 
time="2025-03-17T18:20:04.087779395Z" level=info msg="CreateContainer within sandbox \"9ea0a60e45ff5907486d6a75552b273e0de7d3c94e8cc1ec58c7d39a50869d78\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"086cddaa54e138fa6d8b15f16d20824a013aea32885dc850b21e9bbbf76f53ea\"" Mar 17 18:20:04.088644 env[1923]: time="2025-03-17T18:20:04.088544896Z" level=info msg="StartContainer for \"086cddaa54e138fa6d8b15f16d20824a013aea32885dc850b21e9bbbf76f53ea\"" Mar 17 18:20:04.090764 kubelet[2619]: W0317 18:20:04.090577 2619 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.31.21.220:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:04.090764 kubelet[2619]: E0317 18:20:04.090728 2619 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.31.21.220:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:04.096482 env[1923]: time="2025-03-17T18:20:04.096413899Z" level=info msg="CreateContainer within sandbox \"534c095e71a74cb9d27a359da503cfe2c786957a17113766ee63748d12f1b270\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f271ec2ec1f413f3d6f38380cdcaae9f6f61b593d69f5fd59d4127de1cca0f75\"" Mar 17 18:20:04.097524 env[1923]: time="2025-03-17T18:20:04.097470815Z" level=info msg="StartContainer for \"f271ec2ec1f413f3d6f38380cdcaae9f6f61b593d69f5fd59d4127de1cca0f75\"" Mar 17 18:20:04.108674 kubelet[2619]: W0317 18:20:04.108133 2619 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.31.21.220:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-21-220&limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:04.108674 kubelet[2619]: E0317 18:20:04.108459 2619 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.31.21.220:6443/api/v1/nodes?fieldSelector=metadata.name%3Dip-172-31-21-220&limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:04.183670 kubelet[2619]: E0317 18:20:04.183559 2619 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.31.21.220:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ip-172-31-21-220?timeout=10s\": dial tcp 172.31.21.220:6443: connect: connection refused" interval="1.6s" Mar 17 18:20:04.279489 kubelet[2619]: I0317 18:20:04.277245 2619 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-21-220" Mar 17 18:20:04.279922 kubelet[2619]: E0317 18:20:04.279798 2619 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.31.21.220:6443/api/v1/nodes\": dial tcp 172.31.21.220:6443: connect: connection refused" node="ip-172-31-21-220" Mar 17 18:20:04.290358 env[1923]: time="2025-03-17T18:20:04.289953672Z" level=info msg="StartContainer for \"b4ca2a1c7d6e8f740802cb2c038e0e6dd5979b4c91a1408b885791cab295bdb2\" returns successfully" Mar 17 18:20:04.313920 env[1923]: time="2025-03-17T18:20:04.313848617Z" level=info msg="StartContainer for \"086cddaa54e138fa6d8b15f16d20824a013aea32885dc850b21e9bbbf76f53ea\" returns successfully" Mar 17 18:20:04.347278 kubelet[2619]: W0317 18:20:04.347143 2619 
reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.31.21.220:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:04.347278 kubelet[2619]: E0317 18:20:04.347240 2619 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.31.21.220:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.31.21.220:6443: connect: connection refused Mar 17 18:20:04.363062 env[1923]: time="2025-03-17T18:20:04.362981137Z" level=info msg="StartContainer for \"f271ec2ec1f413f3d6f38380cdcaae9f6f61b593d69f5fd59d4127de1cca0f75\" returns successfully" Mar 17 18:20:05.882508 kubelet[2619]: I0317 18:20:05.882459 2619 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-21-220" Mar 17 18:20:08.480473 update_engine[1909]: I0317 18:20:08.480412 1909 update_attempter.cc:509] Updating boot flags... Mar 17 18:20:08.502383 kubelet[2619]: E0317 18:20:08.500126 2619 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-172-31-21-220\" not found" node="ip-172-31-21-220" Mar 17 18:20:08.566954 kubelet[2619]: I0317 18:20:08.560597 2619 kubelet_node_status.go:76] "Successfully registered node" node="ip-172-31-21-220" Mar 17 18:20:08.591365 kubelet[2619]: E0317 18:20:08.591015 2619 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ip-172-31-21-220.182daa0cc6cc4b60 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-21-220,UID:ip-172-31-21-220,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-172-31-21-220,},FirstTimestamp:2025-03-17 18:20:02.713856864 +0000 UTC m=+0.630429516,LastTimestamp:2025-03-17 18:20:02.713856864 +0000 UTC m=+0.630429516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-21-220,}" Mar 17 18:20:08.714011 kubelet[2619]: E0317 18:20:08.713648 2619 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ip-172-31-21-220.182daa0ccb0b5468 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-172-31-21-220,UID:ip-172-31-21-220,APIVersion:,ResourceVersion:,FieldPath:,},Reason:InvalidDiskCapacity,Message:invalid capacity 0 on image filesystem,Source:EventSource{Component:kubelet,Host:ip-172-31-21-220,},FirstTimestamp:2025-03-17 18:20:02.785096808 +0000 UTC m=+0.701669484,LastTimestamp:2025-03-17 18:20:02.785096808 +0000 UTC m=+0.701669484,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-172-31-21-220,}" Mar 17 18:20:08.728843 kubelet[2619]: I0317 18:20:08.728439 2619 apiserver.go:52] "Watching apiserver" Mar 17 18:20:08.755860 kubelet[2619]: I0317 18:20:08.755610 2619 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 18:20:11.056254 systemd[1]: Reloading. 
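By 18:20:08 the restarted kubelet has registered the node ("Successfully registered node"), so the earlier lease and registration retries against 172.31.21.220:6443 have resolved. With the control plane reachable, the registration and the node lease the kubelet was retrying can be checked directly; a sketch, assuming an admin kubeconfig:

    kubectl get node ip-172-31-21-220 -o wide
    kubectl -n kube-node-lease get lease ip-172-31-21-220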
Mar 17 18:20:11.176220 /usr/lib/systemd/system-generators/torcx-generator[3091]: time="2025-03-17T18:20:11Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 18:20:11.176998 /usr/lib/systemd/system-generators/torcx-generator[3091]: time="2025-03-17T18:20:11Z" level=info msg="torcx already run" Mar 17 18:20:11.387261 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:20:11.387301 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:20:11.431316 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:20:11.636012 systemd[1]: Stopping kubelet.service... Mar 17 18:20:11.655594 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 18:20:11.656196 systemd[1]: Stopped kubelet.service. Mar 17 18:20:11.667440 kernel: kauditd_printk_skb: 44 callbacks suppressed Mar 17 18:20:11.667583 kernel: audit: type=1131 audit(1742235611.655:227): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:11.655000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:11.660828 systemd[1]: Starting kubelet.service... Mar 17 18:20:11.950247 systemd[1]: Started kubelet.service. Mar 17 18:20:11.951000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:11.982366 kernel: audit: type=1130 audit(1742235611.951:228): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:12.108718 kubelet[3162]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:20:12.109284 kubelet[3162]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 18:20:12.109413 kubelet[3162]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
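The deprecation warnings from the new kubelet process (pid 3162) only note that --container-runtime-endpoint and --volume-plugin-dir now belong in the file passed to --config; the flags still take effect. Below is a hypothetical excerpt of what those settings look like in /var/lib/kubelet/config.yaml on a v1.30 kubelet: the socket path is assumed, the plugin directory is the one named in this log, and --pod-infra-container-image has no config-file equivalent (as its own message indicates, the sandbox image is taken from the CRI/containerd side instead):

    grep -E 'containerRuntimeEndpoint|volumePluginDir' /var/lib/kubelet/config.yaml
    #   apiVersion: kubelet.config.k8s.io/v1beta1
    #   kind: KubeletConfiguration
    #   containerRuntimeEndpoint: unix:///run/containerd/containerd.sock   # assumed socket path
    #   volumePluginDir: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/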
Mar 17 18:20:12.109646 kubelet[3162]: I0317 18:20:12.109589 3162 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 18:20:12.119101 kubelet[3162]: I0317 18:20:12.119059 3162 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 18:20:12.119348 kubelet[3162]: I0317 18:20:12.119275 3162 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 18:20:12.119803 kubelet[3162]: I0317 18:20:12.119779 3162 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 18:20:12.129608 kubelet[3162]: I0317 18:20:12.129551 3162 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 17 18:20:12.134158 kubelet[3162]: I0317 18:20:12.134115 3162 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 18:20:12.148008 kubelet[3162]: I0317 18:20:12.147962 3162 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 17 18:20:12.149201 kubelet[3162]: I0317 18:20:12.149139 3162 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 18:20:12.149783 kubelet[3162]: I0317 18:20:12.149458 3162 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-172-31-21-220","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 18:20:12.150032 kubelet[3162]: I0317 18:20:12.150008 3162 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 18:20:12.150237 kubelet[3162]: I0317 18:20:12.150218 3162 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 18:20:12.150476 kubelet[3162]: I0317 18:20:12.150455 3162 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:20:12.150866 kubelet[3162]: I0317 18:20:12.150843 3162 kubelet.go:400] "Attempting to sync node with API server" Mar 17 18:20:12.151840 kubelet[3162]: I0317 18:20:12.151813 3162 kubelet.go:301] "Adding static 
pod path" path="/etc/kubernetes/manifests" Mar 17 18:20:12.152082 kubelet[3162]: I0317 18:20:12.152061 3162 kubelet.go:312] "Adding apiserver pod source" Mar 17 18:20:12.154482 kubelet[3162]: I0317 18:20:12.154430 3162 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 18:20:12.160678 kubelet[3162]: I0317 18:20:12.160644 3162 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Mar 17 18:20:12.161324 kubelet[3162]: I0317 18:20:12.161303 3162 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 18:20:12.162963 kubelet[3162]: I0317 18:20:12.162922 3162 server.go:1264] "Started kubelet" Mar 17 18:20:12.166000 audit[3162]: AVC avc: denied { mac_admin } for pid=3162 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:20:12.174612 kernel: audit: type=1400 audit(1742235612.166:229): avc: denied { mac_admin } for pid=3162 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:20:12.166000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:20:12.179708 kubelet[3162]: I0317 18:20:12.179657 3162 kubelet.go:1419] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Mar 17 18:20:12.179929 kubelet[3162]: I0317 18:20:12.179901 3162 kubelet.go:1423] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Mar 17 18:20:12.180104 kubelet[3162]: I0317 18:20:12.180084 3162 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 18:20:12.192375 kernel: audit: type=1401 audit(1742235612.166:229): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:20:12.192502 kernel: audit: type=1300 audit(1742235612.166:229): arch=c00000b7 syscall=5 success=no exit=-22 a0=4000c26c30 a1=4000bd7158 a2=4000c26c00 a3=25 items=0 ppid=1 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:12.166000 audit[3162]: SYSCALL arch=c00000b7 syscall=5 success=no exit=-22 a0=4000c26c30 a1=4000bd7158 a2=4000c26c00 a3=25 items=0 ppid=1 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:12.192698 kubelet[3162]: I0317 18:20:12.183842 3162 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 18:20:12.192698 kubelet[3162]: I0317 18:20:12.185382 3162 server.go:455] "Adding debug handlers to kubelet server" Mar 17 18:20:12.166000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:20:12.207386 kernel: audit: type=1327 audit(1742235612.166:229): 
proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:20:12.207584 kubelet[3162]: I0317 18:20:12.196456 3162 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 18:20:12.207584 kubelet[3162]: I0317 18:20:12.197123 3162 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 18:20:12.220832 kernel: audit: type=1400 audit(1742235612.179:230): avc: denied { mac_admin } for pid=3162 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:20:12.179000 audit[3162]: AVC avc: denied { mac_admin } for pid=3162 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:20:12.221270 kubelet[3162]: I0317 18:20:12.217708 3162 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 18:20:12.221270 kubelet[3162]: I0317 18:20:12.220275 3162 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 18:20:12.222376 kubelet[3162]: I0317 18:20:12.222307 3162 reconciler.go:26] "Reconciler: start to sync state" Mar 17 18:20:12.227427 kernel: audit: type=1401 audit(1742235612.179:230): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:20:12.179000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:20:12.179000 audit[3162]: SYSCALL arch=c00000b7 syscall=5 success=no exit=-22 a0=4000bd9260 a1=4000bd7170 a2=4000c26cc0 a3=25 items=0 ppid=1 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:12.240631 kernel: audit: type=1300 audit(1742235612.179:230): arch=c00000b7 syscall=5 success=no exit=-22 a0=4000bd9260 a1=4000bd7170 a2=4000c26cc0 a3=25 items=0 ppid=1 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:12.179000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:20:12.254057 kernel: audit: type=1327 audit(1742235612.179:230): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:20:12.280153 kubelet[3162]: I0317 18:20:12.280116 3162 factory.go:221] Registration of the containerd container factory successfully Mar 17 18:20:12.280387 kubelet[3162]: I0317 18:20:12.280365 3162 factory.go:221] Registration of the systemd container factory successfully Mar 17 18:20:12.280627 kubelet[3162]: I0317 18:20:12.280592 3162 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": 
dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 18:20:12.280932 kubelet[3162]: E0317 18:20:12.280888 3162 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 18:20:12.288322 kubelet[3162]: I0317 18:20:12.288218 3162 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 18:20:12.307222 kubelet[3162]: I0317 18:20:12.307181 3162 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 18:20:12.310221 kubelet[3162]: I0317 18:20:12.310190 3162 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 18:20:12.315312 kubelet[3162]: I0317 18:20:12.315267 3162 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 18:20:12.315703 kubelet[3162]: E0317 18:20:12.315659 3162 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 18:20:12.331701 kubelet[3162]: I0317 18:20:12.331650 3162 kubelet_node_status.go:73] "Attempting to register node" node="ip-172-31-21-220" Mar 17 18:20:12.352848 kubelet[3162]: I0317 18:20:12.352810 3162 kubelet_node_status.go:112] "Node was previously registered" node="ip-172-31-21-220" Mar 17 18:20:12.353125 kubelet[3162]: I0317 18:20:12.353104 3162 kubelet_node_status.go:76] "Successfully registered node" node="ip-172-31-21-220" Mar 17 18:20:12.417000 kubelet[3162]: E0317 18:20:12.416938 3162 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 17 18:20:12.444031 kubelet[3162]: I0317 18:20:12.443963 3162 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 18:20:12.444031 kubelet[3162]: I0317 18:20:12.443998 3162 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 18:20:12.444031 kubelet[3162]: I0317 18:20:12.444034 3162 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:20:12.444319 kubelet[3162]: I0317 18:20:12.444282 3162 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 17 18:20:12.444524 kubelet[3162]: I0317 18:20:12.444302 3162 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 17 18:20:12.444524 kubelet[3162]: I0317 18:20:12.444364 3162 policy_none.go:49] "None policy: Start" Mar 17 18:20:12.446417 kubelet[3162]: I0317 18:20:12.446295 3162 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 18:20:12.446612 kubelet[3162]: I0317 18:20:12.446591 3162 state_mem.go:35] "Initializing new in-memory state store" Mar 17 18:20:12.447154 kubelet[3162]: I0317 18:20:12.447130 3162 state_mem.go:75] "Updated machine memory state" Mar 17 18:20:12.450044 kubelet[3162]: I0317 18:20:12.450005 3162 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 18:20:12.449000 audit[3162]: AVC avc: denied { mac_admin } for pid=3162 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:20:12.449000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:20:12.449000 audit[3162]: SYSCALL arch=c00000b7 syscall=5 success=no exit=-22 a0=4000df6cf0 a1=4000f09968 a2=4000df6cc0 a3=25 items=0 ppid=1 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" 
subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:12.449000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:20:12.450867 kubelet[3162]: I0317 18:20:12.450831 3162 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Mar 17 18:20:12.451277 kubelet[3162]: I0317 18:20:12.451221 3162 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 18:20:12.456312 kubelet[3162]: I0317 18:20:12.456234 3162 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 18:20:12.617458 kubelet[3162]: I0317 18:20:12.617372 3162 topology_manager.go:215] "Topology Admit Handler" podUID="ed8ccfaf6ebf40080f411f94f1d968fa" podNamespace="kube-system" podName="kube-apiserver-ip-172-31-21-220" Mar 17 18:20:12.617651 kubelet[3162]: I0317 18:20:12.617566 3162 topology_manager.go:215] "Topology Admit Handler" podUID="df23926ade792ba2387023434175b398" podNamespace="kube-system" podName="kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:12.617651 kubelet[3162]: I0317 18:20:12.617642 3162 topology_manager.go:215] "Topology Admit Handler" podUID="beec6036bc55beb6629ef7debd0a7202" podNamespace="kube-system" podName="kube-scheduler-ip-172-31-21-220" Mar 17 18:20:12.627941 kubelet[3162]: I0317 18:20:12.627424 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ed8ccfaf6ebf40080f411f94f1d968fa-k8s-certs\") pod \"kube-apiserver-ip-172-31-21-220\" (UID: \"ed8ccfaf6ebf40080f411f94f1d968fa\") " pod="kube-system/kube-apiserver-ip-172-31-21-220" Mar 17 18:20:12.627941 kubelet[3162]: I0317 18:20:12.627513 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ed8ccfaf6ebf40080f411f94f1d968fa-usr-share-ca-certificates\") pod \"kube-apiserver-ip-172-31-21-220\" (UID: \"ed8ccfaf6ebf40080f411f94f1d968fa\") " pod="kube-system/kube-apiserver-ip-172-31-21-220" Mar 17 18:20:12.627941 kubelet[3162]: I0317 18:20:12.627562 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-ca-certs\") pod \"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:12.627941 kubelet[3162]: I0317 18:20:12.627619 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-k8s-certs\") pod \"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:12.627941 kubelet[3162]: I0317 18:20:12.627657 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-kubeconfig\") pod 
\"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:12.628676 kubelet[3162]: I0317 18:20:12.627692 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-usr-share-ca-certificates\") pod \"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:12.628676 kubelet[3162]: I0317 18:20:12.627730 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ed8ccfaf6ebf40080f411f94f1d968fa-ca-certs\") pod \"kube-apiserver-ip-172-31-21-220\" (UID: \"ed8ccfaf6ebf40080f411f94f1d968fa\") " pod="kube-system/kube-apiserver-ip-172-31-21-220" Mar 17 18:20:12.628676 kubelet[3162]: I0317 18:20:12.627766 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/df23926ade792ba2387023434175b398-flexvolume-dir\") pod \"kube-controller-manager-ip-172-31-21-220\" (UID: \"df23926ade792ba2387023434175b398\") " pod="kube-system/kube-controller-manager-ip-172-31-21-220" Mar 17 18:20:12.628676 kubelet[3162]: I0317 18:20:12.627815 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/beec6036bc55beb6629ef7debd0a7202-kubeconfig\") pod \"kube-scheduler-ip-172-31-21-220\" (UID: \"beec6036bc55beb6629ef7debd0a7202\") " pod="kube-system/kube-scheduler-ip-172-31-21-220" Mar 17 18:20:12.632697 kubelet[3162]: E0317 18:20:12.632626 3162 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ip-172-31-21-220\" already exists" pod="kube-system/kube-apiserver-ip-172-31-21-220" Mar 17 18:20:13.155554 kubelet[3162]: I0317 18:20:13.155495 3162 apiserver.go:52] "Watching apiserver" Mar 17 18:20:13.220959 kubelet[3162]: I0317 18:20:13.220911 3162 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 18:20:13.482008 kubelet[3162]: I0317 18:20:13.481883 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ip-172-31-21-220" podStartSLOduration=2.481837669 podStartE2EDuration="2.481837669s" podCreationTimestamp="2025-03-17 18:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:20:13.477177286 +0000 UTC m=+1.513044678" watchObservedRunningTime="2025-03-17 18:20:13.481837669 +0000 UTC m=+1.517705049" Mar 17 18:20:13.545930 kubelet[3162]: I0317 18:20:13.545856 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ip-172-31-21-220" podStartSLOduration=1.545832887 podStartE2EDuration="1.545832887s" podCreationTimestamp="2025-03-17 18:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:20:13.517749287 +0000 UTC m=+1.553616679" watchObservedRunningTime="2025-03-17 18:20:13.545832887 +0000 UTC m=+1.581700291" Mar 17 18:20:13.588806 kubelet[3162]: I0317 18:20:13.588690 3162 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kube-system/kube-scheduler-ip-172-31-21-220" podStartSLOduration=1.58864388 podStartE2EDuration="1.58864388s" podCreationTimestamp="2025-03-17 18:20:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:20:13.54706796 +0000 UTC m=+1.582935376" watchObservedRunningTime="2025-03-17 18:20:13.58864388 +0000 UTC m=+1.624511272" Mar 17 18:20:18.969316 sudo[2231]: pam_unix(sudo:session): session closed for user root Mar 17 18:20:18.973380 kernel: kauditd_printk_skb: 4 callbacks suppressed Mar 17 18:20:18.973482 kernel: audit: type=1106 audit(1742235618.968:232): pid=2231 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:20:18.968000 audit[2231]: USER_END pid=2231 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:20:18.968000 audit[2231]: CRED_DISP pid=2231 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:20:18.987595 kernel: audit: type=1104 audit(1742235618.968:233): pid=2231 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:20:19.003814 sshd[2227]: pam_unix(sshd:session): session closed for user core Mar 17 18:20:19.004000 audit[2227]: USER_END pid=2227 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:20:19.009518 systemd-logind[1906]: Session 7 logged out. Waiting for processes to exit. Mar 17 18:20:19.012819 systemd[1]: sshd@6-172.31.21.220:22-139.178.89.65:37448.service: Deactivated successfully. Mar 17 18:20:19.014141 systemd[1]: session-7.scope: Deactivated successfully. Mar 17 18:20:19.005000 audit[2227]: CRED_DISP pid=2227 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:20:19.017652 systemd-logind[1906]: Removed session 7. 
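The AVC and SELINUX_ERR records above show the kubelet (pid 3162) being denied capability 33, CAP_MAC_ADMIN, while trying to set an SELinux label on its plugin directories, which is why the "Unprivileged containerized plugins might not work" warnings follow. The PROCTITLE fields carry the kubelet command line, hex-encoded by the audit subsystem with NUL bytes separating the argv entries. A minimal decoding sketch, assuming Python 3, with the hex value copied verbatim from the first PROCTITLE record above:

    # Decode a hex-encoded audit PROCTITLE value back into argv.
    # The bytes are the process title with NUL separators between arguments.
    hex_proctitle = (
        "2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B7562"
        "65636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261"
        "702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F657463"
        "2F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669"
    )
    argv = bytes.fromhex(hex_proctitle).split(b"\x00")
    print(" ".join(arg.decode() for arg in argv))
    # -> /usr/bin/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf
    #    --kubeconfig=/etc/kubernetes/kubelet.conf --confi

The decoded command line ends mid-flag ("--confi") because the kernel truncates the audited proctitle at 128 bytes; the log records themselves are not otherwise cut off.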
Mar 17 18:20:19.025047 kernel: audit: type=1106 audit(1742235619.004:234): pid=2227 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:20:19.025174 kernel: audit: type=1104 audit(1742235619.005:235): pid=2227 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:20:19.011000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-172.31.21.220:22-139.178.89.65:37448 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:19.033387 kernel: audit: type=1131 audit(1742235619.011:236): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-172.31.21.220:22-139.178.89.65:37448 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:20:24.576087 kubelet[3162]: I0317 18:20:24.576045 3162 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 17 18:20:24.577834 env[1923]: time="2025-03-17T18:20:24.577757435Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 17 18:20:24.578521 kubelet[3162]: I0317 18:20:24.578214 3162 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 17 18:20:25.541669 kubelet[3162]: I0317 18:20:25.541577 3162 topology_manager.go:215] "Topology Admit Handler" podUID="81ab80e2-528c-475b-ac32-c1457dda24cf" podNamespace="kube-system" podName="kube-proxy-9z7lf" Mar 17 18:20:25.631581 kubelet[3162]: I0317 18:20:25.631531 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/81ab80e2-528c-475b-ac32-c1457dda24cf-kube-proxy\") pod \"kube-proxy-9z7lf\" (UID: \"81ab80e2-528c-475b-ac32-c1457dda24cf\") " pod="kube-system/kube-proxy-9z7lf" Mar 17 18:20:25.632293 kubelet[3162]: I0317 18:20:25.632255 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgtdb\" (UniqueName: \"kubernetes.io/projected/81ab80e2-528c-475b-ac32-c1457dda24cf-kube-api-access-mgtdb\") pod \"kube-proxy-9z7lf\" (UID: \"81ab80e2-528c-475b-ac32-c1457dda24cf\") " pod="kube-system/kube-proxy-9z7lf" Mar 17 18:20:25.632495 kubelet[3162]: I0317 18:20:25.632469 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/81ab80e2-528c-475b-ac32-c1457dda24cf-xtables-lock\") pod \"kube-proxy-9z7lf\" (UID: \"81ab80e2-528c-475b-ac32-c1457dda24cf\") " pod="kube-system/kube-proxy-9z7lf" Mar 17 18:20:25.632659 kubelet[3162]: I0317 18:20:25.632633 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/81ab80e2-528c-475b-ac32-c1457dda24cf-lib-modules\") pod \"kube-proxy-9z7lf\" (UID: \"81ab80e2-528c-475b-ac32-c1457dda24cf\") " pod="kube-system/kube-proxy-9z7lf" Mar 17 18:20:25.646174 kubelet[3162]: I0317 18:20:25.646099 3162 
topology_manager.go:215] "Topology Admit Handler" podUID="e17100dd-5865-4df2-b6d5-1e3a16163bc9" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-p5vk9" Mar 17 18:20:25.733348 kubelet[3162]: I0317 18:20:25.733291 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbl2m\" (UniqueName: \"kubernetes.io/projected/e17100dd-5865-4df2-b6d5-1e3a16163bc9-kube-api-access-nbl2m\") pod \"tigera-operator-7bc55997bb-p5vk9\" (UID: \"e17100dd-5865-4df2-b6d5-1e3a16163bc9\") " pod="tigera-operator/tigera-operator-7bc55997bb-p5vk9" Mar 17 18:20:25.733642 kubelet[3162]: I0317 18:20:25.733612 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e17100dd-5865-4df2-b6d5-1e3a16163bc9-var-lib-calico\") pod \"tigera-operator-7bc55997bb-p5vk9\" (UID: \"e17100dd-5865-4df2-b6d5-1e3a16163bc9\") " pod="tigera-operator/tigera-operator-7bc55997bb-p5vk9" Mar 17 18:20:25.850411 env[1923]: time="2025-03-17T18:20:25.850217948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9z7lf,Uid:81ab80e2-528c-475b-ac32-c1457dda24cf,Namespace:kube-system,Attempt:0,}" Mar 17 18:20:25.892235 env[1923]: time="2025-03-17T18:20:25.892103677Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:20:25.892235 env[1923]: time="2025-03-17T18:20:25.892184904Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:20:25.892854 env[1923]: time="2025-03-17T18:20:25.892211976Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:20:25.892854 env[1923]: time="2025-03-17T18:20:25.892740212Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/c7dace07d0eac9f33e5cb7ce0cf36a0f5a9674d6f72106976de7520811b9774b pid=3248 runtime=io.containerd.runc.v2 Mar 17 18:20:25.966469 env[1923]: time="2025-03-17T18:20:25.966416026Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-p5vk9,Uid:e17100dd-5865-4df2-b6d5-1e3a16163bc9,Namespace:tigera-operator,Attempt:0,}" Mar 17 18:20:25.988644 env[1923]: time="2025-03-17T18:20:25.988572369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-9z7lf,Uid:81ab80e2-528c-475b-ac32-c1457dda24cf,Namespace:kube-system,Attempt:0,} returns sandbox id \"c7dace07d0eac9f33e5cb7ce0cf36a0f5a9674d6f72106976de7520811b9774b\"" Mar 17 18:20:25.997740 env[1923]: time="2025-03-17T18:20:25.997576225Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:20:25.998376 env[1923]: time="2025-03-17T18:20:25.998289823Z" level=info msg="CreateContainer within sandbox \"c7dace07d0eac9f33e5cb7ce0cf36a0f5a9674d6f72106976de7520811b9774b\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 18:20:26.019470 env[1923]: time="2025-03-17T18:20:26.006381747Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:20:26.019470 env[1923]: time="2025-03-17T18:20:26.006421743Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:20:26.019470 env[1923]: time="2025-03-17T18:20:26.006828456Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/1c48db8b94e1623fc7e05626b42069d9f23c3b8638743fca523793cd15dc4b3b pid=3289 runtime=io.containerd.runc.v2 Mar 17 18:20:26.034867 env[1923]: time="2025-03-17T18:20:26.034780966Z" level=info msg="CreateContainer within sandbox \"c7dace07d0eac9f33e5cb7ce0cf36a0f5a9674d6f72106976de7520811b9774b\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ef39ca3f523453784d71c0b0a6bb3369e3822323f597876bcdd82d92525c9b2c\"" Mar 17 18:20:26.041226 env[1923]: time="2025-03-17T18:20:26.041052023Z" level=info msg="StartContainer for \"ef39ca3f523453784d71c0b0a6bb3369e3822323f597876bcdd82d92525c9b2c\"" Mar 17 18:20:26.163417 env[1923]: time="2025-03-17T18:20:26.163158084Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-p5vk9,Uid:e17100dd-5865-4df2-b6d5-1e3a16163bc9,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"1c48db8b94e1623fc7e05626b42069d9f23c3b8638743fca523793cd15dc4b3b\"" Mar 17 18:20:26.170265 env[1923]: time="2025-03-17T18:20:26.170199296Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Mar 17 18:20:26.178721 env[1923]: time="2025-03-17T18:20:26.178650162Z" level=info msg="StartContainer for \"ef39ca3f523453784d71c0b0a6bb3369e3822323f597876bcdd82d92525c9b2c\" returns successfully" Mar 17 18:20:26.306000 audit[3381]: NETFILTER_CFG table=mangle:38 family=10 entries=1 op=nft_register_chain pid=3381 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.306000 audit[3381]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd51fb2d0 a2=0 a3=1 items=0 ppid=3332 pid=3381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.315396 kernel: audit: type=1325 audit(1742235626.306:237): table=mangle:38 family=10 entries=1 op=nft_register_chain pid=3381 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.306000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:20:26.331480 kernel: audit: type=1300 audit(1742235626.306:237): arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd51fb2d0 a2=0 a3=1 items=0 ppid=3332 pid=3381 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.307000 audit[3382]: NETFILTER_CFG table=nat:39 family=10 entries=1 op=nft_register_chain pid=3382 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.342498 kernel: audit: type=1327 audit(1742235626.306:237): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:20:26.342667 kernel: audit: type=1325 audit(1742235626.307:238): table=nat:39 family=10 entries=1 op=nft_register_chain pid=3382 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.307000 audit[3382]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff2d9d5c0 a2=0 a3=1 items=0 ppid=3332 pid=3382 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.353327 kernel: audit: type=1300 audit(1742235626.307:238): arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff2d9d5c0 a2=0 a3=1 items=0 ppid=3332 pid=3382 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.307000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 18:20:26.307000 audit[3383]: NETFILTER_CFG table=filter:40 family=10 entries=1 op=nft_register_chain pid=3383 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.365930 kernel: audit: type=1327 audit(1742235626.307:238): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 18:20:26.366024 kernel: audit: type=1325 audit(1742235626.307:239): table=filter:40 family=10 entries=1 op=nft_register_chain pid=3383 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.307000 audit[3383]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe61e0af0 a2=0 a3=1 items=0 ppid=3332 pid=3383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.376498 kernel: audit: type=1300 audit(1742235626.307:239): arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe61e0af0 a2=0 a3=1 items=0 ppid=3332 pid=3383 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.307000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 18:20:26.381890 kernel: audit: type=1327 audit(1742235626.307:239): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 18:20:26.322000 audit[3380]: NETFILTER_CFG table=mangle:41 family=2 entries=1 op=nft_register_chain pid=3380 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.387111 kernel: audit: type=1325 audit(1742235626.322:240): table=mangle:41 family=2 entries=1 op=nft_register_chain pid=3380 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.322000 audit[3380]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff509b640 a2=0 a3=1 items=0 ppid=3332 pid=3380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.322000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:20:26.341000 audit[3384]: NETFILTER_CFG table=nat:42 family=2 entries=1 op=nft_register_chain pid=3384 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.341000 audit[3384]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffeb135c10 a2=0 a3=1 items=0 ppid=3332 pid=3384 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.341000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 18:20:26.353000 audit[3385]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=3385 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.353000 audit[3385]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcc2379e0 a2=0 a3=1 items=0 ppid=3332 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.353000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 18:20:26.419000 audit[3386]: NETFILTER_CFG table=filter:44 family=2 entries=1 op=nft_register_chain pid=3386 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.419000 audit[3386]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffebb530f0 a2=0 a3=1 items=0 ppid=3332 pid=3386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.419000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Mar 17 18:20:26.427000 audit[3388]: NETFILTER_CFG table=filter:45 family=2 entries=1 op=nft_register_rule pid=3388 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.427000 audit[3388]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffcaf7c4e0 a2=0 a3=1 items=0 ppid=3332 pid=3388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.427000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Mar 17 18:20:26.434000 audit[3391]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=3391 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.434000 audit[3391]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffeee63800 a2=0 a3=1 items=0 ppid=3332 pid=3391 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.434000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Mar 17 18:20:26.437000 audit[3392]: NETFILTER_CFG table=filter:47 family=2 entries=1 op=nft_register_chain pid=3392 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.437000 audit[3392]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffefc1b120 a2=0 a3=1 items=0 ppid=3332 pid=3392 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.437000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Mar 17 18:20:26.444000 audit[3394]: NETFILTER_CFG table=filter:48 family=2 entries=1 op=nft_register_rule pid=3394 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.444000 audit[3394]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc2485b10 a2=0 a3=1 items=0 ppid=3332 pid=3394 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.444000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Mar 17 18:20:26.446000 audit[3395]: NETFILTER_CFG table=filter:49 family=2 entries=1 op=nft_register_chain pid=3395 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.446000 audit[3395]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe1a9bca0 a2=0 a3=1 items=0 ppid=3332 pid=3395 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.446000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Mar 17 18:20:26.452000 audit[3397]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_rule pid=3397 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.452000 audit[3397]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffff56f1080 a2=0 a3=1 items=0 ppid=3332 pid=3397 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.452000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Mar 17 18:20:26.463000 audit[3400]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_rule pid=3400 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.463000 audit[3400]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffc44d6b20 a2=0 a3=1 items=0 ppid=3332 pid=3400 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.463000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Mar 17 18:20:26.466000 audit[3401]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=3401 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.466000 audit[3401]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff1dd8820 a2=0 a3=1 items=0 ppid=3332 pid=3401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.466000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Mar 17 18:20:26.471000 audit[3403]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_rule pid=3403 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.471000 audit[3403]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffffc219ce0 a2=0 a3=1 items=0 ppid=3332 pid=3403 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.471000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Mar 17 18:20:26.475000 audit[3404]: NETFILTER_CFG table=filter:54 family=2 entries=1 op=nft_register_chain pid=3404 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.475000 audit[3404]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe74aa400 a2=0 a3=1 items=0 ppid=3332 pid=3404 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.475000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Mar 17 18:20:26.480000 audit[3406]: NETFILTER_CFG table=filter:55 family=2 entries=1 op=nft_register_rule pid=3406 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.480000 audit[3406]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff8be62f0 a2=0 a3=1 items=0 ppid=3332 pid=3406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.480000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 18:20:26.488000 audit[3409]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_rule pid=3409 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.488000 audit[3409]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffdb7d9570 a2=0 a3=1 items=0 ppid=3332 pid=3409 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.488000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 18:20:26.497000 audit[3412]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_rule pid=3412 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.497000 audit[3412]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffda6d9060 a2=0 a3=1 items=0 ppid=3332 pid=3412 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.497000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Mar 17 18:20:26.499000 audit[3413]: NETFILTER_CFG table=nat:58 family=2 entries=1 op=nft_register_chain pid=3413 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.499000 audit[3413]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffdd5de200 a2=0 a3=1 items=0 ppid=3332 pid=3413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.499000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Mar 17 18:20:26.508000 audit[3415]: NETFILTER_CFG table=nat:59 family=2 entries=1 op=nft_register_rule pid=3415 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.508000 audit[3415]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=fffff129d260 a2=0 a3=1 items=0 ppid=3332 pid=3415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.508000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:20:26.515000 audit[3418]: NETFILTER_CFG table=nat:60 family=2 entries=1 op=nft_register_rule pid=3418 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.515000 audit[3418]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc38af120 a2=0 a3=1 items=0 ppid=3332 pid=3418 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.515000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:20:26.518000 audit[3419]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=3419 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.518000 audit[3419]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd2ad0690 
a2=0 a3=1 items=0 ppid=3332 pid=3419 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.518000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Mar 17 18:20:26.524000 audit[3421]: NETFILTER_CFG table=nat:62 family=2 entries=1 op=nft_register_rule pid=3421 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:20:26.524000 audit[3421]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffca5bb8b0 a2=0 a3=1 items=0 ppid=3332 pid=3421 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.524000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Mar 17 18:20:26.561000 audit[3427]: NETFILTER_CFG table=filter:63 family=2 entries=8 op=nft_register_rule pid=3427 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:26.561000 audit[3427]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5164 a0=3 a1=ffffdc814310 a2=0 a3=1 items=0 ppid=3332 pid=3427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.561000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:26.576000 audit[3427]: NETFILTER_CFG table=nat:64 family=2 entries=14 op=nft_register_chain pid=3427 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:26.576000 audit[3427]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffdc814310 a2=0 a3=1 items=0 ppid=3332 pid=3427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.576000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:26.579000 audit[3433]: NETFILTER_CFG table=filter:65 family=10 entries=1 op=nft_register_chain pid=3433 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.579000 audit[3433]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffff1ea94a0 a2=0 a3=1 items=0 ppid=3332 pid=3433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.579000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Mar 17 18:20:26.585000 audit[3435]: NETFILTER_CFG table=filter:66 family=10 entries=2 op=nft_register_chain pid=3435 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.585000 audit[3435]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=fffff4bdfdf0 a2=0 a3=1 items=0 ppid=3332 pid=3435 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.585000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Mar 17 18:20:26.592000 audit[3438]: NETFILTER_CFG table=filter:67 family=10 entries=2 op=nft_register_chain pid=3438 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.592000 audit[3438]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=fffff97bd4f0 a2=0 a3=1 items=0 ppid=3332 pid=3438 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.592000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Mar 17 18:20:26.595000 audit[3439]: NETFILTER_CFG table=filter:68 family=10 entries=1 op=nft_register_chain pid=3439 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.595000 audit[3439]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffb144db0 a2=0 a3=1 items=0 ppid=3332 pid=3439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.595000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Mar 17 18:20:26.600000 audit[3441]: NETFILTER_CFG table=filter:69 family=10 entries=1 op=nft_register_rule pid=3441 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.600000 audit[3441]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd461e0a0 a2=0 a3=1 items=0 ppid=3332 pid=3441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.600000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Mar 17 18:20:26.602000 audit[3442]: NETFILTER_CFG table=filter:70 family=10 entries=1 op=nft_register_chain pid=3442 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.602000 audit[3442]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff175f260 a2=0 a3=1 items=0 ppid=3332 pid=3442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.602000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Mar 17 18:20:26.607000 audit[3444]: NETFILTER_CFG table=filter:71 family=10 entries=1 op=nft_register_rule pid=3444 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.607000 audit[3444]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffdbd7c460 a2=0 a3=1 items=0 ppid=3332 pid=3444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.607000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Mar 17 18:20:26.615000 audit[3447]: NETFILTER_CFG table=filter:72 family=10 entries=2 op=nft_register_chain pid=3447 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.615000 audit[3447]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffc0492480 a2=0 a3=1 items=0 ppid=3332 pid=3447 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.615000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Mar 17 18:20:26.617000 audit[3448]: NETFILTER_CFG table=filter:73 family=10 entries=1 op=nft_register_chain pid=3448 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.617000 audit[3448]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffa262c40 a2=0 a3=1 items=0 ppid=3332 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.617000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Mar 17 18:20:26.622000 audit[3450]: NETFILTER_CFG table=filter:74 family=10 entries=1 op=nft_register_rule pid=3450 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.622000 audit[3450]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdd9af360 a2=0 a3=1 items=0 ppid=3332 pid=3450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.622000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Mar 17 18:20:26.625000 audit[3451]: NETFILTER_CFG table=filter:75 family=10 entries=1 op=nft_register_chain pid=3451 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.625000 audit[3451]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc131d6f0 a2=0 a3=1 items=0 ppid=3332 pid=3451 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.625000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Mar 17 18:20:26.630000 audit[3453]: NETFILTER_CFG table=filter:76 family=10 entries=1 op=nft_register_rule pid=3453 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.630000 audit[3453]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffbb0b6a0 a2=0 a3=1 items=0 ppid=3332 pid=3453 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.630000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 18:20:26.637000 audit[3456]: NETFILTER_CFG table=filter:77 family=10 entries=1 op=nft_register_rule pid=3456 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.637000 audit[3456]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffc7b6780 a2=0 a3=1 items=0 ppid=3332 pid=3456 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.637000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Mar 17 18:20:26.645000 audit[3459]: NETFILTER_CFG table=filter:78 family=10 entries=1 op=nft_register_rule pid=3459 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.645000 audit[3459]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe5619120 a2=0 a3=1 items=0 ppid=3332 pid=3459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.645000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Mar 17 18:20:26.647000 audit[3460]: NETFILTER_CFG table=nat:79 family=10 entries=1 op=nft_register_chain pid=3460 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.647000 audit[3460]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd7858a20 a2=0 a3=1 items=0 ppid=3332 pid=3460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.647000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Mar 17 18:20:26.651000 audit[3462]: NETFILTER_CFG table=nat:80 family=10 entries=2 op=nft_register_chain pid=3462 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.651000 audit[3462]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=600 a0=3 a1=ffffff4d5470 a2=0 a3=1 items=0 ppid=3332 pid=3462 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.651000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:20:26.658000 audit[3465]: NETFILTER_CFG table=nat:81 family=10 entries=2 op=nft_register_chain pid=3465 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.658000 audit[3465]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=608 a0=3 a1=ffffef80ab80 a2=0 a3=1 items=0 ppid=3332 pid=3465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.658000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:20:26.661000 audit[3466]: NETFILTER_CFG table=nat:82 family=10 entries=1 op=nft_register_chain pid=3466 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.661000 audit[3466]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff20cb570 a2=0 a3=1 items=0 ppid=3332 pid=3466 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.661000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Mar 17 18:20:26.666000 audit[3468]: NETFILTER_CFG table=nat:83 family=10 entries=2 op=nft_register_chain pid=3468 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.666000 audit[3468]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffe05ff6f0 a2=0 a3=1 items=0 ppid=3332 pid=3468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.666000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Mar 17 18:20:26.671000 audit[3469]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3469 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.671000 audit[3469]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdb600d20 a2=0 a3=1 items=0 ppid=3332 pid=3469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.671000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Mar 17 18:20:26.681000 audit[3471]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3471 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.681000 
audit[3471]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd3059c10 a2=0 a3=1 items=0 ppid=3332 pid=3471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.681000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:20:26.688000 audit[3474]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_rule pid=3474 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:20:26.688000 audit[3474]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=fffff72b82c0 a2=0 a3=1 items=0 ppid=3332 pid=3474 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.688000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:20:26.696000 audit[3476]: NETFILTER_CFG table=filter:87 family=10 entries=3 op=nft_register_rule pid=3476 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Mar 17 18:20:26.696000 audit[3476]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2004 a0=3 a1=ffffc7c8a9e0 a2=0 a3=1 items=0 ppid=3332 pid=3476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.696000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:26.698000 audit[3476]: NETFILTER_CFG table=nat:88 family=10 entries=7 op=nft_register_chain pid=3476 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Mar 17 18:20:26.698000 audit[3476]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffc7c8a9e0 a2=0 a3=1 items=0 ppid=3332 pid=3476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:26.698000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:26.764323 systemd[1]: run-containerd-runc-k8s.io-c7dace07d0eac9f33e5cb7ce0cf36a0f5a9674d6f72106976de7520811b9774b-runc.u1JULa.mount: Deactivated successfully. Mar 17 18:20:29.026678 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3261615129.mount: Deactivated successfully. 
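The audit records above capture each ip6tables call issued while the KUBE-* chains are being built, but the command lines themselves are hidden in the hex-encoded PROCTITLE fields. Below is a minimal Python sketch (a log-reading aid, not part of the node's stack) that decodes such a field: the proctitle value is the process argv with NUL separators, so splitting on NUL and joining with spaces recovers the command. The sample string is copied verbatim from the audit[3448] record above.

    #!/usr/bin/env python3
    # Sketch only: turn an audit PROCTITLE hex string (argv joined by NUL
    # bytes) back into a readable command line.

    def decode_proctitle(hex_value: str) -> str:
        raw = bytes.fromhex(hex_value)
        # argv elements are NUL-separated; drop empties and join with spaces
        return " ".join(part.decode("utf-8", errors="replace")
                        for part in raw.split(b"\x00") if part)

    if __name__ == "__main__":
        # PROCTITLE of the audit[3448] record above
        sample = ("6970367461626C6573002D770035002D5700313030303030"
                  "002D4E004B5542452D464F5257415244002D740066696C746572")
        print(decode_proctitle(sample))
        # prints: ip6tables -w 5 -W 100000 -N KUBE-FORWARD -t filter

The other PROCTITLE values decode the same way to the familiar bootstrap rules (inserts into INPUT, OUTPUT, FORWARD, PREROUTING and POSTROUTING carrying comments such as "kubernetes service portals" and jumping to KUBE-* chains), though several of them are truncated in the audit record itself.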
Mar 17 18:20:29.970513 env[1923]: time="2025-03-17T18:20:29.970454696Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator:v1.36.2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:29.973686 env[1923]: time="2025-03-17T18:20:29.973631387Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:29.976286 env[1923]: time="2025-03-17T18:20:29.976204470Z" level=info msg="ImageUpdate event &ImageUpdate{Name:quay.io/tigera/operator:v1.36.2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:29.979018 env[1923]: time="2025-03-17T18:20:29.978969587Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:29.982133 env[1923]: time="2025-03-17T18:20:29.980839535Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:30d521e4e84764b396aacbb2a373ca7a573f84571e3955b34329652acccfb73c\"" Mar 17 18:20:29.990092 env[1923]: time="2025-03-17T18:20:29.990023721Z" level=info msg="CreateContainer within sandbox \"1c48db8b94e1623fc7e05626b42069d9f23c3b8638743fca523793cd15dc4b3b\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 17 18:20:30.020553 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2559575340.mount: Deactivated successfully. Mar 17 18:20:30.028220 env[1923]: time="2025-03-17T18:20:30.028133540Z" level=info msg="CreateContainer within sandbox \"1c48db8b94e1623fc7e05626b42069d9f23c3b8638743fca523793cd15dc4b3b\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"2b8bf8a24ccbeb444d774aa66339634defc3d7587ce9c12ecfcf408e6905968f\"" Mar 17 18:20:30.029997 env[1923]: time="2025-03-17T18:20:30.029947400Z" level=info msg="StartContainer for \"2b8bf8a24ccbeb444d774aa66339634defc3d7587ce9c12ecfcf408e6905968f\"" Mar 17 18:20:30.148112 env[1923]: time="2025-03-17T18:20:30.145102732Z" level=info msg="StartContainer for \"2b8bf8a24ccbeb444d774aa66339634defc3d7587ce9c12ecfcf408e6905968f\" returns successfully" Mar 17 18:20:30.425483 kubelet[3162]: I0317 18:20:30.423748 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-9z7lf" podStartSLOduration=5.423727134 podStartE2EDuration="5.423727134s" podCreationTimestamp="2025-03-17 18:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:20:26.414567068 +0000 UTC m=+14.450434460" watchObservedRunningTime="2025-03-17 18:20:30.423727134 +0000 UTC m=+18.459594538" Mar 17 18:20:31.011515 systemd[1]: run-containerd-runc-k8s.io-2b8bf8a24ccbeb444d774aa66339634defc3d7587ce9c12ecfcf408e6905968f-runc.cNq6JJ.mount: Deactivated successfully. 
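The kubelet "Observed pod startup duration" records (the kube-proxy one above and the tigera-operator one that follows) pack their timings into quoted key="value" fields. The sketch below is again only a log-reading aid and assumes nothing beyond the field formatting visible in these records: it extracts the quoted fields and computes the image-pull window from firstStartedPulling/lastFinishedPulling; the zero timestamps (0001-01-01 ...) in the kube-proxy record mean no pull was observed, so that case is skipped.

    #!/usr/bin/env python3
    # Sketch only: parse the quoted key="value" fields of an "Observed pod
    # startup duration" record and report how long the image-pull window was.
    import re
    from datetime import datetime

    FIELD_RE = re.compile(r'(\w+)="([^"]*)"')
    ZERO_TS = "0001-01-01 00:00:00 +0000 UTC"   # sentinel: no pull observed

    def parse_ts(value: str) -> datetime:
        # kubelet prints nanoseconds; strptime's %f only takes microseconds
        value = re.sub(r"(\.\d{6})\d+", r"\1", value)
        for fmt in ("%Y-%m-%d %H:%M:%S.%f %z %Z", "%Y-%m-%d %H:%M:%S %z %Z"):
            try:
                return datetime.strptime(value, fmt)
            except ValueError:
                continue
        raise ValueError(f"unrecognised timestamp: {value!r}")

    def pull_window_seconds(record: str) -> float | None:
        fields = dict(FIELD_RE.findall(record))
        start = fields["firstStartedPulling"]
        end = fields["lastFinishedPulling"]
        if ZERO_TS in (start, end):
            return None  # image was already present, nothing was pulled
        return (parse_ts(end) - parse_ts(start)).total_seconds()

Run against the tigera-operator record below, the window comes out to about 3.82 s, which matches the gap between its podStartSLOduration (3.52 s) and podStartE2EDuration (7.34 s).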
Mar 17 18:20:32.336680 kubelet[3162]: I0317 18:20:32.336601 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-p5vk9" podStartSLOduration=3.519968382 podStartE2EDuration="7.336579899s" podCreationTimestamp="2025-03-17 18:20:25 +0000 UTC" firstStartedPulling="2025-03-17 18:20:26.166795257 +0000 UTC m=+14.202662637" lastFinishedPulling="2025-03-17 18:20:29.983406774 +0000 UTC m=+18.019274154" observedRunningTime="2025-03-17 18:20:30.42620411 +0000 UTC m=+18.462071490" watchObservedRunningTime="2025-03-17 18:20:32.336579899 +0000 UTC m=+20.372447327" Mar 17 18:20:34.683877 kernel: kauditd_printk_skb: 143 callbacks suppressed Mar 17 18:20:34.684044 kernel: audit: type=1325 audit(1742235634.674:288): table=filter:89 family=2 entries=15 op=nft_register_rule pid=3517 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:34.674000 audit[3517]: NETFILTER_CFG table=filter:89 family=2 entries=15 op=nft_register_rule pid=3517 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:34.674000 audit[3517]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5908 a0=3 a1=ffffe14c2ac0 a2=0 a3=1 items=0 ppid=3332 pid=3517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:34.674000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:34.721305 kernel: audit: type=1300 audit(1742235634.674:288): arch=c00000b7 syscall=211 success=yes exit=5908 a0=3 a1=ffffe14c2ac0 a2=0 a3=1 items=0 ppid=3332 pid=3517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:34.721468 kernel: audit: type=1327 audit(1742235634.674:288): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:34.721542 kernel: audit: type=1325 audit(1742235634.713:289): table=nat:90 family=2 entries=12 op=nft_register_rule pid=3517 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:34.713000 audit[3517]: NETFILTER_CFG table=nat:90 family=2 entries=12 op=nft_register_rule pid=3517 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:34.713000 audit[3517]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe14c2ac0 a2=0 a3=1 items=0 ppid=3332 pid=3517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:34.739380 kernel: audit: type=1300 audit(1742235634.713:289): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe14c2ac0 a2=0 a3=1 items=0 ppid=3332 pid=3517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:34.713000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:34.748902 kernel: audit: type=1327 audit(1742235634.713:289): 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:34.754000 audit[3519]: NETFILTER_CFG table=filter:91 family=2 entries=16 op=nft_register_rule pid=3519 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:34.754000 audit[3519]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5908 a0=3 a1=fffff6305370 a2=0 a3=1 items=0 ppid=3332 pid=3519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:34.771390 kernel: audit: type=1325 audit(1742235634.754:290): table=filter:91 family=2 entries=16 op=nft_register_rule pid=3519 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:34.771494 kernel: audit: type=1300 audit(1742235634.754:290): arch=c00000b7 syscall=211 success=yes exit=5908 a0=3 a1=fffff6305370 a2=0 a3=1 items=0 ppid=3332 pid=3519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:34.754000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:34.777523 kernel: audit: type=1327 audit(1742235634.754:290): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:34.771000 audit[3519]: NETFILTER_CFG table=nat:92 family=2 entries=12 op=nft_register_rule pid=3519 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:34.782833 kernel: audit: type=1325 audit(1742235634.771:291): table=nat:92 family=2 entries=12 op=nft_register_rule pid=3519 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:34.771000 audit[3519]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff6305370 a2=0 a3=1 items=0 ppid=3332 pid=3519 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:34.771000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:35.436421 kubelet[3162]: I0317 18:20:35.436324 3162 topology_manager.go:215] "Topology Admit Handler" podUID="150c390e-d4ef-439c-aecc-5ebd9796ba93" podNamespace="calico-system" podName="calico-typha-5689d795bc-4bp2d" Mar 17 18:20:35.518779 kubelet[3162]: I0317 18:20:35.518712 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/150c390e-d4ef-439c-aecc-5ebd9796ba93-typha-certs\") pod \"calico-typha-5689d795bc-4bp2d\" (UID: \"150c390e-d4ef-439c-aecc-5ebd9796ba93\") " pod="calico-system/calico-typha-5689d795bc-4bp2d" Mar 17 18:20:35.518981 kubelet[3162]: I0317 18:20:35.518802 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/150c390e-d4ef-439c-aecc-5ebd9796ba93-tigera-ca-bundle\") pod \"calico-typha-5689d795bc-4bp2d\" (UID: \"150c390e-d4ef-439c-aecc-5ebd9796ba93\") " pod="calico-system/calico-typha-5689d795bc-4bp2d" Mar 17 18:20:35.518981 kubelet[3162]: I0317 18:20:35.518857 3162 
reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxwjn\" (UniqueName: \"kubernetes.io/projected/150c390e-d4ef-439c-aecc-5ebd9796ba93-kube-api-access-xxwjn\") pod \"calico-typha-5689d795bc-4bp2d\" (UID: \"150c390e-d4ef-439c-aecc-5ebd9796ba93\") " pod="calico-system/calico-typha-5689d795bc-4bp2d" Mar 17 18:20:35.612745 kubelet[3162]: I0317 18:20:35.612671 3162 topology_manager.go:215] "Topology Admit Handler" podUID="ecd1548f-69c5-4344-9b4d-0867235f8e18" podNamespace="calico-system" podName="calico-node-5wz9m" Mar 17 18:20:35.719531 kubelet[3162]: I0317 18:20:35.719483 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecd1548f-69c5-4344-9b4d-0867235f8e18-tigera-ca-bundle\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.719819 kubelet[3162]: I0317 18:20:35.719779 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/ecd1548f-69c5-4344-9b4d-0867235f8e18-cni-log-dir\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.720007 kubelet[3162]: I0317 18:20:35.719969 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/ecd1548f-69c5-4344-9b4d-0867235f8e18-flexvol-driver-host\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.720186 kubelet[3162]: I0317 18:20:35.720150 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ecd1548f-69c5-4344-9b4d-0867235f8e18-lib-modules\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.720369 kubelet[3162]: I0317 18:20:35.720310 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/ecd1548f-69c5-4344-9b4d-0867235f8e18-xtables-lock\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.720536 kubelet[3162]: I0317 18:20:35.720500 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/ecd1548f-69c5-4344-9b4d-0867235f8e18-var-run-calico\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.720719 kubelet[3162]: I0317 18:20:35.720681 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/ecd1548f-69c5-4344-9b4d-0867235f8e18-policysync\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.720893 kubelet[3162]: I0317 18:20:35.720854 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/ecd1548f-69c5-4344-9b4d-0867235f8e18-node-certs\") pod 
\"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.721041 kubelet[3162]: I0317 18:20:35.721016 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/ecd1548f-69c5-4344-9b4d-0867235f8e18-cni-net-dir\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.721218 kubelet[3162]: I0317 18:20:35.721181 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/ecd1548f-69c5-4344-9b4d-0867235f8e18-var-lib-calico\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.721387 kubelet[3162]: I0317 18:20:35.721361 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/ecd1548f-69c5-4344-9b4d-0867235f8e18-cni-bin-dir\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.721569 kubelet[3162]: I0317 18:20:35.721533 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfj6v\" (UniqueName: \"kubernetes.io/projected/ecd1548f-69c5-4344-9b4d-0867235f8e18-kube-api-access-wfj6v\") pod \"calico-node-5wz9m\" (UID: \"ecd1548f-69c5-4344-9b4d-0867235f8e18\") " pod="calico-system/calico-node-5wz9m" Mar 17 18:20:35.745192 env[1923]: time="2025-03-17T18:20:35.744886212Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5689d795bc-4bp2d,Uid:150c390e-d4ef-439c-aecc-5ebd9796ba93,Namespace:calico-system,Attempt:0,}" Mar 17 18:20:35.801518 env[1923]: time="2025-03-17T18:20:35.801355803Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:20:35.801680 env[1923]: time="2025-03-17T18:20:35.801551594Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:20:35.801680 env[1923]: time="2025-03-17T18:20:35.801634310Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:20:35.802294 env[1923]: time="2025-03-17T18:20:35.802205614Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/5707704d06c15fd800bb04e820fd73cec78b5f004952f1009b8fa5f11d43eabe pid=3529 runtime=io.containerd.runc.v2 Mar 17 18:20:35.840614 kubelet[3162]: E0317 18:20:35.840555 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.840614 kubelet[3162]: W0317 18:20:35.840599 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.840826 kubelet[3162]: E0317 18:20:35.840645 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.841610 kubelet[3162]: E0317 18:20:35.841563 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.841610 kubelet[3162]: W0317 18:20:35.841600 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.841816 kubelet[3162]: E0317 18:20:35.841655 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.845463 kubelet[3162]: E0317 18:20:35.845409 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.845627 kubelet[3162]: W0317 18:20:35.845452 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.845627 kubelet[3162]: E0317 18:20:35.845600 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.850564 kubelet[3162]: E0317 18:20:35.850394 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.850564 kubelet[3162]: W0317 18:20:35.850553 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.850800 kubelet[3162]: E0317 18:20:35.850612 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.850987 kubelet[3162]: I0317 18:20:35.850931 3162 topology_manager.go:215] "Topology Admit Handler" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" podNamespace="calico-system" podName="csi-node-driver-pv4xb" Mar 17 18:20:35.856237 kubelet[3162]: E0317 18:20:35.856171 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pv4xb" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" Mar 17 18:20:35.868000 audit[3541]: NETFILTER_CFG table=filter:93 family=2 entries=17 op=nft_register_rule pid=3541 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:35.868000 audit[3541]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6652 a0=3 a1=ffffc315b0b0 a2=0 a3=1 items=0 ppid=3332 pid=3541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:35.868000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:35.879000 audit[3541]: NETFILTER_CFG table=nat:94 family=2 entries=12 op=nft_register_rule pid=3541 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:35.879000 audit[3541]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc315b0b0 a2=0 a3=1 items=0 ppid=3332 pid=3541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:35.879000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:35.907273 kubelet[3162]: E0317 18:20:35.907217 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.907458 kubelet[3162]: W0317 18:20:35.907277 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.907458 kubelet[3162]: E0317 18:20:35.907314 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.909111 kubelet[3162]: E0317 18:20:35.908457 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.909111 kubelet[3162]: W0317 18:20:35.908496 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.909111 kubelet[3162]: E0317 18:20:35.908711 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.909427 kubelet[3162]: E0317 18:20:35.909158 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.909427 kubelet[3162]: W0317 18:20:35.909183 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.913907 kubelet[3162]: E0317 18:20:35.909578 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.913907 kubelet[3162]: E0317 18:20:35.910079 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.913907 kubelet[3162]: W0317 18:20:35.910103 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.913907 kubelet[3162]: E0317 18:20:35.910435 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.913907 kubelet[3162]: E0317 18:20:35.912937 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.913907 kubelet[3162]: W0317 18:20:35.912966 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.913907 kubelet[3162]: E0317 18:20:35.913152 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.913907 kubelet[3162]: E0317 18:20:35.913616 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.913907 kubelet[3162]: W0317 18:20:35.913639 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.913907 kubelet[3162]: E0317 18:20:35.913844 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.914596 kubelet[3162]: E0317 18:20:35.914275 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.914596 kubelet[3162]: W0317 18:20:35.914296 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.914596 kubelet[3162]: E0317 18:20:35.914479 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.915022 kubelet[3162]: E0317 18:20:35.914896 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.915125 kubelet[3162]: W0317 18:20:35.915060 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.917387 kubelet[3162]: E0317 18:20:35.915234 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.917387 kubelet[3162]: E0317 18:20:35.915890 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.917387 kubelet[3162]: W0317 18:20:35.915916 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.917387 kubelet[3162]: E0317 18:20:35.916097 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.917387 kubelet[3162]: E0317 18:20:35.916416 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.917387 kubelet[3162]: W0317 18:20:35.916439 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.917387 kubelet[3162]: E0317 18:20:35.916589 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.917387 kubelet[3162]: E0317 18:20:35.917276 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.917387 kubelet[3162]: W0317 18:20:35.917317 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.918002 kubelet[3162]: E0317 18:20:35.917409 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.926399 kubelet[3162]: E0317 18:20:35.919182 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.926399 kubelet[3162]: W0317 18:20:35.919222 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.926399 kubelet[3162]: E0317 18:20:35.919481 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.926399 kubelet[3162]: E0317 18:20:35.919740 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.926399 kubelet[3162]: W0317 18:20:35.919760 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.926399 kubelet[3162]: E0317 18:20:35.919913 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.926399 kubelet[3162]: E0317 18:20:35.920220 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.926399 kubelet[3162]: W0317 18:20:35.920239 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.926399 kubelet[3162]: E0317 18:20:35.920272 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.926399 kubelet[3162]: E0317 18:20:35.920690 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.927046 kubelet[3162]: W0317 18:20:35.920712 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.927046 kubelet[3162]: E0317 18:20:35.920745 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.927046 kubelet[3162]: E0317 18:20:35.921412 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.927046 kubelet[3162]: W0317 18:20:35.921439 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.927046 kubelet[3162]: E0317 18:20:35.921667 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.927046 kubelet[3162]: E0317 18:20:35.921963 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.927046 kubelet[3162]: W0317 18:20:35.921982 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.927046 kubelet[3162]: E0317 18:20:35.922014 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.927046 kubelet[3162]: E0317 18:20:35.922393 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.927046 kubelet[3162]: W0317 18:20:35.922413 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.927633 kubelet[3162]: E0317 18:20:35.922436 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.927633 kubelet[3162]: E0317 18:20:35.922775 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.927633 kubelet[3162]: W0317 18:20:35.922794 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.927633 kubelet[3162]: E0317 18:20:35.922818 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.927633 kubelet[3162]: E0317 18:20:35.923229 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.927633 kubelet[3162]: W0317 18:20:35.923250 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.927633 kubelet[3162]: E0317 18:20:35.923275 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.927633 kubelet[3162]: E0317 18:20:35.923694 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.927633 kubelet[3162]: W0317 18:20:35.923714 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.927633 kubelet[3162]: E0317 18:20:35.923736 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.928150 kubelet[3162]: E0317 18:20:35.924035 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.928150 kubelet[3162]: W0317 18:20:35.924051 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.928150 kubelet[3162]: E0317 18:20:35.924069 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.928150 kubelet[3162]: E0317 18:20:35.924351 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.928150 kubelet[3162]: W0317 18:20:35.924403 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.928150 kubelet[3162]: E0317 18:20:35.924427 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.928150 kubelet[3162]: E0317 18:20:35.924803 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.928150 kubelet[3162]: W0317 18:20:35.924830 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.928150 kubelet[3162]: E0317 18:20:35.924853 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.928150 kubelet[3162]: E0317 18:20:35.925190 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.931507 kubelet[3162]: W0317 18:20:35.925206 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.931507 kubelet[3162]: E0317 18:20:35.925227 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.931507 kubelet[3162]: E0317 18:20:35.925594 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.931507 kubelet[3162]: W0317 18:20:35.925615 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.931507 kubelet[3162]: E0317 18:20:35.925640 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.931507 kubelet[3162]: E0317 18:20:35.925947 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.931507 kubelet[3162]: W0317 18:20:35.925964 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.931507 kubelet[3162]: E0317 18:20:35.925985 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.931507 kubelet[3162]: E0317 18:20:35.926272 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.931507 kubelet[3162]: W0317 18:20:35.926292 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.932189 kubelet[3162]: E0317 18:20:35.926310 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.932189 kubelet[3162]: E0317 18:20:35.926737 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.932189 kubelet[3162]: W0317 18:20:35.926755 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.932189 kubelet[3162]: E0317 18:20:35.926776 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.932189 kubelet[3162]: E0317 18:20:35.927038 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.932189 kubelet[3162]: W0317 18:20:35.927053 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.932189 kubelet[3162]: E0317 18:20:35.927072 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.932189 kubelet[3162]: E0317 18:20:35.927454 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.932189 kubelet[3162]: W0317 18:20:35.927472 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.932189 kubelet[3162]: E0317 18:20:35.927493 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.932774 kubelet[3162]: E0317 18:20:35.927780 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.932774 kubelet[3162]: W0317 18:20:35.927796 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.932774 kubelet[3162]: E0317 18:20:35.927814 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.934982 kubelet[3162]: E0317 18:20:35.933421 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.934982 kubelet[3162]: W0317 18:20:35.933465 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.934982 kubelet[3162]: E0317 18:20:35.933499 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.952017 kubelet[3162]: E0317 18:20:35.951967 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.952017 kubelet[3162]: W0317 18:20:35.952005 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.952307 kubelet[3162]: E0317 18:20:35.952040 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.952307 kubelet[3162]: I0317 18:20:35.952159 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf8d2\" (UniqueName: \"kubernetes.io/projected/7af73a1a-8033-4ba4-ba19-078aeb2052b7-kube-api-access-sf8d2\") pod \"csi-node-driver-pv4xb\" (UID: \"7af73a1a-8033-4ba4-ba19-078aeb2052b7\") " pod="calico-system/csi-node-driver-pv4xb" Mar 17 18:20:35.954199 kubelet[3162]: E0317 18:20:35.954108 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.954199 kubelet[3162]: W0317 18:20:35.954149 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.954199 kubelet[3162]: E0317 18:20:35.954193 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.954483 kubelet[3162]: I0317 18:20:35.954237 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7af73a1a-8033-4ba4-ba19-078aeb2052b7-socket-dir\") pod \"csi-node-driver-pv4xb\" (UID: \"7af73a1a-8033-4ba4-ba19-078aeb2052b7\") " pod="calico-system/csi-node-driver-pv4xb" Mar 17 18:20:35.955861 kubelet[3162]: E0317 18:20:35.955597 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.955861 kubelet[3162]: W0317 18:20:35.955649 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.956075 kubelet[3162]: E0317 18:20:35.955890 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.956075 kubelet[3162]: I0317 18:20:35.955939 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7af73a1a-8033-4ba4-ba19-078aeb2052b7-varrun\") pod \"csi-node-driver-pv4xb\" (UID: \"7af73a1a-8033-4ba4-ba19-078aeb2052b7\") " pod="calico-system/csi-node-driver-pv4xb" Mar 17 18:20:35.957367 kubelet[3162]: E0317 18:20:35.956387 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.957367 kubelet[3162]: W0317 18:20:35.956429 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.957367 kubelet[3162]: E0317 18:20:35.956628 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.957367 kubelet[3162]: E0317 18:20:35.956941 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.957367 kubelet[3162]: W0317 18:20:35.956962 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.957367 kubelet[3162]: E0317 18:20:35.957122 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.957796 kubelet[3162]: E0317 18:20:35.957449 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.957796 kubelet[3162]: W0317 18:20:35.957469 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.957796 kubelet[3162]: E0317 18:20:35.957643 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.957796 kubelet[3162]: I0317 18:20:35.957686 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7af73a1a-8033-4ba4-ba19-078aeb2052b7-kubelet-dir\") pod \"csi-node-driver-pv4xb\" (UID: \"7af73a1a-8033-4ba4-ba19-078aeb2052b7\") " pod="calico-system/csi-node-driver-pv4xb" Mar 17 18:20:35.958022 kubelet[3162]: E0317 18:20:35.957945 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.958022 kubelet[3162]: W0317 18:20:35.957961 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.958157 kubelet[3162]: E0317 18:20:35.958098 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.963371 kubelet[3162]: E0317 18:20:35.958447 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.963371 kubelet[3162]: W0317 18:20:35.958476 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.963371 kubelet[3162]: E0317 18:20:35.958501 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.963371 kubelet[3162]: E0317 18:20:35.958964 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.963371 kubelet[3162]: W0317 18:20:35.958999 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.963371 kubelet[3162]: E0317 18:20:35.959033 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.963371 kubelet[3162]: I0317 18:20:35.959071 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7af73a1a-8033-4ba4-ba19-078aeb2052b7-registration-dir\") pod \"csi-node-driver-pv4xb\" (UID: \"7af73a1a-8033-4ba4-ba19-078aeb2052b7\") " pod="calico-system/csi-node-driver-pv4xb" Mar 17 18:20:35.963371 kubelet[3162]: E0317 18:20:35.959496 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.963371 kubelet[3162]: W0317 18:20:35.959533 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.963994 kubelet[3162]: E0317 18:20:35.959565 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.963994 kubelet[3162]: E0317 18:20:35.959897 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.963994 kubelet[3162]: W0317 18:20:35.959917 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.963994 kubelet[3162]: E0317 18:20:35.959936 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:35.963994 kubelet[3162]: E0317 18:20:35.960275 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.963994 kubelet[3162]: W0317 18:20:35.960292 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.963994 kubelet[3162]: E0317 18:20:35.960311 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.963994 kubelet[3162]: E0317 18:20:35.960807 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.963994 kubelet[3162]: W0317 18:20:35.960831 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.963994 kubelet[3162]: E0317 18:20:35.960859 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.964590 kubelet[3162]: E0317 18:20:35.961189 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.964590 kubelet[3162]: W0317 18:20:35.961205 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.964590 kubelet[3162]: E0317 18:20:35.961225 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:35.964590 kubelet[3162]: E0317 18:20:35.962829 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:35.964590 kubelet[3162]: W0317 18:20:35.962859 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:35.964590 kubelet[3162]: E0317 18:20:35.962891 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.071741 kubelet[3162]: E0317 18:20:36.067053 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.071741 kubelet[3162]: W0317 18:20:36.067087 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.071741 kubelet[3162]: E0317 18:20:36.067135 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:36.071741 kubelet[3162]: E0317 18:20:36.071436 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.071741 kubelet[3162]: W0317 18:20:36.071461 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.071741 kubelet[3162]: E0317 18:20:36.071506 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.072462 kubelet[3162]: E0317 18:20:36.072250 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.072462 kubelet[3162]: W0317 18:20:36.072274 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.072462 kubelet[3162]: E0317 18:20:36.072326 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.072897 kubelet[3162]: E0317 18:20:36.072730 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.072897 kubelet[3162]: W0317 18:20:36.072750 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.072897 kubelet[3162]: E0317 18:20:36.072793 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.073352 kubelet[3162]: E0317 18:20:36.073169 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.073352 kubelet[3162]: W0317 18:20:36.073188 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.073352 kubelet[3162]: E0317 18:20:36.073230 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.079291 kubelet[3162]: E0317 18:20:36.073605 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.079291 kubelet[3162]: W0317 18:20:36.073623 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.079291 kubelet[3162]: E0317 18:20:36.073657 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:36.079291 kubelet[3162]: E0317 18:20:36.079029 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.079291 kubelet[3162]: W0317 18:20:36.079057 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.079291 kubelet[3162]: E0317 18:20:36.079137 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.081447 kubelet[3162]: E0317 18:20:36.079857 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.081447 kubelet[3162]: W0317 18:20:36.079883 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.081447 kubelet[3162]: E0317 18:20:36.081021 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.081447 kubelet[3162]: E0317 18:20:36.081179 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.081447 kubelet[3162]: W0317 18:20:36.081199 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.082995 kubelet[3162]: E0317 18:20:36.082872 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.088829 kubelet[3162]: E0317 18:20:36.088792 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.089056 kubelet[3162]: W0317 18:20:36.089027 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.089410 kubelet[3162]: E0317 18:20:36.089324 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.089731 kubelet[3162]: E0317 18:20:36.089661 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.089953 kubelet[3162]: W0317 18:20:36.089915 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.090193 kubelet[3162]: E0317 18:20:36.090169 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:36.090711 kubelet[3162]: E0317 18:20:36.090683 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.090892 kubelet[3162]: W0317 18:20:36.090865 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.091197 kubelet[3162]: E0317 18:20:36.091169 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.091658 kubelet[3162]: E0317 18:20:36.091633 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.091902 kubelet[3162]: W0317 18:20:36.091874 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.092198 kubelet[3162]: E0317 18:20:36.092171 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.092476 kubelet[3162]: E0317 18:20:36.092454 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.092636 kubelet[3162]: W0317 18:20:36.092610 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.092922 kubelet[3162]: E0317 18:20:36.092898 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.093211 kubelet[3162]: E0317 18:20:36.093191 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.093373 kubelet[3162]: W0317 18:20:36.093315 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.093640 kubelet[3162]: E0317 18:20:36.093614 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.093926 kubelet[3162]: E0317 18:20:36.093905 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.094178 kubelet[3162]: W0317 18:20:36.094152 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.094528 kubelet[3162]: E0317 18:20:36.094503 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:36.094749 kubelet[3162]: E0317 18:20:36.094727 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.094868 kubelet[3162]: W0317 18:20:36.094844 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.095289 kubelet[3162]: E0317 18:20:36.095257 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.095570 kubelet[3162]: E0317 18:20:36.095548 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.095732 kubelet[3162]: W0317 18:20:36.095706 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.096096 kubelet[3162]: E0317 18:20:36.096066 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.096402 kubelet[3162]: E0317 18:20:36.096382 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.096555 kubelet[3162]: W0317 18:20:36.096530 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.096848 kubelet[3162]: E0317 18:20:36.096826 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.097082 kubelet[3162]: E0317 18:20:36.097062 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.097280 kubelet[3162]: W0317 18:20:36.097254 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.097547 kubelet[3162]: E0317 18:20:36.097525 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.097982 kubelet[3162]: E0317 18:20:36.097958 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.098120 kubelet[3162]: W0317 18:20:36.098094 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.098404 kubelet[3162]: E0317 18:20:36.098378 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:36.100629 kubelet[3162]: E0317 18:20:36.100595 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.100843 kubelet[3162]: W0317 18:20:36.100815 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.101182 kubelet[3162]: E0317 18:20:36.101155 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.101708 kubelet[3162]: E0317 18:20:36.101680 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.102310 kubelet[3162]: W0317 18:20:36.102277 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.102792 kubelet[3162]: E0317 18:20:36.102764 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.103120 kubelet[3162]: E0317 18:20:36.103079 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.103261 env[1923]: time="2025-03-17T18:20:36.103189191Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5689d795bc-4bp2d,Uid:150c390e-d4ef-439c-aecc-5ebd9796ba93,Namespace:calico-system,Attempt:0,} returns sandbox id \"5707704d06c15fd800bb04e820fd73cec78b5f004952f1009b8fa5f11d43eabe\"" Mar 17 18:20:36.103748 kubelet[3162]: W0317 18:20:36.103719 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.104142 kubelet[3162]: E0317 18:20:36.104117 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.106470 kubelet[3162]: E0317 18:20:36.105768 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.106470 kubelet[3162]: W0317 18:20:36.105806 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.106470 kubelet[3162]: E0317 18:20:36.105840 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:36.113876 env[1923]: time="2025-03-17T18:20:36.109964617Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Mar 17 18:20:36.128852 kubelet[3162]: E0317 18:20:36.128597 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:36.129484 kubelet[3162]: W0317 18:20:36.129437 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:36.129833 kubelet[3162]: E0317 18:20:36.129804 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:36.228559 env[1923]: time="2025-03-17T18:20:36.227536125Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5wz9m,Uid:ecd1548f-69c5-4344-9b4d-0867235f8e18,Namespace:calico-system,Attempt:0,}" Mar 17 18:20:36.256960 env[1923]: time="2025-03-17T18:20:36.252579974Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:20:36.256960 env[1923]: time="2025-03-17T18:20:36.252688406Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:20:36.256960 env[1923]: time="2025-03-17T18:20:36.252715850Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:20:36.256960 env[1923]: time="2025-03-17T18:20:36.253761860Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/697355467225ad3ac2f278a9f0174bd0a5705e42b72e07793bda04eb9c77a019 pid=3664 runtime=io.containerd.runc.v2 Mar 17 18:20:36.343959 env[1923]: time="2025-03-17T18:20:36.340919728Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-5wz9m,Uid:ecd1548f-69c5-4344-9b4d-0867235f8e18,Namespace:calico-system,Attempt:0,} returns sandbox id \"697355467225ad3ac2f278a9f0174bd0a5705e42b72e07793bda04eb9c77a019\"" Mar 17 18:20:37.316840 kubelet[3162]: E0317 18:20:37.316479 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pv4xb" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" Mar 17 18:20:37.696714 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3653209545.mount: Deactivated successfully. 
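The recurring kubelet triplet above (driver-call.go:262, driver-call.go:149, plugins.go:730) is the FlexVolume prober at work: it walks /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, finds the nodeagent~uds directory, and runs its uds executable with the init argument. That executable is not installed, so the call produces empty output, and decoding an empty string as JSON is exactly what yields "unexpected end of JSON input". A minimal Go sketch of that decode step, using a hypothetical DriverStatus type rather than the kubelet's own:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// DriverStatus is a hypothetical stand-in for the JSON object a
// FlexVolume driver is expected to print from "<driver> init",
// e.g. {"status":"Success","capabilities":{"attach":false}}.
type DriverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// The uds executable was not found, so the driver call yields
	// empty output (output: "" in the log lines above).
	output := ""

	var st DriverStatus
	err := json.Unmarshal([]byte(output), &st)
	fmt.Println(err) // prints: unexpected end of JSON input
}
```

Per the "skipping" wording in the plugins.go messages, the probe failure is non-fatal: the directory is skipped and, as the sandbox and container events below show, the Calico pods keep starting.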
Mar 17 18:20:38.861476 env[1923]: time="2025-03-17T18:20:38.861411635Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:38.865703 env[1923]: time="2025-03-17T18:20:38.865650156Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:38.869472 env[1923]: time="2025-03-17T18:20:38.869424160Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/typha:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:38.873517 env[1923]: time="2025-03-17T18:20:38.873462307Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:38.879612 env[1923]: time="2025-03-17T18:20:38.879557567Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:1d1fc316829ae1650b0b1629b54232520f297e7c3b1444eecd290ae088902a28\"" Mar 17 18:20:38.882418 env[1923]: time="2025-03-17T18:20:38.882364208Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Mar 17 18:20:38.917854 env[1923]: time="2025-03-17T18:20:38.917776920Z" level=info msg="CreateContainer within sandbox \"5707704d06c15fd800bb04e820fd73cec78b5f004952f1009b8fa5f11d43eabe\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 17 18:20:38.946846 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2723863477.mount: Deactivated successfully. 
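While the calico-typha container is created above and started just below, the failing nodeagent~uds probes also continue. For contrast, here is a sketch of the reply shape the FlexVolume convention expects from a driver's init call; it is not the real uds binary, and the initResponse type name is assumed for illustration. With output of this form, the unmarshal step shown earlier would succeed and the prober would register the plugin instead of skipping it.

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// initResponse mirrors the conventional FlexVolume init reply: a status
// string plus a capabilities map, where "attach": false tells the kubelet
// not to drive attach/detach through this plugin.
type initResponse struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities"`
}

func main() {
	// A real driver dispatches on os.Args[1] ("init", "mount", "unmount", ...);
	// this sketch only answers the init probe seen in the log.
	if len(os.Args) > 1 && os.Args[1] == "init" {
		if err := json.NewEncoder(os.Stdout).Encode(initResponse{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		}); err != nil {
			fmt.Fprintln(os.Stderr, err)
			os.Exit(1)
		}
	}
}
```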
Mar 17 18:20:38.957183 env[1923]: time="2025-03-17T18:20:38.957097867Z" level=info msg="CreateContainer within sandbox \"5707704d06c15fd800bb04e820fd73cec78b5f004952f1009b8fa5f11d43eabe\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"17318dc982150c555e391435cb2630c6a7b7a937c9f064f9dd5fd3f6cfdf9bc5\"" Mar 17 18:20:38.959559 env[1923]: time="2025-03-17T18:20:38.959482087Z" level=info msg="StartContainer for \"17318dc982150c555e391435cb2630c6a7b7a937c9f064f9dd5fd3f6cfdf9bc5\"" Mar 17 18:20:39.104463 env[1923]: time="2025-03-17T18:20:39.104289638Z" level=info msg="StartContainer for \"17318dc982150c555e391435cb2630c6a7b7a937c9f064f9dd5fd3f6cfdf9bc5\" returns successfully" Mar 17 18:20:39.316109 kubelet[3162]: E0317 18:20:39.316035 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pv4xb" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" Mar 17 18:20:39.458513 kubelet[3162]: E0317 18:20:39.458475 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.458718 kubelet[3162]: W0317 18:20:39.458690 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.458877 kubelet[3162]: E0317 18:20:39.458851 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.459425 kubelet[3162]: E0317 18:20:39.459387 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.459637 kubelet[3162]: W0317 18:20:39.459607 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.459776 kubelet[3162]: E0317 18:20:39.459750 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.460306 kubelet[3162]: E0317 18:20:39.460276 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.460520 kubelet[3162]: W0317 18:20:39.460492 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.460651 kubelet[3162]: E0317 18:20:39.460625 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:39.461204 kubelet[3162]: E0317 18:20:39.461176 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.461393 kubelet[3162]: W0317 18:20:39.461364 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.461523 kubelet[3162]: E0317 18:20:39.461497 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.463538 kubelet[3162]: E0317 18:20:39.463502 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.463859 kubelet[3162]: W0317 18:20:39.463789 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.463997 kubelet[3162]: E0317 18:20:39.463969 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.465304 kubelet[3162]: E0317 18:20:39.465267 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.465607 kubelet[3162]: W0317 18:20:39.465575 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.465743 kubelet[3162]: E0317 18:20:39.465717 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.466416 kubelet[3162]: E0317 18:20:39.466384 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.466623 kubelet[3162]: W0317 18:20:39.466593 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.466757 kubelet[3162]: E0317 18:20:39.466730 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.467474 kubelet[3162]: E0317 18:20:39.467442 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.467699 kubelet[3162]: W0317 18:20:39.467648 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.467878 kubelet[3162]: E0317 18:20:39.467848 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:39.474765 kubelet[3162]: E0317 18:20:39.474714 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.474765 kubelet[3162]: W0317 18:20:39.474752 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.475006 kubelet[3162]: E0317 18:20:39.474784 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.475185 kubelet[3162]: E0317 18:20:39.475148 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.475185 kubelet[3162]: W0317 18:20:39.475178 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.475382 kubelet[3162]: E0317 18:20:39.475204 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.478090 kubelet[3162]: E0317 18:20:39.478006 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.478631 kubelet[3162]: W0317 18:20:39.478532 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.479252 kubelet[3162]: E0317 18:20:39.479210 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.481467 kubelet[3162]: E0317 18:20:39.481321 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.481640 kubelet[3162]: W0317 18:20:39.481482 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.482304 kubelet[3162]: E0317 18:20:39.481516 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.482796 kubelet[3162]: E0317 18:20:39.482764 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.482931 kubelet[3162]: W0317 18:20:39.482797 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.482931 kubelet[3162]: E0317 18:20:39.482827 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:39.483323 kubelet[3162]: E0317 18:20:39.483296 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.483323 kubelet[3162]: W0317 18:20:39.483323 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.483510 kubelet[3162]: E0317 18:20:39.483398 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.483763 kubelet[3162]: E0317 18:20:39.483735 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.483842 kubelet[3162]: W0317 18:20:39.483762 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.483842 kubelet[3162]: E0317 18:20:39.483785 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.501091 kubelet[3162]: E0317 18:20:39.501045 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.501091 kubelet[3162]: W0317 18:20:39.501082 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.501369 kubelet[3162]: E0317 18:20:39.501114 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.501686 kubelet[3162]: E0317 18:20:39.501651 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.501686 kubelet[3162]: W0317 18:20:39.501680 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.501831 kubelet[3162]: E0317 18:20:39.501714 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.502177 kubelet[3162]: E0317 18:20:39.502140 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.502177 kubelet[3162]: W0317 18:20:39.502170 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.502305 kubelet[3162]: E0317 18:20:39.502201 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:39.502612 kubelet[3162]: E0317 18:20:39.502578 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.502612 kubelet[3162]: W0317 18:20:39.502606 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.502766 kubelet[3162]: E0317 18:20:39.502748 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.503053 kubelet[3162]: E0317 18:20:39.503021 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.503053 kubelet[3162]: W0317 18:20:39.503047 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.503249 kubelet[3162]: E0317 18:20:39.503215 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.503535 kubelet[3162]: E0317 18:20:39.503504 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.503535 kubelet[3162]: W0317 18:20:39.503530 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.503718 kubelet[3162]: E0317 18:20:39.503683 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.504053 kubelet[3162]: E0317 18:20:39.504021 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.504053 kubelet[3162]: W0317 18:20:39.504047 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.504194 kubelet[3162]: E0317 18:20:39.504078 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.504473 kubelet[3162]: E0317 18:20:39.504442 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.504473 kubelet[3162]: W0317 18:20:39.504468 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.504651 kubelet[3162]: E0317 18:20:39.504617 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:39.505058 kubelet[3162]: E0317 18:20:39.504900 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.505058 kubelet[3162]: W0317 18:20:39.504925 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.505196 kubelet[3162]: E0317 18:20:39.505062 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.505840 kubelet[3162]: E0317 18:20:39.505490 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.505840 kubelet[3162]: W0317 18:20:39.505515 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.505840 kubelet[3162]: E0317 18:20:39.505690 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.506526 kubelet[3162]: E0317 18:20:39.506500 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.506686 kubelet[3162]: W0317 18:20:39.506657 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.506844 kubelet[3162]: E0317 18:20:39.506816 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.507794 kubelet[3162]: E0317 18:20:39.507760 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.508009 kubelet[3162]: W0317 18:20:39.507981 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.508140 kubelet[3162]: E0317 18:20:39.508113 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.508774 kubelet[3162]: E0317 18:20:39.508744 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.508961 kubelet[3162]: W0317 18:20:39.508933 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.509169 kubelet[3162]: E0317 18:20:39.509141 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:39.509732 kubelet[3162]: E0317 18:20:39.509702 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.509902 kubelet[3162]: W0317 18:20:39.509874 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.510027 kubelet[3162]: E0317 18:20:39.510002 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.511828 kubelet[3162]: E0317 18:20:39.511796 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.512015 kubelet[3162]: W0317 18:20:39.511987 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.512144 kubelet[3162]: E0317 18:20:39.512118 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.512728 kubelet[3162]: E0317 18:20:39.512699 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.512917 kubelet[3162]: W0317 18:20:39.512891 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.513038 kubelet[3162]: E0317 18:20:39.513014 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.513604 kubelet[3162]: E0317 18:20:39.513577 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.513745 kubelet[3162]: W0317 18:20:39.513719 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.513943 kubelet[3162]: E0317 18:20:39.513916 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:39.514757 kubelet[3162]: E0317 18:20:39.514731 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:39.515008 kubelet[3162]: W0317 18:20:39.514983 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:39.515164 kubelet[3162]: E0317 18:20:39.515140 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:40.379092 env[1923]: time="2025-03-17T18:20:40.379010756Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:40.382761 env[1923]: time="2025-03-17T18:20:40.382692613Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:40.385504 env[1923]: time="2025-03-17T18:20:40.385449719Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:40.387520 env[1923]: time="2025-03-17T18:20:40.387474781Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:40.388939 env[1923]: time="2025-03-17T18:20:40.388866558Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:ece9bca32e64e726de8bbfc9e175a3ca91e0881cd40352bfcd1d107411f4f348\"" Mar 17 18:20:40.395466 env[1923]: time="2025-03-17T18:20:40.395383736Z" level=info msg="CreateContainer within sandbox \"697355467225ad3ac2f278a9f0174bd0a5705e42b72e07793bda04eb9c77a019\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 18:20:40.424703 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1136072385.mount: Deactivated successfully. Mar 17 18:20:40.432243 env[1923]: time="2025-03-17T18:20:40.425471687Z" level=info msg="CreateContainer within sandbox \"697355467225ad3ac2f278a9f0174bd0a5705e42b72e07793bda04eb9c77a019\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cb01145f7cd8523840384d953f3f97095805e46b201f78efcf3f9c4f9383be20\"" Mar 17 18:20:40.435732 env[1923]: time="2025-03-17T18:20:40.433453891Z" level=info msg="StartContainer for \"cb01145f7cd8523840384d953f3f97095805e46b201f78efcf3f9c4f9383be20\"" Mar 17 18:20:40.448518 kubelet[3162]: I0317 18:20:40.448430 3162 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 18:20:40.490508 kubelet[3162]: E0317 18:20:40.490152 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.490508 kubelet[3162]: W0317 18:20:40.490184 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.490508 kubelet[3162]: E0317 18:20:40.490216 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:40.491249 kubelet[3162]: E0317 18:20:40.490851 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.491249 kubelet[3162]: W0317 18:20:40.490877 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.491249 kubelet[3162]: E0317 18:20:40.490903 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.491878 kubelet[3162]: E0317 18:20:40.491566 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.491878 kubelet[3162]: W0317 18:20:40.491592 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.491878 kubelet[3162]: E0317 18:20:40.491618 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.492287 kubelet[3162]: E0317 18:20:40.492163 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.510206 kubelet[3162]: W0317 18:20:40.492185 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.510206 kubelet[3162]: E0317 18:20:40.506622 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.514819 kubelet[3162]: E0317 18:20:40.514749 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.514819 kubelet[3162]: W0317 18:20:40.514794 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.515076 kubelet[3162]: E0317 18:20:40.514837 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.515512 kubelet[3162]: E0317 18:20:40.515482 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.515616 kubelet[3162]: W0317 18:20:40.515511 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.515616 kubelet[3162]: E0317 18:20:40.515538 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:40.516067 kubelet[3162]: E0317 18:20:40.516037 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.516067 kubelet[3162]: W0317 18:20:40.516064 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.516224 kubelet[3162]: E0317 18:20:40.516090 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.518532 kubelet[3162]: E0317 18:20:40.518474 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.518532 kubelet[3162]: W0317 18:20:40.518515 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.518872 kubelet[3162]: E0317 18:20:40.518550 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.519005 kubelet[3162]: E0317 18:20:40.518970 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.519005 kubelet[3162]: W0317 18:20:40.519000 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.519171 kubelet[3162]: E0317 18:20:40.519025 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.521509 kubelet[3162]: E0317 18:20:40.521050 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.521509 kubelet[3162]: W0317 18:20:40.521087 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.521509 kubelet[3162]: E0317 18:20:40.521122 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.524262 kubelet[3162]: E0317 18:20:40.521585 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.524262 kubelet[3162]: W0317 18:20:40.521622 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.524262 kubelet[3162]: E0317 18:20:40.521659 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:40.524262 kubelet[3162]: E0317 18:20:40.522097 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.524262 kubelet[3162]: W0317 18:20:40.522138 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.524262 kubelet[3162]: E0317 18:20:40.522165 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.524262 kubelet[3162]: E0317 18:20:40.522732 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.524262 kubelet[3162]: W0317 18:20:40.522771 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.524262 kubelet[3162]: E0317 18:20:40.522799 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.524262 kubelet[3162]: E0317 18:20:40.524106 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.526030 kubelet[3162]: W0317 18:20:40.524140 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.526030 kubelet[3162]: E0317 18:20:40.524202 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.528497 kubelet[3162]: E0317 18:20:40.528414 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.528497 kubelet[3162]: W0317 18:20:40.528455 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.528497 kubelet[3162]: E0317 18:20:40.528490 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.528985 kubelet[3162]: E0317 18:20:40.528955 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.529091 kubelet[3162]: W0317 18:20:40.528985 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.529091 kubelet[3162]: E0317 18:20:40.529010 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:40.529996 kubelet[3162]: E0317 18:20:40.529563 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.529996 kubelet[3162]: W0317 18:20:40.529597 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.529996 kubelet[3162]: E0317 18:20:40.529642 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.540546 kubelet[3162]: E0317 18:20:40.530636 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.540546 kubelet[3162]: W0317 18:20:40.530670 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.540546 kubelet[3162]: E0317 18:20:40.530707 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.540546 kubelet[3162]: E0317 18:20:40.531303 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.540546 kubelet[3162]: W0317 18:20:40.531369 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.540546 kubelet[3162]: E0317 18:20:40.531529 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.540546 kubelet[3162]: E0317 18:20:40.531832 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.540546 kubelet[3162]: W0317 18:20:40.531851 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.540546 kubelet[3162]: E0317 18:20:40.532025 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.540546 kubelet[3162]: E0317 18:20:40.532316 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.541138 kubelet[3162]: W0317 18:20:40.532355 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.541138 kubelet[3162]: E0317 18:20:40.532527 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:40.541138 kubelet[3162]: E0317 18:20:40.532745 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.541138 kubelet[3162]: W0317 18:20:40.532761 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.541138 kubelet[3162]: E0317 18:20:40.532787 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.541138 kubelet[3162]: E0317 18:20:40.533302 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.541138 kubelet[3162]: W0317 18:20:40.533325 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.541138 kubelet[3162]: E0317 18:20:40.533507 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.541138 kubelet[3162]: E0317 18:20:40.533744 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.541138 kubelet[3162]: W0317 18:20:40.533761 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.541747 kubelet[3162]: E0317 18:20:40.533896 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.541747 kubelet[3162]: E0317 18:20:40.534091 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.541747 kubelet[3162]: W0317 18:20:40.534107 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.541747 kubelet[3162]: E0317 18:20:40.534248 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.541747 kubelet[3162]: E0317 18:20:40.534499 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.541747 kubelet[3162]: W0317 18:20:40.534519 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.541747 kubelet[3162]: E0317 18:20:40.534551 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:40.541747 kubelet[3162]: E0317 18:20:40.535818 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.541747 kubelet[3162]: W0317 18:20:40.535848 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.541747 kubelet[3162]: E0317 18:20:40.535995 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.542282 kubelet[3162]: E0317 18:20:40.536210 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.542282 kubelet[3162]: W0317 18:20:40.536232 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.542282 kubelet[3162]: E0317 18:20:40.536384 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.542282 kubelet[3162]: E0317 18:20:40.536615 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.542282 kubelet[3162]: W0317 18:20:40.536631 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.542282 kubelet[3162]: E0317 18:20:40.536658 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.542282 kubelet[3162]: E0317 18:20:40.536992 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.542282 kubelet[3162]: W0317 18:20:40.537009 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.542282 kubelet[3162]: E0317 18:20:40.537036 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.542282 kubelet[3162]: E0317 18:20:40.537726 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.542900 kubelet[3162]: W0317 18:20:40.537744 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.542900 kubelet[3162]: E0317 18:20:40.537772 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:20:40.542900 kubelet[3162]: E0317 18:20:40.538314 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.542900 kubelet[3162]: W0317 18:20:40.538357 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.542900 kubelet[3162]: E0317 18:20:40.538392 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.542900 kubelet[3162]: E0317 18:20:40.538762 3162 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:20:40.542900 kubelet[3162]: W0317 18:20:40.538780 3162 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:20:40.542900 kubelet[3162]: E0317 18:20:40.538807 3162 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:20:40.595166 env[1923]: time="2025-03-17T18:20:40.595090080Z" level=info msg="StartContainer for \"cb01145f7cd8523840384d953f3f97095805e46b201f78efcf3f9c4f9383be20\" returns successfully" Mar 17 18:20:40.893600 systemd[1]: run-containerd-runc-k8s.io-cb01145f7cd8523840384d953f3f97095805e46b201f78efcf3f9c4f9383be20-runc.ZRCvjI.mount: Deactivated successfully. Mar 17 18:20:40.894119 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cb01145f7cd8523840384d953f3f97095805e46b201f78efcf3f9c4f9383be20-rootfs.mount: Deactivated successfully. 
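
The block of kubelet errors above is one failure repeating: the FlexVolume prober execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init, the binary does not exist, the call therefore produces empty output, and unmarshalling "" as JSON fails with "unexpected end of JSON input". The sketch below mimics that driver-call path; the struct fields and error wording are illustrative and are not copied from kubelet's driver-call.go.

// Minimal sketch of the FlexVolume "init" probe, under the assumptions above:
// exec the driver binary and unmarshal its stdout as a status object. When the
// binary is absent, output is empty, so json.Unmarshal fails with
// "unexpected end of JSON input" alongside the exec "not found" error,
// exactly the pairing seen in the log.
package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

type driverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func callDriver(path string, args ...string) (*driverStatus, error) {
	out, execErr := exec.Command(path, args...).CombinedOutput()
	var st driverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		// With no binary installed, out is empty and err reads
		// "unexpected end of JSON input".
		return nil, fmt.Errorf("failed to unmarshal output %q: %v (exec error: %v)", out, err, execErr)
	}
	return &st, nil
}

func main() {
	_, err := callDriver("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds", "init")
	fmt.Println(err)
}

Per the FlexVolume convention, a present driver would answer init with a small JSON status such as {"status":"Success","capabilities":{"attach":false}}; when no such driver is installed, this probe failure is typically benign log noise.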
Mar 17 18:20:40.903765 env[1923]: time="2025-03-17T18:20:40.903697946Z" level=info msg="shim disconnected" id=cb01145f7cd8523840384d953f3f97095805e46b201f78efcf3f9c4f9383be20 Mar 17 18:20:40.904126 env[1923]: time="2025-03-17T18:20:40.904092192Z" level=warning msg="cleaning up after shim disconnected" id=cb01145f7cd8523840384d953f3f97095805e46b201f78efcf3f9c4f9383be20 namespace=k8s.io Mar 17 18:20:40.904256 env[1923]: time="2025-03-17T18:20:40.904228631Z" level=info msg="cleaning up dead shim" Mar 17 18:20:40.918163 env[1923]: time="2025-03-17T18:20:40.918105781Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:20:40Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3853 runtime=io.containerd.runc.v2\n" Mar 17 18:20:41.316644 kubelet[3162]: E0317 18:20:41.316596 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pv4xb" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" Mar 17 18:20:41.458843 env[1923]: time="2025-03-17T18:20:41.458789918Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Mar 17 18:20:41.485804 kubelet[3162]: I0317 18:20:41.485698 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5689d795bc-4bp2d" podStartSLOduration=3.714099305 podStartE2EDuration="6.485675028s" podCreationTimestamp="2025-03-17 18:20:35 +0000 UTC" firstStartedPulling="2025-03-17 18:20:36.109233701 +0000 UTC m=+24.145101093" lastFinishedPulling="2025-03-17 18:20:38.880809448 +0000 UTC m=+26.916676816" observedRunningTime="2025-03-17 18:20:39.476457662 +0000 UTC m=+27.512325066" watchObservedRunningTime="2025-03-17 18:20:41.485675028 +0000 UTC m=+29.521542420" Mar 17 18:20:43.316633 kubelet[3162]: E0317 18:20:43.316311 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pv4xb" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" Mar 17 18:20:45.316219 kubelet[3162]: E0317 18:20:45.316150 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pv4xb" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" Mar 17 18:20:46.543024 env[1923]: time="2025-03-17T18:20:46.542928603Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:46.546413 env[1923]: time="2025-03-17T18:20:46.546319979Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:46.548703 env[1923]: time="2025-03-17T18:20:46.548654329Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/cni:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:46.552994 env[1923]: time="2025-03-17T18:20:46.552934157Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image 
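
The shim-disconnected messages above are the normal teardown of a short-lived container (cb011…), while the repeated pod_workers errors show why csi-node-driver-pv4xb cannot start yet: the runtime still reports NetworkReady=false because no CNI configuration has been written. The sketch below is a simplified stand-in for that readiness condition, assuming the conventional config directory /etc/cni/net.d; it is not containerd's actual loader.

// Hypothetical readiness check in the spirit of "cni plugin not initialized":
// report ready only once a CNI config file exists in /etc/cni/net.d.
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func cniConfigured(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil // a network config is present
		}
	}
	return false, nil // nothing yet: NetworkReady stays false
}

func main() {
	ok, err := cniConfigured("/etc/cni/net.d")
	fmt.Println(ok, err)
}

Calico's install-cni container, whose image pull starts here, is what eventually drops a config (typically a 10-calico.conflist) into that directory.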
reference \"sha256:e5ca62af4ff61b88f55fe4e0d7723151103d3f6a470fd4ebb311a2de27a9597f\"" Mar 17 18:20:46.553575 env[1923]: time="2025-03-17T18:20:46.551786638Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:46.561595 env[1923]: time="2025-03-17T18:20:46.561477410Z" level=info msg="CreateContainer within sandbox \"697355467225ad3ac2f278a9f0174bd0a5705e42b72e07793bda04eb9c77a019\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 18:20:46.597747 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2355684881.mount: Deactivated successfully. Mar 17 18:20:46.602746 env[1923]: time="2025-03-17T18:20:46.602688209Z" level=info msg="CreateContainer within sandbox \"697355467225ad3ac2f278a9f0174bd0a5705e42b72e07793bda04eb9c77a019\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"9a5014ecc22fde4de0b9bb2ec9a981ff6329d9b00de27df43b609475dce57f28\"" Mar 17 18:20:46.603945 env[1923]: time="2025-03-17T18:20:46.603863123Z" level=info msg="StartContainer for \"9a5014ecc22fde4de0b9bb2ec9a981ff6329d9b00de27df43b609475dce57f28\"" Mar 17 18:20:46.706393 kubelet[3162]: I0317 18:20:46.706184 3162 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 18:20:46.745951 env[1923]: time="2025-03-17T18:20:46.745869571Z" level=info msg="StartContainer for \"9a5014ecc22fde4de0b9bb2ec9a981ff6329d9b00de27df43b609475dce57f28\" returns successfully" Mar 17 18:20:46.773000 audit[3911]: NETFILTER_CFG table=filter:95 family=2 entries=17 op=nft_register_rule pid=3911 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:46.777290 kernel: kauditd_printk_skb: 8 callbacks suppressed Mar 17 18:20:46.777455 kernel: audit: type=1325 audit(1742235646.773:294): table=filter:95 family=2 entries=17 op=nft_register_rule pid=3911 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:46.773000 audit[3911]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5908 a0=3 a1=ffffe6c672a0 a2=0 a3=1 items=0 ppid=3332 pid=3911 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:46.793952 kernel: audit: type=1300 audit(1742235646.773:294): arch=c00000b7 syscall=211 success=yes exit=5908 a0=3 a1=ffffe6c672a0 a2=0 a3=1 items=0 ppid=3332 pid=3911 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:46.773000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:46.799421 kernel: audit: type=1327 audit(1742235646.773:294): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:46.799000 audit[3911]: NETFILTER_CFG table=nat:96 family=2 entries=19 op=nft_register_chain pid=3911 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:46.807879 kernel: audit: type=1325 audit(1742235646.799:295): table=nat:96 family=2 entries=19 op=nft_register_chain pid=3911 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:20:46.799000 audit[3911]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffe6c672a0 a2=0 a3=1 items=0 ppid=3332 pid=3911 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:46.820106 kernel: audit: type=1300 audit(1742235646.799:295): arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffe6c672a0 a2=0 a3=1 items=0 ppid=3332 pid=3911 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:46.799000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:46.826304 kernel: audit: type=1327 audit(1742235646.799:295): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:20:47.316362 kubelet[3162]: E0317 18:20:47.316268 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-pv4xb" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" Mar 17 18:20:47.861035 env[1923]: time="2025-03-17T18:20:47.860937198Z" level=error msg="failed to reload cni configuration after receiving fs change event(\"/etc/cni/net.d/calico-kubeconfig\": WRITE)" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 18:20:47.902172 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-9a5014ecc22fde4de0b9bb2ec9a981ff6329d9b00de27df43b609475dce57f28-rootfs.mount: Deactivated successfully. 
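
The two audit records above (294 and 295) log an iptables-restore run through the nft backend; the PROCTITLE field is the command line hex-encoded with NUL bytes between arguments. Decoding it, as in the short program below, recovers iptables-restore -w 5 -W 100000 --noflush --counters, the restore invocation behind these filter and nat rule updates (the log does not say which component issued it).

// Decode an audit PROCTITLE value: hex string, arguments separated by NUL.
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

func main() {
	const proctitle = "69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273"
	raw, err := hex.DecodeString(proctitle)
	if err != nil {
		panic(err)
	}
	args := strings.Split(string(raw), "\x00") // audit joins argv with NUL bytes
	fmt.Println(strings.Join(args, " "))       // iptables-restore -w 5 -W 100000 --noflush --counters
}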
Mar 17 18:20:47.951375 kubelet[3162]: I0317 18:20:47.951212 3162 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 17 18:20:47.997053 kubelet[3162]: I0317 18:20:47.996988 3162 topology_manager.go:215] "Topology Admit Handler" podUID="3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1" podNamespace="kube-system" podName="coredns-7db6d8ff4d-8bbf2" Mar 17 18:20:48.012629 kubelet[3162]: I0317 18:20:48.012580 3162 topology_manager.go:215] "Topology Admit Handler" podUID="cd45a415-9af8-4b7c-ac94-5ad5f9e3b710" podNamespace="kube-system" podName="coredns-7db6d8ff4d-krrrk" Mar 17 18:20:48.017530 kubelet[3162]: W0317 18:20:48.017485 3162 reflector.go:547] object-"kube-system"/"coredns": failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ip-172-31-21-220" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ip-172-31-21-220' and this object Mar 17 18:20:48.017806 kubelet[3162]: E0317 18:20:48.017776 3162 reflector.go:150] object-"kube-system"/"coredns": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "coredns" is forbidden: User "system:node:ip-172-31-21-220" cannot list resource "configmaps" in API group "" in the namespace "kube-system": no relationship found between node 'ip-172-31-21-220' and this object Mar 17 18:20:48.029448 kubelet[3162]: I0317 18:20:48.026826 3162 topology_manager.go:215] "Topology Admit Handler" podUID="bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07" podNamespace="calico-apiserver" podName="calico-apiserver-56f9f75749-9hqth" Mar 17 18:20:48.032819 kubelet[3162]: I0317 18:20:48.032761 3162 topology_manager.go:215] "Topology Admit Handler" podUID="3ad78572-b2b8-4d59-a3a2-ea0333361bca" podNamespace="calico-apiserver" podName="calico-apiserver-56f9f75749-5vmjt" Mar 17 18:20:48.033315 kubelet[3162]: I0317 18:20:48.033270 3162 topology_manager.go:215] "Topology Admit Handler" podUID="049cf128-4b21-4a4b-8889-b2f735eb419e" podNamespace="calico-system" podName="calico-kube-controllers-6f879dc54f-5nvvh" Mar 17 18:20:48.106449 kubelet[3162]: I0317 18:20:48.106402 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f59hb\" (UniqueName: \"kubernetes.io/projected/3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1-kube-api-access-f59hb\") pod \"coredns-7db6d8ff4d-8bbf2\" (UID: \"3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1\") " pod="kube-system/coredns-7db6d8ff4d-8bbf2" Mar 17 18:20:48.106749 kubelet[3162]: I0317 18:20:48.106717 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9z5t\" (UniqueName: \"kubernetes.io/projected/3ad78572-b2b8-4d59-a3a2-ea0333361bca-kube-api-access-z9z5t\") pod \"calico-apiserver-56f9f75749-5vmjt\" (UID: \"3ad78572-b2b8-4d59-a3a2-ea0333361bca\") " pod="calico-apiserver/calico-apiserver-56f9f75749-5vmjt" Mar 17 18:20:48.106934 kubelet[3162]: I0317 18:20:48.106905 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1-config-volume\") pod \"coredns-7db6d8ff4d-8bbf2\" (UID: \"3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1\") " pod="kube-system/coredns-7db6d8ff4d-8bbf2" Mar 17 18:20:48.107132 kubelet[3162]: I0317 18:20:48.107104 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/cd45a415-9af8-4b7c-ac94-5ad5f9e3b710-config-volume\") pod \"coredns-7db6d8ff4d-krrrk\" (UID: \"cd45a415-9af8-4b7c-ac94-5ad5f9e3b710\") " pod="kube-system/coredns-7db6d8ff4d-krrrk" Mar 17 18:20:48.107300 kubelet[3162]: I0317 18:20:48.107273 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8h7k\" (UniqueName: \"kubernetes.io/projected/bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07-kube-api-access-l8h7k\") pod \"calico-apiserver-56f9f75749-9hqth\" (UID: \"bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07\") " pod="calico-apiserver/calico-apiserver-56f9f75749-9hqth" Mar 17 18:20:48.107545 kubelet[3162]: I0317 18:20:48.107500 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2dgt\" (UniqueName: \"kubernetes.io/projected/049cf128-4b21-4a4b-8889-b2f735eb419e-kube-api-access-h2dgt\") pod \"calico-kube-controllers-6f879dc54f-5nvvh\" (UID: \"049cf128-4b21-4a4b-8889-b2f735eb419e\") " pod="calico-system/calico-kube-controllers-6f879dc54f-5nvvh" Mar 17 18:20:48.108817 kubelet[3162]: I0317 18:20:48.108771 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07-calico-apiserver-certs\") pod \"calico-apiserver-56f9f75749-9hqth\" (UID: \"bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07\") " pod="calico-apiserver/calico-apiserver-56f9f75749-9hqth" Mar 17 18:20:48.109044 kubelet[3162]: I0317 18:20:48.109017 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/049cf128-4b21-4a4b-8889-b2f735eb419e-tigera-ca-bundle\") pod \"calico-kube-controllers-6f879dc54f-5nvvh\" (UID: \"049cf128-4b21-4a4b-8889-b2f735eb419e\") " pod="calico-system/calico-kube-controllers-6f879dc54f-5nvvh" Mar 17 18:20:48.109244 kubelet[3162]: I0317 18:20:48.109218 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjs5b\" (UniqueName: \"kubernetes.io/projected/cd45a415-9af8-4b7c-ac94-5ad5f9e3b710-kube-api-access-gjs5b\") pod \"coredns-7db6d8ff4d-krrrk\" (UID: \"cd45a415-9af8-4b7c-ac94-5ad5f9e3b710\") " pod="kube-system/coredns-7db6d8ff4d-krrrk" Mar 17 18:20:48.109464 kubelet[3162]: I0317 18:20:48.109437 3162 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3ad78572-b2b8-4d59-a3a2-ea0333361bca-calico-apiserver-certs\") pod \"calico-apiserver-56f9f75749-5vmjt\" (UID: \"3ad78572-b2b8-4d59-a3a2-ea0333361bca\") " pod="calico-apiserver/calico-apiserver-56f9f75749-5vmjt" Mar 17 18:20:48.354436 env[1923]: time="2025-03-17T18:20:48.354321660Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f879dc54f-5nvvh,Uid:049cf128-4b21-4a4b-8889-b2f735eb419e,Namespace:calico-system,Attempt:0,}" Mar 17 18:20:48.364448 env[1923]: time="2025-03-17T18:20:48.364323027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56f9f75749-5vmjt,Uid:3ad78572-b2b8-4d59-a3a2-ea0333361bca,Namespace:calico-apiserver,Attempt:0,}" Mar 17 18:20:48.375180 env[1923]: time="2025-03-17T18:20:48.375114243Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56f9f75749-9hqth,Uid:bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07,Namespace:calico-apiserver,Attempt:0,}" Mar 
17 18:20:48.669693 env[1923]: time="2025-03-17T18:20:48.669241475Z" level=info msg="shim disconnected" id=9a5014ecc22fde4de0b9bb2ec9a981ff6329d9b00de27df43b609475dce57f28 Mar 17 18:20:48.670107 env[1923]: time="2025-03-17T18:20:48.670052995Z" level=warning msg="cleaning up after shim disconnected" id=9a5014ecc22fde4de0b9bb2ec9a981ff6329d9b00de27df43b609475dce57f28 namespace=k8s.io Mar 17 18:20:48.670260 env[1923]: time="2025-03-17T18:20:48.670231398Z" level=info msg="cleaning up dead shim" Mar 17 18:20:48.689353 env[1923]: time="2025-03-17T18:20:48.689282765Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:20:48Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3934 runtime=io.containerd.runc.v2\n" Mar 17 18:20:48.889261 env[1923]: time="2025-03-17T18:20:48.888889443Z" level=error msg="Failed to destroy network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.889914 env[1923]: time="2025-03-17T18:20:48.889772183Z" level=error msg="encountered an error cleaning up failed sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.890011 env[1923]: time="2025-03-17T18:20:48.889895482Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56f9f75749-5vmjt,Uid:3ad78572-b2b8-4d59-a3a2-ea0333361bca,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.890272 kubelet[3162]: E0317 18:20:48.890183 3162 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.890446 kubelet[3162]: E0317 18:20:48.890297 3162 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56f9f75749-5vmjt" Mar 17 18:20:48.890446 kubelet[3162]: E0317 18:20:48.890378 3162 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-56f9f75749-5vmjt" Mar 17 18:20:48.890649 kubelet[3162]: E0317 18:20:48.890595 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-56f9f75749-5vmjt_calico-apiserver(3ad78572-b2b8-4d59-a3a2-ea0333361bca)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-56f9f75749-5vmjt_calico-apiserver(3ad78572-b2b8-4d59-a3a2-ea0333361bca)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56f9f75749-5vmjt" podUID="3ad78572-b2b8-4d59-a3a2-ea0333361bca" Mar 17 18:20:48.893313 env[1923]: time="2025-03-17T18:20:48.893190811Z" level=error msg="Failed to destroy network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.894326 env[1923]: time="2025-03-17T18:20:48.894228927Z" level=error msg="encountered an error cleaning up failed sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.894592 env[1923]: time="2025-03-17T18:20:48.894536137Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56f9f75749-9hqth,Uid:bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.902731 kubelet[3162]: E0317 18:20:48.902625 3162 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.902731 kubelet[3162]: E0317 18:20:48.902720 3162 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56f9f75749-9hqth" Mar 17 18:20:48.903022 kubelet[3162]: E0317 18:20:48.902757 3162 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-56f9f75749-9hqth" Mar 17 18:20:48.903022 kubelet[3162]: E0317 18:20:48.902827 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-56f9f75749-9hqth_calico-apiserver(bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-56f9f75749-9hqth_calico-apiserver(bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56f9f75749-9hqth" podUID="bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07" Mar 17 18:20:48.923841 env[1923]: time="2025-03-17T18:20:48.922690352Z" level=error msg="Failed to destroy network for sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.932710 env[1923]: time="2025-03-17T18:20:48.928882396Z" level=error msg="encountered an error cleaning up failed sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.932710 env[1923]: time="2025-03-17T18:20:48.929133087Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f879dc54f-5nvvh,Uid:049cf128-4b21-4a4b-8889-b2f735eb419e,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.933794 kubelet[3162]: E0317 18:20:48.933217 3162 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:48.933794 kubelet[3162]: E0317 18:20:48.933302 3162 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6f879dc54f-5nvvh" Mar 17 18:20:48.933794 kubelet[3162]: E0317 18:20:48.933370 3162 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6f879dc54f-5nvvh" Mar 17 18:20:48.935930 kubelet[3162]: E0317 18:20:48.933451 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6f879dc54f-5nvvh_calico-system(049cf128-4b21-4a4b-8889-b2f735eb419e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6f879dc54f-5nvvh_calico-system(049cf128-4b21-4a4b-8889-b2f735eb419e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6f879dc54f-5nvvh" podUID="049cf128-4b21-4a4b-8889-b2f735eb419e" Mar 17 18:20:48.934565 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d-shm.mount: Deactivated successfully. Mar 17 18:20:49.213039 kubelet[3162]: E0317 18:20:49.211834 3162 configmap.go:199] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Mar 17 18:20:49.213039 kubelet[3162]: E0317 18:20:49.211969 3162 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/cd45a415-9af8-4b7c-ac94-5ad5f9e3b710-config-volume podName:cd45a415-9af8-4b7c-ac94-5ad5f9e3b710 nodeName:}" failed. No retries permitted until 2025-03-17 18:20:49.711938345 +0000 UTC m=+37.747805737 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/cd45a415-9af8-4b7c-ac94-5ad5f9e3b710-config-volume") pod "coredns-7db6d8ff4d-krrrk" (UID: "cd45a415-9af8-4b7c-ac94-5ad5f9e3b710") : failed to sync configmap cache: timed out waiting for the condition Mar 17 18:20:49.213039 kubelet[3162]: E0317 18:20:49.212706 3162 configmap.go:199] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Mar 17 18:20:49.213039 kubelet[3162]: E0317 18:20:49.212826 3162 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1-config-volume podName:3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1 nodeName:}" failed. No retries permitted until 2025-03-17 18:20:49.712799834 +0000 UTC m=+37.748667226 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1-config-volume") pod "coredns-7db6d8ff4d-8bbf2" (UID: "3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1") : failed to sync configmap cache: timed out waiting for the condition Mar 17 18:20:49.321992 env[1923]: time="2025-03-17T18:20:49.321916445Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pv4xb,Uid:7af73a1a-8033-4ba4-ba19-078aeb2052b7,Namespace:calico-system,Attempt:0,}" Mar 17 18:20:49.441694 env[1923]: time="2025-03-17T18:20:49.441600810Z" level=error msg="Failed to destroy network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.446177 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52-shm.mount: Deactivated successfully. Mar 17 18:20:49.447961 env[1923]: time="2025-03-17T18:20:49.447891674Z" level=error msg="encountered an error cleaning up failed sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.448474 env[1923]: time="2025-03-17T18:20:49.448284504Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pv4xb,Uid:7af73a1a-8033-4ba4-ba19-078aeb2052b7,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.449844 kubelet[3162]: E0317 18:20:49.449775 3162 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.450013 kubelet[3162]: E0317 18:20:49.449882 3162 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pv4xb" Mar 17 18:20:49.450013 kubelet[3162]: E0317 18:20:49.449940 3162 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-pv4xb" Mar 17 18:20:49.450153 kubelet[3162]: E0317 
18:20:49.450036 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-pv4xb_calico-system(7af73a1a-8033-4ba4-ba19-078aeb2052b7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-pv4xb_calico-system(7af73a1a-8033-4ba4-ba19-078aeb2052b7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-pv4xb" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" Mar 17 18:20:49.478227 kubelet[3162]: I0317 18:20:49.477891 3162 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:20:49.483366 env[1923]: time="2025-03-17T18:20:49.483198867Z" level=info msg="StopPodSandbox for \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\"" Mar 17 18:20:49.486223 kubelet[3162]: I0317 18:20:49.486172 3162 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:20:49.489960 env[1923]: time="2025-03-17T18:20:49.488404504Z" level=info msg="StopPodSandbox for \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\"" Mar 17 18:20:49.492991 kubelet[3162]: I0317 18:20:49.492923 3162 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:20:49.496993 env[1923]: time="2025-03-17T18:20:49.495393393Z" level=info msg="StopPodSandbox for \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\"" Mar 17 18:20:49.507498 env[1923]: time="2025-03-17T18:20:49.503298034Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Mar 17 18:20:49.509082 kubelet[3162]: I0317 18:20:49.508832 3162 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:20:49.511521 env[1923]: time="2025-03-17T18:20:49.511231019Z" level=info msg="StopPodSandbox for \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\"" Mar 17 18:20:49.663755 env[1923]: time="2025-03-17T18:20:49.663622872Z" level=error msg="StopPodSandbox for \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\" failed" error="failed to destroy network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.664223 kubelet[3162]: E0317 18:20:49.664153 3162 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:20:49.664446 kubelet[3162]: E0317 18:20:49.664259 3162 
kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b"} Mar 17 18:20:49.664570 kubelet[3162]: E0317 18:20:49.664465 3162 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3ad78572-b2b8-4d59-a3a2-ea0333361bca\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:20:49.664570 kubelet[3162]: E0317 18:20:49.664513 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3ad78572-b2b8-4d59-a3a2-ea0333361bca\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56f9f75749-5vmjt" podUID="3ad78572-b2b8-4d59-a3a2-ea0333361bca" Mar 17 18:20:49.668095 env[1923]: time="2025-03-17T18:20:49.667994093Z" level=error msg="StopPodSandbox for \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\" failed" error="failed to destroy network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.668586 kubelet[3162]: E0317 18:20:49.668501 3162 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:20:49.668729 kubelet[3162]: E0317 18:20:49.668623 3162 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210"} Mar 17 18:20:49.668729 kubelet[3162]: E0317 18:20:49.668682 3162 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:20:49.668900 kubelet[3162]: E0317 18:20:49.668746 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-56f9f75749-9hqth" podUID="bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07" Mar 17 18:20:49.670702 env[1923]: time="2025-03-17T18:20:49.670621649Z" level=error msg="StopPodSandbox for \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\" failed" error="failed to destroy network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.671124 kubelet[3162]: E0317 18:20:49.670955 3162 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:20:49.671391 kubelet[3162]: E0317 18:20:49.671145 3162 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52"} Mar 17 18:20:49.671391 kubelet[3162]: E0317 18:20:49.671226 3162 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7af73a1a-8033-4ba4-ba19-078aeb2052b7\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:20:49.671391 kubelet[3162]: E0317 18:20:49.671295 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7af73a1a-8033-4ba4-ba19-078aeb2052b7\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-pv4xb" podUID="7af73a1a-8033-4ba4-ba19-078aeb2052b7" Mar 17 18:20:49.671682 env[1923]: time="2025-03-17T18:20:49.671597413Z" level=error msg="StopPodSandbox for \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\" failed" error="failed to destroy network for sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.671884 kubelet[3162]: E0317 18:20:49.671829 3162 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox 
\"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:20:49.671978 kubelet[3162]: E0317 18:20:49.671896 3162 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d"} Mar 17 18:20:49.671978 kubelet[3162]: E0317 18:20:49.671947 3162 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"049cf128-4b21-4a4b-8889-b2f735eb419e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:20:49.672143 kubelet[3162]: E0317 18:20:49.671988 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"049cf128-4b21-4a4b-8889-b2f735eb419e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6f879dc54f-5nvvh" podUID="049cf128-4b21-4a4b-8889-b2f735eb419e" Mar 17 18:20:49.821974 env[1923]: time="2025-03-17T18:20:49.821811419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8bbf2,Uid:3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1,Namespace:kube-system,Attempt:0,}" Mar 17 18:20:49.826055 env[1923]: time="2025-03-17T18:20:49.825982769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-krrrk,Uid:cd45a415-9af8-4b7c-ac94-5ad5f9e3b710,Namespace:kube-system,Attempt:0,}" Mar 17 18:20:49.981492 env[1923]: time="2025-03-17T18:20:49.981416788Z" level=error msg="Failed to destroy network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.990114 env[1923]: time="2025-03-17T18:20:49.990002019Z" level=error msg="encountered an error cleaning up failed sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.990536 env[1923]: time="2025-03-17T18:20:49.990451225Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-krrrk,Uid:cd45a415-9af8-4b7c-ac94-5ad5f9e3b710,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.991142 kubelet[3162]: E0317 18:20:49.991057 3162 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:49.991536 kubelet[3162]: E0317 18:20:49.991410 3162 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-krrrk" Mar 17 18:20:49.991713 kubelet[3162]: E0317 18:20:49.991488 3162 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-krrrk" Mar 17 18:20:49.991981 kubelet[3162]: E0317 18:20:49.991893 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-krrrk_kube-system(cd45a415-9af8-4b7c-ac94-5ad5f9e3b710)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-krrrk_kube-system(cd45a415-9af8-4b7c-ac94-5ad5f9e3b710)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-krrrk" podUID="cd45a415-9af8-4b7c-ac94-5ad5f9e3b710" Mar 17 18:20:50.001316 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4-shm.mount: Deactivated successfully. 
Mar 17 18:20:50.032063 env[1923]: time="2025-03-17T18:20:50.031991131Z" level=error msg="Failed to destroy network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:50.037370 env[1923]: time="2025-03-17T18:20:50.037266356Z" level=error msg="encountered an error cleaning up failed sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:50.037629 env[1923]: time="2025-03-17T18:20:50.037575919Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8bbf2,Uid:3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:50.038760 kubelet[3162]: E0317 18:20:50.038054 3162 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:50.038760 kubelet[3162]: E0317 18:20:50.038153 3162 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8bbf2" Mar 17 18:20:50.038760 kubelet[3162]: E0317 18:20:50.038188 3162 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-8bbf2" Mar 17 18:20:50.039071 kubelet[3162]: E0317 18:20:50.038277 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-8bbf2_kube-system(3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-8bbf2_kube-system(3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8bbf2" 
podUID="3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1" Mar 17 18:20:50.041367 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988-shm.mount: Deactivated successfully. Mar 17 18:20:50.513282 kubelet[3162]: I0317 18:20:50.513237 3162 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:20:50.516209 env[1923]: time="2025-03-17T18:20:50.514659740Z" level=info msg="StopPodSandbox for \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\"" Mar 17 18:20:50.518265 kubelet[3162]: I0317 18:20:50.518206 3162 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:20:50.520311 env[1923]: time="2025-03-17T18:20:50.519443039Z" level=info msg="StopPodSandbox for \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\"" Mar 17 18:20:50.596645 env[1923]: time="2025-03-17T18:20:50.596541984Z" level=error msg="StopPodSandbox for \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\" failed" error="failed to destroy network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:50.596975 kubelet[3162]: E0317 18:20:50.596893 3162 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:20:50.597092 kubelet[3162]: E0317 18:20:50.597034 3162 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988"} Mar 17 18:20:50.597173 kubelet[3162]: E0317 18:20:50.597117 3162 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:20:50.597286 kubelet[3162]: E0317 18:20:50.597158 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-8bbf2" podUID="3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1" Mar 17 18:20:50.605921 env[1923]: time="2025-03-17T18:20:50.605836783Z" level=error 
msg="StopPodSandbox for \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\" failed" error="failed to destroy network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:20:50.607261 kubelet[3162]: E0317 18:20:50.606192 3162 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:20:50.607261 kubelet[3162]: E0317 18:20:50.606356 3162 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4"} Mar 17 18:20:50.607261 kubelet[3162]: E0317 18:20:50.606452 3162 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"cd45a415-9af8-4b7c-ac94-5ad5f9e3b710\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:20:50.607261 kubelet[3162]: E0317 18:20:50.606496 3162 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"cd45a415-9af8-4b7c-ac94-5ad5f9e3b710\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-krrrk" podUID="cd45a415-9af8-4b7c-ac94-5ad5f9e3b710" Mar 17 18:20:57.675532 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount101396409.mount: Deactivated successfully. 
Mar 17 18:20:57.750082 env[1923]: time="2025-03-17T18:20:57.750023404Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:57.755414 env[1923]: time="2025-03-17T18:20:57.755327418Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:57.758602 env[1923]: time="2025-03-17T18:20:57.758541689Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:57.761682 env[1923]: time="2025-03-17T18:20:57.761620889Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:20:57.762859 env[1923]: time="2025-03-17T18:20:57.762796296Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:680b8c280812d12c035ca9f0deedea7c761afe0f1cc65109ea2f96bf63801758\"" Mar 17 18:20:57.799878 env[1923]: time="2025-03-17T18:20:57.799818546Z" level=info msg="CreateContainer within sandbox \"697355467225ad3ac2f278a9f0174bd0a5705e42b72e07793bda04eb9c77a019\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 18:20:57.834099 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2095317211.mount: Deactivated successfully. Mar 17 18:20:57.838327 env[1923]: time="2025-03-17T18:20:57.838243986Z" level=info msg="CreateContainer within sandbox \"697355467225ad3ac2f278a9f0174bd0a5705e42b72e07793bda04eb9c77a019\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"3c8abc5333ad3e3319cf7dac17c6c0821f35b7609605dcae367cc29c9410e227\"" Mar 17 18:20:57.840743 env[1923]: time="2025-03-17T18:20:57.839262338Z" level=info msg="StartContainer for \"3c8abc5333ad3e3319cf7dac17c6c0821f35b7609605dcae367cc29c9410e227\"" Mar 17 18:20:57.958918 env[1923]: time="2025-03-17T18:20:57.958761789Z" level=info msg="StartContainer for \"3c8abc5333ad3e3319cf7dac17c6c0821f35b7609605dcae367cc29c9410e227\" returns successfully" Mar 17 18:20:58.109775 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 18:20:58.109941 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld <Jason@zx2c4.com>. All Rights Reserved. 
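Illustrative aside, not part of the journal: the audit records that follow report the invoked command line in their PROCTITLE field as hex-encoded argv with NUL separators. A short Python sketch for decoding such a value is shown here; the sample string is copied verbatim from the bpftool events below, and the helper name is hypothetical.

def decode_proctitle(hex_value: str) -> str:
    """Turn an audit PROCTITLE hex string into a readable command line."""
    raw = bytes.fromhex(hex_value)
    # argv elements are separated by NUL bytes in the audit record
    return " ".join(part.decode() for part in raw.split(b"\x00") if part)

if __name__ == "__main__":
    # Value taken from the bpftool audit events below.
    print(decode_proctitle("627066746F6F6C006D6170006C697374002D2D6A736F6E"))
    # prints: bpftool map list --json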
Mar 17 18:20:59.886000 audit[4366]: AVC avc: denied { write } for pid=4366 comm="tee" name="fd" dev="proc" ino=21401 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:20:59.886000 audit[4366]: SYSCALL arch=c00000b7 syscall=56 success=yes exit=3 a0=ffffffffffffff9c a1=ffffdb778a1a a2=241 a3=1b6 items=1 ppid=4337 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:59.908488 kernel: audit: type=1400 audit(1742235659.886:296): avc: denied { write } for pid=4366 comm="tee" name="fd" dev="proc" ino=21401 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:20:59.908634 kernel: audit: type=1300 audit(1742235659.886:296): arch=c00000b7 syscall=56 success=yes exit=3 a0=ffffffffffffff9c a1=ffffdb778a1a a2=241 a3=1b6 items=1 ppid=4337 pid=4366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:59.886000 audit: CWD cwd="/etc/service/enabled/cni/log" Mar 17 18:20:59.886000 audit: PATH item=0 name="/dev/fd/63" inode=21532 dev=00:0b mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:20:59.922823 kernel: audit: type=1307 audit(1742235659.886:296): cwd="/etc/service/enabled/cni/log" Mar 17 18:20:59.922975 kernel: audit: type=1302 audit(1742235659.886:296): item=0 name="/dev/fd/63" inode=21532 dev=00:0b mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:20:59.886000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:20:59.946871 kernel: audit: type=1327 audit(1742235659.886:296): proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:20:59.914000 audit[4375]: AVC avc: denied { write } for pid=4375 comm="tee" name="fd" dev="proc" ino=21415 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:20:59.955473 kernel: audit: type=1400 audit(1742235659.914:297): avc: denied { write } for pid=4375 comm="tee" name="fd" dev="proc" ino=21415 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:20:59.914000 audit[4375]: SYSCALL arch=c00000b7 syscall=56 success=yes exit=3 a0=ffffffffffffff9c a1=ffffd6fdba18 a2=241 a3=1b6 items=1 ppid=4339 pid=4375 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:59.966105 kernel: audit: type=1300 audit(1742235659.914:297): arch=c00000b7 syscall=56 success=yes exit=3 a0=ffffffffffffff9c a1=ffffd6fdba18 a2=241 a3=1b6 items=1 ppid=4339 pid=4375 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:59.914000 audit: CWD cwd="/etc/service/enabled/felix/log" Mar 17 
18:20:59.982905 kernel: audit: type=1307 audit(1742235659.914:297): cwd="/etc/service/enabled/felix/log" Mar 17 18:20:59.914000 audit: PATH item=0 name="/dev/fd/63" inode=21411 dev=00:0b mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:20:59.990213 kernel: audit: type=1302 audit(1742235659.914:297): item=0 name="/dev/fd/63" inode=21411 dev=00:0b mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:20:59.914000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:20:59.920000 audit[4380]: AVC avc: denied { write } for pid=4380 comm="tee" name="fd" dev="proc" ino=21419 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:21:00.017510 kernel: audit: type=1327 audit(1742235659.914:297): proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:20:59.920000 audit[4380]: SYSCALL arch=c00000b7 syscall=56 success=yes exit=3 a0=ffffffffffffff9c a1=fffff2164a18 a2=241 a3=1b6 items=1 ppid=4361 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:20:59.920000 audit: CWD cwd="/etc/service/enabled/bird6/log" Mar 17 18:20:59.920000 audit: PATH item=0 name="/dev/fd/63" inode=21412 dev=00:0b mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:20:59.920000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:21:00.030000 audit[4405]: AVC avc: denied { write } for pid=4405 comm="tee" name="fd" dev="proc" ino=21561 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:21:00.030000 audit[4405]: SYSCALL arch=c00000b7 syscall=56 success=yes exit=3 a0=ffffffffffffff9c a1=ffffde718a08 a2=241 a3=1b6 items=1 ppid=4351 pid=4405 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.030000 audit: CWD cwd="/etc/service/enabled/allocate-tunnel-addrs/log" Mar 17 18:21:00.030000 audit: PATH item=0 name="/dev/fd/63" inode=21556 dev=00:0b mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:21:00.030000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:21:00.037000 audit[4398]: AVC avc: denied { write } for pid=4398 comm="tee" name="fd" dev="proc" ino=21567 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:21:00.040000 audit[4403]: AVC avc: denied { write } for pid=4403 comm="tee" name="fd" dev="proc" ino=21568 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir 
permissive=0 Mar 17 18:21:00.042000 audit[4397]: AVC avc: denied { write } for pid=4397 comm="tee" name="fd" dev="proc" ino=21569 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:21:00.040000 audit[4403]: SYSCALL arch=c00000b7 syscall=56 success=yes exit=3 a0=ffffffffffffff9c a1=fffffcd4ca09 a2=241 a3=1b6 items=1 ppid=4346 pid=4403 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.040000 audit: CWD cwd="/etc/service/enabled/node-status-reporter/log" Mar 17 18:21:00.040000 audit: PATH item=0 name="/dev/fd/63" inode=21430 dev=00:0b mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:21:00.040000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:21:00.042000 audit[4397]: SYSCALL arch=c00000b7 syscall=56 success=yes exit=3 a0=ffffffffffffff9c a1=ffffd753ea18 a2=241 a3=1b6 items=1 ppid=4340 pid=4397 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.042000 audit: CWD cwd="/etc/service/enabled/confd/log" Mar 17 18:21:00.042000 audit: PATH item=0 name="/dev/fd/63" inode=21429 dev=00:0b mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:21:00.042000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:21:00.037000 audit[4398]: SYSCALL arch=c00000b7 syscall=56 success=yes exit=3 a0=ffffffffffffff9c a1=ffffeee1ca19 a2=241 a3=1b6 items=1 ppid=4336 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.037000 audit: CWD cwd="/etc/service/enabled/bird/log" Mar 17 18:21:00.037000 audit: PATH item=0 name="/dev/fd/63" inode=21553 dev=00:0b mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:21:00.037000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:21:00.411000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.411000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.411000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.411000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.411000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.411000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.411000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.411000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.411000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.411000 audit: BPF prog-id=10 op=LOAD Mar 17 18:21:00.411000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe6221d38 a2=98 a3=ffffe6221d28 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.411000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.414000 audit: BPF prog-id=10 op=UNLOAD Mar 17 18:21:00.415000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.415000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.415000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.415000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.415000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.415000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.415000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.415000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.415000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.415000 audit: BPF prog-id=11 op=LOAD Mar 17 18:21:00.415000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe62219c8 a2=74 a3=95 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.415000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.430000 audit: BPF prog-id=11 op=UNLOAD Mar 17 18:21:00.430000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.430000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.430000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.430000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.430000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.430000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.430000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.430000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.430000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.430000 audit: BPF prog-id=12 op=LOAD Mar 17 18:21:00.430000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe6221a28 a2=94 a3=2 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.430000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.430000 audit: BPF prog-id=12 op=UNLOAD Mar 17 18:21:00.639000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.639000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.639000 audit[4429]: AVC avc: denied { 
perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.639000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.639000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.639000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.639000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.639000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.639000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.639000 audit: BPF prog-id=13 op=LOAD Mar 17 18:21:00.639000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe62219e8 a2=40 a3=ffffe6221a18 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.639000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.640000 audit: BPF prog-id=13 op=UNLOAD Mar 17 18:21:00.641000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.641000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=0 a1=ffffe6221b00 a2=50 a3=0 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.641000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.659000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.659000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe6221a58 a2=28 a3=ffffe6221b88 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.659000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.660000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.660000 audit[4429]: SYSCALL 
arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe6221a88 a2=28 a3=ffffe6221bb8 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.660000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.660000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.660000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe6221938 a2=28 a3=ffffe6221a68 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.660000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.661000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.661000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe6221aa8 a2=28 a3=ffffe6221bd8 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.661000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.662000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.662000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe6221a88 a2=28 a3=ffffe6221bb8 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.662000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.663000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.663000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe6221a78 a2=28 a3=ffffe6221ba8 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.663000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.663000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.663000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe6221aa8 a2=28 a3=ffffe6221bd8 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) 
Mar 17 18:21:00.663000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.664000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.664000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe6221a88 a2=28 a3=ffffe6221bb8 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.664000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.664000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.664000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe6221aa8 a2=28 a3=ffffe6221bd8 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.664000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.664000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.664000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe6221a78 a2=28 a3=ffffe6221ba8 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.664000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.665000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.665000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe6221af8 a2=28 a3=ffffe6221c38 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.665000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=0 a1=ffffe6221830 a2=50 a3=0 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.666000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit: BPF prog-id=14 op=LOAD Mar 17 18:21:00.666000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe6221838 a2=94 a3=5 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.666000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.666000 audit: BPF prog-id=14 op=UNLOAD Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=0 a1=ffffe6221940 a2=50 a3=0 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.666000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=0 a0=16 a1=ffffe6221a88 a2=4 a3=3 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.666000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 
18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { confidentiality } for pid=4429 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:21:00.666000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=5 a1=ffffe6221a68 a2=94 a3=6 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.666000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" 
capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.666000 audit[4429]: AVC avc: denied { confidentiality } for pid=4429 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:21:00.666000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=5 a1=ffffe6221238 a2=94 a3=83 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.666000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { perfmon } for pid=4429 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { bpf } for pid=4429 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.667000 audit[4429]: AVC avc: denied { confidentiality } for pid=4429 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:21:00.667000 audit[4429]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=5 a1=ffffe6221238 a2=94 a3=83 items=0 ppid=4345 pid=4429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.667000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit: BPF prog-id=15 op=LOAD Mar 17 18:21:00.690000 audit[4451]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffed204768 a2=98 a3=ffffed204758 items=0 ppid=4345 pid=4451 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.690000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:21:00.690000 audit: BPF prog-id=15 op=UNLOAD Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.690000 audit: BPF prog-id=16 op=LOAD Mar 17 18:21:00.690000 audit[4451]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffed204618 a2=74 a3=95 items=0 ppid=4345 pid=4451 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.690000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:21:00.691000 audit: BPF prog-id=16 op=UNLOAD Mar 17 18:21:00.691000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.691000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.691000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.691000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.691000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.691000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.691000 audit[4451]: AVC avc: denied { perfmon } for pid=4451 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.691000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.691000 audit[4451]: AVC avc: denied { bpf } for pid=4451 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:00.691000 audit: BPF prog-id=17 op=LOAD Mar 17 18:21:00.691000 audit[4451]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffed204648 a2=40 a3=ffffed204678 items=0 ppid=4345 pid=4451 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:00.691000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:21:00.691000 audit: BPF prog-id=17 op=UNLOAD Mar 17 18:21:00.851396 systemd-networkd[1583]: vxlan.calico: Link UP Mar 17 18:21:00.851417 systemd-networkd[1583]: vxlan.calico: Gained carrier Mar 17 18:21:00.858437 (udev-worker)[4472]: Network interface NamePolicy= disabled on kernel command line. Mar 17 18:21:01.037056 (udev-worker)[4302]: Network interface NamePolicy= disabled on kernel command line. 
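The PROCTITLE fields in the audit records above are the audited process's argv, hex-encoded with NUL bytes separating the arguments. Decoded, the records from pid 4429 correspond to "bpftool map list --json", and the ones from pid 4451 to "bpftool map create /sys/fs/bpf/calico/calico_failsafe_ports_v1 type hash key 4 value 1 entries 65535 name calico_failsafe_ports_", where the final argument appears cut off at audit's 128-byte proctitle limit. Below is a minimal Python sketch for decoding such values by hand; the script name and command-line shape are illustrative assumptions, not tooling referenced by this log (ausearch -i performs the same interpretation).

# decode_proctitle.py - turn the hex PROCTITLE value of an audit record back into a command line.
# Illustrative helper, not part of the logged system.
import sys

def decode_proctitle(hex_value: str) -> str:
    # audit records argv as hex bytes, with NUL separators between arguments
    raw = bytes.fromhex(hex_value)
    return " ".join(part.decode("utf-8", "replace") for part in raw.split(b"\x00") if part)

if __name__ == "__main__":
    for value in sys.argv[1:]:
        print(decode_proctitle(value))

Passing the proctitle value logged at 18:21:00.690000 prints the bpftool map create invocation described above; the value at 18:21:00.666000 prints the map list call.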
Mar 17 18:21:01.041000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.041000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.041000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.041000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.041000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.041000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.041000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.041000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.041000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.041000 audit: BPF prog-id=18 op=LOAD Mar 17 18:21:01.041000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb8fccc8 a2=98 a3=ffffeb8fccb8 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.041000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.041000 audit: BPF prog-id=18 op=UNLOAD Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon 
} for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit: BPF prog-id=19 op=LOAD Mar 17 18:21:01.042000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb8fc9a8 a2=74 a3=95 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.042000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.042000 audit: BPF prog-id=19 op=UNLOAD Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.042000 audit: BPF prog-id=20 op=LOAD Mar 17 18:21:01.042000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb8fca08 a2=94 a3=2 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.042000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.043000 audit: BPF prog-id=20 op=UNLOAD Mar 17 18:21:01.043000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.043000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=12 a1=ffffeb8fca38 a2=28 a3=ffffeb8fcb68 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.043000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.043000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.043000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffeb8fca68 a2=28 a3=ffffeb8fcb98 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.043000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.043000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.043000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffeb8fc918 a2=28 a3=ffffeb8fca48 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.043000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.043000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.043000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=12 
a1=ffffeb8fca88 a2=28 a3=ffffeb8fcbb8 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.043000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.043000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.043000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=12 a1=ffffeb8fca68 a2=28 a3=ffffeb8fcb98 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.043000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.043000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.043000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=12 a1=ffffeb8fca58 a2=28 a3=ffffeb8fcb88 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.043000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.043000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.043000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=12 a1=ffffeb8fca88 a2=28 a3=ffffeb8fcbb8 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.043000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.043000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.043000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffeb8fca68 a2=28 a3=ffffeb8fcb98 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.043000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.043000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.043000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffeb8fca88 a2=28 a3=ffffeb8fcbb8 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.043000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffeb8fca58 a2=28 a3=ffffeb8fcb88 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.044000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=12 a1=ffffeb8fcad8 a2=28 a3=ffffeb8fcc18 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.044000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: 
AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.044000 audit: BPF prog-id=21 op=LOAD Mar 17 18:21:01.044000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffeb8fc8f8 a2=40 a3=ffffeb8fc928 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.044000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.046000 audit: BPF prog-id=21 op=UNLOAD Mar 17 18:21:01.047000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.047000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=0 a1=ffffeb8fc920 a2=50 a3=0 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.047000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=0 a1=ffffeb8fc920 a2=50 a3=0 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.050000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.050000 audit: BPF prog-id=22 op=LOAD Mar 17 18:21:01.050000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffeb8fc088 a2=94 a3=2 items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.050000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.051000 audit: BPF prog-id=22 op=UNLOAD Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { perfmon } 
for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { perfmon } for pid=4481 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit[4481]: AVC avc: denied { bpf } for pid=4481 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.051000 audit: BPF prog-id=23 op=LOAD Mar 17 18:21:01.051000 audit[4481]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffeb8fc218 a2=94 a3=2d items=0 ppid=4345 pid=4481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.051000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:21:01.067000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.067000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.067000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.067000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.067000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.067000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.067000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.067000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 18:21:01.067000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.067000 audit: BPF prog-id=24 op=LOAD Mar 17 18:21:01.067000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe9b3d958 a2=98 a3=ffffe9b3d948 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.067000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.067000 audit: BPF prog-id=24 op=UNLOAD Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit: BPF prog-id=25 op=LOAD Mar 17 18:21:01.068000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe9b3d5e8 a2=74 a3=95 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.068000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.068000 audit: BPF prog-id=25 op=UNLOAD Mar 17 18:21:01.068000 audit[4488]: 
AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.068000 audit: BPF prog-id=26 op=LOAD Mar 17 18:21:01.068000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe9b3d648 a2=94 a3=2 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.068000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.068000 audit: BPF prog-id=26 op=UNLOAD Mar 17 18:21:01.178000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-172.31.21.220:22-139.178.89.65:44276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:01.179536 systemd[1]: Started sshd@7-172.31.21.220:22-139.178.89.65:44276.service. 
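Nearly every denial in this stretch has the same shape: comm="bpftool" (all with ppid=4345) tripping capability2 checks for capability 38 (CAP_PERFMON) and capability 39 (CAP_BPF) in the kernel_t domain with permissive=0, while the paired SYSCALL records for syscall 280 (bpf(2) on arm64, arch=c00000b7) show a mix of success=yes and success=no exit=-22; a few confidentiality denials in the lockdown class (lockdown_reason="use of bpf to read kernel RAM") are mixed in. Below is a rough Python sketch for tallying denials in a dump like this; the script name and regular expression are illustrative assumptions, not tooling referenced by the log.

# audit_avc_summary.py - tally AVC denials by comm and requested permission.
# Illustrative helper that assumes journal-style input as shown in this log.
import re
import sys
from collections import Counter

# DOTALL lets a match span records that wrap across physical lines in the dump
AVC_RE = re.compile(r'AVC avc:\s+denied\s+\{ (?P<perm>[^}]+) \}.*?comm="(?P<comm>[^"]+)"', re.DOTALL)

def summarize(text: str) -> Counter:
    return Counter((m.group("comm"), m.group("perm").strip()) for m in AVC_RE.finditer(text))

if __name__ == "__main__":
    for (comm, perm), count in summarize(sys.stdin.read()).most_common():
        print(f"{count:6d}  comm={comm}  denied={perm}")

Piped this portion of the journal, the bpf and perfmon denials from bpftool dominate the output, with a handful of confidentiality (lockdown) entries behind them.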
Mar 17 18:21:01.275000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.275000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.275000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.275000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.275000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.275000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.275000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.275000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.275000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.275000 audit: BPF prog-id=27 op=LOAD Mar 17 18:21:01.275000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffe9b3d608 a2=40 a3=ffffe9b3d638 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.275000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.276000 audit: BPF prog-id=27 op=UNLOAD Mar 17 18:21:01.276000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.276000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=0 a1=ffffe9b3d720 a2=50 a3=0 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.276000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.291000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.291000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe9b3d678 a2=28 a3=ffffe9b3d7a8 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.291000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe9b3d6a8 a2=28 a3=ffffe9b3d7d8 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe9b3d558 a2=28 a3=ffffe9b3d688 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe9b3d6c8 a2=28 a3=ffffe9b3d7f8 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe9b3d6a8 a2=28 a3=ffffe9b3d7d8 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 
18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe9b3d698 a2=28 a3=ffffe9b3d7c8 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe9b3d6c8 a2=28 a3=ffffe9b3d7f8 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe9b3d6a8 a2=28 a3=ffffe9b3d7d8 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe9b3d6c8 a2=28 a3=ffffe9b3d7f8 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=12 a1=ffffe9b3d698 a2=28 a3=ffffe9b3d7c8 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=12 a1=ffffe9b3d718 a2=28 a3=ffffe9b3d858 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=0 a1=ffffe9b3d450 a2=50 a3=0 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit: BPF prog-id=28 op=LOAD Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe9b3d458 a2=94 a3=5 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit: BPF prog-id=28 op=UNLOAD Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=0 a1=ffffe9b3d560 a2=50 a3=0 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.292000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.292000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=0 a0=16 a1=ffffe9b3d6a8 a2=4 a3=3 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.292000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.293000 audit[4488]: AVC avc: denied { confidentiality } for pid=4488 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:21:01.293000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=5 a1=ffffe9b3d688 a2=94 a3=6 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.293000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.294000 audit[4488]: AVC avc: denied { confidentiality } for pid=4488 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:21:01.294000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=5 a1=ffffe9b3ce58 a2=94 a3=83 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.294000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.295000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.295000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.295000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.295000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.295000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.295000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.295000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.295000 audit[4488]: AVC avc: denied { perfmon } for pid=4488 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.295000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=no exit=-22 a0=5 a1=ffffe9b3ce58 a2=94 a3=83 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.295000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.296000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.296000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=0 a0=f a1=ffffe9b3e898 a2=10 a3=ffffe9b3e988 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.296000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.297000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.297000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=0 a0=f a1=ffffe9b3e758 a2=10 a3=ffffe9b3e848 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.297000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.297000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.297000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=0 a0=f a1=ffffe9b3e6c8 a2=10 a3=ffffe9b3e848 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.297000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.297000 audit[4488]: AVC avc: denied { bpf } for pid=4488 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:21:01.297000 audit[4488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=0 a0=f a1=ffffe9b3e6c8 a2=10 a3=ffffe9b3e848 items=0 ppid=4345 pid=4488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.297000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:21:01.307000 audit: BPF prog-id=23 op=UNLOAD Mar 17 18:21:01.307000 audit[1518]: SYSCALL arch=c00000b7 
syscall=56 success=no exit=-2 a0=ffffffffffffff9c a1=aaaaeb49e8e0 a2=80802 a3=0 items=0 ppid=1 pid=1518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.307000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Mar 17 18:21:01.324310 env[1923]: time="2025-03-17T18:21:01.322123021Z" level=info msg="StopPodSandbox for \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\"" Mar 17 18:21:01.324310 env[1923]: time="2025-03-17T18:21:01.322236519Z" level=info msg="StopPodSandbox for \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\"" Mar 17 18:21:01.376000 audit[4491]: USER_ACCT pid=4491 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:01.379519 sshd[4491]: Accepted publickey for core from 139.178.89.65 port 44276 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:01.381000 audit[4491]: CRED_ACQ pid=4491 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:01.381000 audit[4491]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffc3009630 a2=3 a3=1 items=0 ppid=1 pid=4491 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.381000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:01.386280 sshd[4491]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:01.399532 systemd[1]: Started session-8.scope. Mar 17 18:21:01.400021 systemd-logind[1906]: New session 8 of user core. 
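The proctitle= fields in the audit records above and below are hex-encoded command lines, with NUL bytes separating the arguments; the kernel only hex-encodes the field when it contains non-printable bytes, which is why proctitle="/usr/lib/systemd/systemd-journald" appears in plain text. Decoded, the long bpftool record reads "bpftool --json --pretty prog show pinned /sys/fs/bpf/calico/xdp/prefilter_v1_calico_tmp_A", the short sshd record reads "sshd: core [priv]", and the iptables records further down read "iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000". Where the audit userspace tools are available, ausearch -i performs this interpretation; the following is a minimal Python sketch of the same decoding (the helper name is mine, not part of any tool in this log):

```python
def decode_proctitle(hex_value: str) -> str:
    """Decode an audit PROCTITLE hex string into a readable command line.

    The field is the raw process title; argv entries are separated by NUL
    bytes, so splitting on NUL recovers the individual arguments.
    """
    raw = bytes.fromhex(hex_value)
    return " ".join(part.decode("utf-8", "replace") for part in raw.split(b"\x00") if part)


if __name__ == "__main__":
    # The short sshd record from the session above; the longer bpftool and
    # iptables-nft-restore records decode the same way.
    print(decode_proctitle("737368643A20636F7265205B707269765D"))  # sshd: core [priv]
```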
Mar 17 18:21:01.438000 audit[4491]: USER_START pid=4491 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:01.441000 audit[4548]: CRED_ACQ pid=4548 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:01.489000 audit[4554]: NETFILTER_CFG table=mangle:97 family=2 entries=16 op=nft_register_chain pid=4554 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:01.489000 audit[4554]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=fffffaaa5dc0 a2=0 a3=ffff835d3fa8 items=0 ppid=4345 pid=4554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.489000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:01.512000 audit[4555]: NETFILTER_CFG table=nat:98 family=2 entries=15 op=nft_register_chain pid=4555 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:01.512000 audit[4555]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffe6dfd730 a2=0 a3=ffffab2a0fa8 items=0 ppid=4345 pid=4555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.512000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:01.523000 audit[4556]: NETFILTER_CFG table=filter:99 family=2 entries=39 op=nft_register_chain pid=4556 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:01.523000 audit[4556]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=18968 a0=3 a1=ffffc7d8cc30 a2=0 a3=ffff86447fa8 items=0 ppid=4345 pid=4556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.523000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:01.545000 audit[4553]: NETFILTER_CFG table=raw:100 family=2 entries=21 op=nft_register_chain pid=4553 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:01.545000 audit[4553]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=fffff6c11e80 a2=0 a3=ffff8f644fa8 items=0 ppid=4345 pid=4553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:01.545000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 
18:21:01.618411 kubelet[3162]: I0317 18:21:01.615921 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-5wz9m" podStartSLOduration=5.201127538 podStartE2EDuration="26.615901059s" podCreationTimestamp="2025-03-17 18:20:35 +0000 UTC" firstStartedPulling="2025-03-17 18:20:36.349454357 +0000 UTC m=+24.385321737" lastFinishedPulling="2025-03-17 18:20:57.764227878 +0000 UTC m=+45.800095258" observedRunningTime="2025-03-17 18:20:58.587564545 +0000 UTC m=+46.623432045" watchObservedRunningTime="2025-03-17 18:21:01.615901059 +0000 UTC m=+49.651768439" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.623 [INFO][4536] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.624 [INFO][4536] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" iface="eth0" netns="/var/run/netns/cni-fad10b12-a94d-0938-e58c-fdc4f5d06832" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.624 [INFO][4536] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" iface="eth0" netns="/var/run/netns/cni-fad10b12-a94d-0938-e58c-fdc4f5d06832" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.626 [INFO][4536] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" iface="eth0" netns="/var/run/netns/cni-fad10b12-a94d-0938-e58c-fdc4f5d06832" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.626 [INFO][4536] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.626 [INFO][4536] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.741 [INFO][4574] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" HandleID="k8s-pod-network.d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.742 [INFO][4574] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.742 [INFO][4574] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.759 [WARNING][4574] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" HandleID="k8s-pod-network.d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.759 [INFO][4574] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" HandleID="k8s-pod-network.d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.766 [INFO][4574] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:01.798497 env[1923]: 2025-03-17 18:21:01.789 [INFO][4536] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:01.800242 env[1923]: time="2025-03-17T18:21:01.800186285Z" level=info msg="TearDown network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\" successfully" Mar 17 18:21:01.800493 env[1923]: time="2025-03-17T18:21:01.800452403Z" level=info msg="StopPodSandbox for \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\" returns successfully" Mar 17 18:21:01.807936 systemd[1]: run-netns-cni\x2dfad10b12\x2da94d\x2d0938\x2de58c\x2dfdc4f5d06832.mount: Deactivated successfully. Mar 17 18:21:01.826996 env[1923]: time="2025-03-17T18:21:01.826889298Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56f9f75749-9hqth,Uid:bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07,Namespace:calico-apiserver,Attempt:1,}" Mar 17 18:21:01.832309 sshd[4491]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:01.833000 audit[4491]: USER_END pid=4491 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:01.834000 audit[4491]: CRED_DISP pid=4491 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:01.837000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-172.31.21.220:22-139.178.89.65:44276 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:01.838296 systemd[1]: sshd@7-172.31.21.220:22-139.178.89.65:44276.service: Deactivated successfully. Mar 17 18:21:01.839840 systemd[1]: session-8.scope: Deactivated successfully. Mar 17 18:21:01.843068 systemd-logind[1906]: Session 8 logged out. Waiting for processes to exit. Mar 17 18:21:01.847242 systemd-logind[1906]: Removed session 8. Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.634 [INFO][4531] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.635 [INFO][4531] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" iface="eth0" netns="/var/run/netns/cni-774463e7-8497-7064-9ba7-a7dd654b9681" Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.636 [INFO][4531] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" iface="eth0" netns="/var/run/netns/cni-774463e7-8497-7064-9ba7-a7dd654b9681" Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.636 [INFO][4531] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" iface="eth0" netns="/var/run/netns/cni-774463e7-8497-7064-9ba7-a7dd654b9681" Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.637 [INFO][4531] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.637 [INFO][4531] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.769 [INFO][4575] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" HandleID="k8s-pod-network.c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.770 [INFO][4575] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.770 [INFO][4575] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.814 [WARNING][4575] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" HandleID="k8s-pod-network.c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.815 [INFO][4575] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" HandleID="k8s-pod-network.c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.823 [INFO][4575] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:01.848428 env[1923]: 2025-03-17 18:21:01.828 [INFO][4531] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:01.860490 env[1923]: time="2025-03-17T18:21:01.860324271Z" level=info msg="TearDown network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\" successfully" Mar 17 18:21:01.860708 env[1923]: time="2025-03-17T18:21:01.860671846Z" level=info msg="StopPodSandbox for \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\" returns successfully" Mar 17 18:21:01.862601 systemd[1]: run-netns-cni\x2d774463e7\x2d8497\x2d7064\x2d9ba7\x2da7dd654b9681.mount: Deactivated successfully. 
Mar 17 18:21:01.875395 env[1923]: time="2025-03-17T18:21:01.870267917Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-krrrk,Uid:cd45a415-9af8-4b7c-ac94-5ad5f9e3b710,Namespace:kube-system,Attempt:1,}" Mar 17 18:21:02.169861 systemd-networkd[1583]: cali4322127d881: Link UP Mar 17 18:21:02.172448 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 18:21:02.172585 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali4322127d881: link becomes ready Mar 17 18:21:02.173192 (udev-worker)[4489]: Network interface NamePolicy= disabled on kernel command line. Mar 17 18:21:02.174292 systemd-networkd[1583]: cali4322127d881: Gained carrier Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.013 [INFO][4596] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0 coredns-7db6d8ff4d- kube-system cd45a415-9af8-4b7c-ac94-5ad5f9e3b710 839 0 2025-03-17 18:20:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-21-220 coredns-7db6d8ff4d-krrrk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4322127d881 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krrrk" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.013 [INFO][4596] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krrrk" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.081 [INFO][4620] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" HandleID="k8s-pod-network.fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.106 [INFO][4620] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" HandleID="k8s-pod-network.fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002eb1b0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-21-220", "pod":"coredns-7db6d8ff4d-krrrk", "timestamp":"2025-03-17 18:21:02.081184293 +0000 UTC"}, Hostname:"ip-172-31-21-220", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.106 [INFO][4620] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.106 [INFO][4620] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.106 [INFO][4620] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-21-220' Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.109 [INFO][4620] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" host="ip-172-31-21-220" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.116 [INFO][4620] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-21-220" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.124 [INFO][4620] ipam/ipam.go 489: Trying affinity for 192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.127 [INFO][4620] ipam/ipam.go 155: Attempting to load block cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.131 [INFO][4620] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.131 [INFO][4620] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.94.128/26 handle="k8s-pod-network.fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" host="ip-172-31-21-220" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.133 [INFO][4620] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4 Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.141 [INFO][4620] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.94.128/26 handle="k8s-pod-network.fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" host="ip-172-31-21-220" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.154 [INFO][4620] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.94.129/26] block=192.168.94.128/26 handle="k8s-pod-network.fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" host="ip-172-31-21-220" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.154 [INFO][4620] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.94.129/26] handle="k8s-pod-network.fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" host="ip-172-31-21-220" Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.155 [INFO][4620] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 18:21:02.218564 env[1923]: 2025-03-17 18:21:02.155 [INFO][4620] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.94.129/26] IPv6=[] ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" HandleID="k8s-pod-network.fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:02.219872 env[1923]: 2025-03-17 18:21:02.160 [INFO][4596] cni-plugin/k8s.go 386: Populated endpoint ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krrrk" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"cd45a415-9af8-4b7c-ac94-5ad5f9e3b710", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"", Pod:"coredns-7db6d8ff4d-krrrk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4322127d881", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:02.219872 env[1923]: 2025-03-17 18:21:02.160 [INFO][4596] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.94.129/32] ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krrrk" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:02.219872 env[1923]: 2025-03-17 18:21:02.161 [INFO][4596] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4322127d881 ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krrrk" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:02.219872 env[1923]: 2025-03-17 18:21:02.178 [INFO][4596] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krrrk" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:02.219872 
env[1923]: 2025-03-17 18:21:02.179 [INFO][4596] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krrrk" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"cd45a415-9af8-4b7c-ac94-5ad5f9e3b710", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4", Pod:"coredns-7db6d8ff4d-krrrk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4322127d881", MAC:"8e:cd:7a:7e:5d:60", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:02.219872 env[1923]: 2025-03-17 18:21:02.211 [INFO][4596] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-krrrk" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:02.252000 audit[4641]: NETFILTER_CFG table=filter:101 family=2 entries=34 op=nft_register_chain pid=4641 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:02.252000 audit[4641]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19148 a0=3 a1=ffffc6946950 a2=0 a3=ffff95653fa8 items=0 ppid=4345 pid=4641 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:02.252000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:02.254712 systemd-networkd[1583]: cali9e9e4ce696b: Link UP Mar 17 18:21:02.257717 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali9e9e4ce696b: link becomes ready Mar 17 18:21:02.257169 systemd-networkd[1583]: cali9e9e4ce696b: Gained carrier Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:01.989 
[INFO][4590] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0 calico-apiserver-56f9f75749- calico-apiserver bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07 838 0 2025-03-17 18:20:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:56f9f75749 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-21-220 calico-apiserver-56f9f75749-9hqth eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali9e9e4ce696b [] []}} ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-9hqth" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:01.990 [INFO][4590] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-9hqth" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.084 [INFO][4615] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" HandleID="k8s-pod-network.5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.107 [INFO][4615] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" HandleID="k8s-pod-network.5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400029abb0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-21-220", "pod":"calico-apiserver-56f9f75749-9hqth", "timestamp":"2025-03-17 18:21:02.08432268 +0000 UTC"}, Hostname:"ip-172-31-21-220", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.107 [INFO][4615] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.154 [INFO][4615] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.154 [INFO][4615] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-21-220' Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.157 [INFO][4615] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" host="ip-172-31-21-220" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.172 [INFO][4615] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-21-220" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.198 [INFO][4615] ipam/ipam.go 489: Trying affinity for 192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.207 [INFO][4615] ipam/ipam.go 155: Attempting to load block cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.211 [INFO][4615] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.211 [INFO][4615] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.94.128/26 handle="k8s-pod-network.5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" host="ip-172-31-21-220" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.214 [INFO][4615] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0 Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.223 [INFO][4615] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.94.128/26 handle="k8s-pod-network.5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" host="ip-172-31-21-220" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.233 [INFO][4615] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.94.130/26] block=192.168.94.128/26 handle="k8s-pod-network.5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" host="ip-172-31-21-220" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.234 [INFO][4615] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.94.130/26] handle="k8s-pod-network.5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" host="ip-172-31-21-220" Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.234 [INFO][4615] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
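Both sandboxes draw their addresses from the same affinity block on ip-172-31-21-220: coredns-7db6d8ff4d-krrrk is assigned 192.168.94.129 and calico-apiserver-56f9f75749-9hqth is assigned 192.168.94.130, both out of the 192.168.94.128/26 block the IPAM plugin loads in the records above. The hex port values in the coredns endpoint dump (0x35, 0x23c1) are simply the DNS and metrics ports in hexadecimal. A small sanity check with the Python standard library, purely illustrative:

```python
import ipaddress

block = ipaddress.ip_network("192.168.94.128/26")         # affinity block for ip-172-31-21-220
claimed = [ipaddress.ip_address("192.168.94.129"),        # coredns-7db6d8ff4d-krrrk
           ipaddress.ip_address("192.168.94.130")]        # calico-apiserver-56f9f75749-9hqth

assert all(ip in block for ip in claimed)
print(block.num_addresses)   # 64 addresses in a /26 block
print(0x35, 0x23c1)          # 53 9153: the dns and metrics ports from the WorkloadEndpoint dump
```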
Mar 17 18:21:02.287266 env[1923]: 2025-03-17 18:21:02.234 [INFO][4615] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.94.130/26] IPv6=[] ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" HandleID="k8s-pod-network.5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:02.288589 env[1923]: 2025-03-17 18:21:02.238 [INFO][4590] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-9hqth" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0", GenerateName:"calico-apiserver-56f9f75749-", Namespace:"calico-apiserver", SelfLink:"", UID:"bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56f9f75749", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"", Pod:"calico-apiserver-56f9f75749-9hqth", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9e9e4ce696b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:02.288589 env[1923]: 2025-03-17 18:21:02.238 [INFO][4590] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.94.130/32] ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-9hqth" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:02.288589 env[1923]: 2025-03-17 18:21:02.238 [INFO][4590] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9e9e4ce696b ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-9hqth" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:02.288589 env[1923]: 2025-03-17 18:21:02.257 [INFO][4590] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-9hqth" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:02.288589 env[1923]: 2025-03-17 18:21:02.258 [INFO][4590] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-9hqth" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0", GenerateName:"calico-apiserver-56f9f75749-", Namespace:"calico-apiserver", SelfLink:"", UID:"bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07", ResourceVersion:"838", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56f9f75749", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0", Pod:"calico-apiserver-56f9f75749-9hqth", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9e9e4ce696b", MAC:"de:65:e6:ea:ba:2b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:02.288589 env[1923]: 2025-03-17 18:21:02.283 [INFO][4590] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-9hqth" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:02.305862 env[1923]: time="2025-03-17T18:21:02.305746391Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:21:02.306053 env[1923]: time="2025-03-17T18:21:02.305892638Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:21:02.306053 env[1923]: time="2025-03-17T18:21:02.305957236Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:21:02.306442 env[1923]: time="2025-03-17T18:21:02.306365088Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4 pid=4659 runtime=io.containerd.runc.v2 Mar 17 18:21:02.331597 env[1923]: time="2025-03-17T18:21:02.331496028Z" level=info msg="StopPodSandbox for \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\"" Mar 17 18:21:02.333000 audit[4681]: NETFILTER_CFG table=filter:102 family=2 entries=44 op=nft_register_chain pid=4681 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:02.338175 env[1923]: time="2025-03-17T18:21:02.338103684Z" level=info msg="StopPodSandbox for \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\"" Mar 17 18:21:02.338789 env[1923]: time="2025-03-17T18:21:02.338731717Z" level=info msg="StopPodSandbox for \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\"" Mar 17 18:21:02.333000 audit[4681]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=24680 a0=3 a1=ffffe715af70 a2=0 a3=ffff9c2b3fa8 items=0 ppid=4345 pid=4681 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:02.333000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:02.385969 env[1923]: time="2025-03-17T18:21:02.375386080Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:21:02.385969 env[1923]: time="2025-03-17T18:21:02.375664341Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:21:02.385969 env[1923]: time="2025-03-17T18:21:02.375727534Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:21:02.385969 env[1923]: time="2025-03-17T18:21:02.376640429Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0 pid=4689 runtime=io.containerd.runc.v2 Mar 17 18:21:02.543701 env[1923]: time="2025-03-17T18:21:02.543638201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-krrrk,Uid:cd45a415-9af8-4b7c-ac94-5ad5f9e3b710,Namespace:kube-system,Attempt:1,} returns sandbox id \"fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4\"" Mar 17 18:21:02.551384 env[1923]: time="2025-03-17T18:21:02.551299718Z" level=info msg="CreateContainer within sandbox \"fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 18:21:02.582167 env[1923]: time="2025-03-17T18:21:02.581770237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56f9f75749-9hqth,Uid:bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0\"" Mar 17 18:21:02.585986 env[1923]: time="2025-03-17T18:21:02.585912732Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Mar 17 18:21:02.611071 env[1923]: time="2025-03-17T18:21:02.610965827Z" level=info msg="CreateContainer within sandbox \"fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"903bdbd54d16530cc8fa279285360680c5b8248371e430d771d3966565ad1560\"" Mar 17 18:21:02.614275 env[1923]: time="2025-03-17T18:21:02.614212216Z" level=info msg="StartContainer for \"903bdbd54d16530cc8fa279285360680c5b8248371e430d771d3966565ad1560\"" Mar 17 18:21:02.705062 systemd-networkd[1583]: vxlan.calico: Gained IPv6LL Mar 17 18:21:03.003941 env[1923]: time="2025-03-17T18:21:03.003871558Z" level=info msg="StartContainer for \"903bdbd54d16530cc8fa279285360680c5b8248371e430d771d3966565ad1560\" returns successfully" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:02.682 [INFO][4776] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:02.683 [INFO][4776] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" iface="eth0" netns="/var/run/netns/cni-d9c3c8ad-197a-664c-df14-cb97d27d22a0" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:02.684 [INFO][4776] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" iface="eth0" netns="/var/run/netns/cni-d9c3c8ad-197a-664c-df14-cb97d27d22a0" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:02.695 [INFO][4776] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" iface="eth0" netns="/var/run/netns/cni-d9c3c8ad-197a-664c-df14-cb97d27d22a0" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:02.695 [INFO][4776] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:02.695 [INFO][4776] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:02.988 [INFO][4810] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" HandleID="k8s-pod-network.bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:02.989 [INFO][4810] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:02.989 [INFO][4810] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:03.026 [WARNING][4810] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" HandleID="k8s-pod-network.bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:03.026 [INFO][4810] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" HandleID="k8s-pod-network.bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:03.038 [INFO][4810] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:03.065035 env[1923]: 2025-03-17 18:21:03.049 [INFO][4776] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:03.077643 systemd[1]: run-netns-cni\x2dd9c3c8ad\x2d197a\x2d664c\x2ddf14\x2dcb97d27d22a0.mount: Deactivated successfully. Mar 17 18:21:03.091738 env[1923]: time="2025-03-17T18:21:03.091657761Z" level=info msg="TearDown network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\" successfully" Mar 17 18:21:03.092126 env[1923]: time="2025-03-17T18:21:03.092036309Z" level=info msg="StopPodSandbox for \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\" returns successfully" Mar 17 18:21:03.094835 env[1923]: time="2025-03-17T18:21:03.094774774Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56f9f75749-5vmjt,Uid:3ad78572-b2b8-4d59-a3a2-ea0333361bca,Namespace:calico-apiserver,Attempt:1,}" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:02.724 [INFO][4774] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:02.725 [INFO][4774] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" iface="eth0" netns="/var/run/netns/cni-53dd71ea-2a5c-fa46-413c-e551609ec2e8" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:02.725 [INFO][4774] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" iface="eth0" netns="/var/run/netns/cni-53dd71ea-2a5c-fa46-413c-e551609ec2e8" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:02.725 [INFO][4774] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" iface="eth0" netns="/var/run/netns/cni-53dd71ea-2a5c-fa46-413c-e551609ec2e8" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:02.726 [INFO][4774] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:02.726 [INFO][4774] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:03.038 [INFO][4812] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" HandleID="k8s-pod-network.8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:03.041 [INFO][4812] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:03.042 [INFO][4812] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:03.059 [WARNING][4812] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" HandleID="k8s-pod-network.8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:03.059 [INFO][4812] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" HandleID="k8s-pod-network.8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:03.071 [INFO][4812] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:03.095670 env[1923]: 2025-03-17 18:21:03.088 [INFO][4774] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:03.108067 systemd[1]: run-netns-cni\x2d53dd71ea\x2d2a5c\x2dfa46\x2d413c\x2de551609ec2e8.mount: Deactivated successfully. 
Mar 17 18:21:03.112819 env[1923]: time="2025-03-17T18:21:03.112745579Z" level=info msg="TearDown network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\" successfully" Mar 17 18:21:03.114208 env[1923]: time="2025-03-17T18:21:03.114090949Z" level=info msg="StopPodSandbox for \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\" returns successfully" Mar 17 18:21:03.116224 env[1923]: time="2025-03-17T18:21:03.116157749Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pv4xb,Uid:7af73a1a-8033-4ba4-ba19-078aeb2052b7,Namespace:calico-system,Attempt:1,}" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:02.881 [INFO][4777] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:02.881 [INFO][4777] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" iface="eth0" netns="/var/run/netns/cni-f6c0118b-de91-cbee-2360-aab69f2e16a4" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:02.882 [INFO][4777] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" iface="eth0" netns="/var/run/netns/cni-f6c0118b-de91-cbee-2360-aab69f2e16a4" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:02.883 [INFO][4777] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" iface="eth0" netns="/var/run/netns/cni-f6c0118b-de91-cbee-2360-aab69f2e16a4" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:02.883 [INFO][4777] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:02.883 [INFO][4777] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:03.127 [INFO][4836] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" HandleID="k8s-pod-network.7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:03.127 [INFO][4836] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:03.127 [INFO][4836] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:03.157 [WARNING][4836] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" HandleID="k8s-pod-network.7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:03.157 [INFO][4836] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" HandleID="k8s-pod-network.7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:03.172 [INFO][4836] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:03.191538 env[1923]: 2025-03-17 18:21:03.187 [INFO][4777] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:03.192883 env[1923]: time="2025-03-17T18:21:03.192827392Z" level=info msg="TearDown network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\" successfully" Mar 17 18:21:03.193060 env[1923]: time="2025-03-17T18:21:03.193022084Z" level=info msg="StopPodSandbox for \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\" returns successfully" Mar 17 18:21:03.194170 env[1923]: time="2025-03-17T18:21:03.194116685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8bbf2,Uid:3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1,Namespace:kube-system,Attempt:1,}" Mar 17 18:21:03.529603 systemd-networkd[1583]: cali786425ac4e6: Link UP Mar 17 18:21:03.536380 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 18:21:03.536513 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali786425ac4e6: link becomes ready Mar 17 18:21:03.539911 systemd-networkd[1583]: cali786425ac4e6: Gained carrier Mar 17 18:21:03.602090 systemd-networkd[1583]: cali4322127d881: Gained IPv6LL Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.341 [INFO][4859] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0 calico-apiserver-56f9f75749- calico-apiserver 3ad78572-b2b8-4d59-a3a2-ea0333361bca 853 0 2025-03-17 18:20:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:56f9f75749 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ip-172-31-21-220 calico-apiserver-56f9f75749-5vmjt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali786425ac4e6 [] []}} ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-5vmjt" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.341 [INFO][4859] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-5vmjt" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.414 [INFO][4900] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" HandleID="k8s-pod-network.da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.439 [INFO][4900] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" HandleID="k8s-pod-network.da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003047f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ip-172-31-21-220", "pod":"calico-apiserver-56f9f75749-5vmjt", "timestamp":"2025-03-17 18:21:03.414726061 +0000 UTC"}, Hostname:"ip-172-31-21-220", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.439 [INFO][4900] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.440 [INFO][4900] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.441 [INFO][4900] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-21-220' Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.445 [INFO][4900] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" host="ip-172-31-21-220" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.454 [INFO][4900] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-21-220" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.465 [INFO][4900] ipam/ipam.go 489: Trying affinity for 192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.468 [INFO][4900] ipam/ipam.go 155: Attempting to load block cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.474 [INFO][4900] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.474 [INFO][4900] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.94.128/26 handle="k8s-pod-network.da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" host="ip-172-31-21-220" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.477 [INFO][4900] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337 Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.488 [INFO][4900] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.94.128/26 handle="k8s-pod-network.da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" host="ip-172-31-21-220" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.512 [INFO][4900] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.94.131/26] block=192.168.94.128/26 handle="k8s-pod-network.da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" host="ip-172-31-21-220" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.512 [INFO][4900] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.94.131/26] 
handle="k8s-pod-network.da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" host="ip-172-31-21-220" Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.512 [INFO][4900] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:03.613687 env[1923]: 2025-03-17 18:21:03.512 [INFO][4900] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.94.131/26] IPv6=[] ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" HandleID="k8s-pod-network.da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.615590 env[1923]: 2025-03-17 18:21:03.522 [INFO][4859] cni-plugin/k8s.go 386: Populated endpoint ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-5vmjt" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0", GenerateName:"calico-apiserver-56f9f75749-", Namespace:"calico-apiserver", SelfLink:"", UID:"3ad78572-b2b8-4d59-a3a2-ea0333361bca", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56f9f75749", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"", Pod:"calico-apiserver-56f9f75749-5vmjt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali786425ac4e6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:03.615590 env[1923]: 2025-03-17 18:21:03.522 [INFO][4859] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.94.131/32] ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-5vmjt" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.615590 env[1923]: 2025-03-17 18:21:03.523 [INFO][4859] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali786425ac4e6 ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-5vmjt" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.615590 env[1923]: 2025-03-17 18:21:03.542 [INFO][4859] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-5vmjt" 
WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.615590 env[1923]: 2025-03-17 18:21:03.552 [INFO][4859] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-5vmjt" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0", GenerateName:"calico-apiserver-56f9f75749-", Namespace:"calico-apiserver", SelfLink:"", UID:"3ad78572-b2b8-4d59-a3a2-ea0333361bca", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56f9f75749", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337", Pod:"calico-apiserver-56f9f75749-5vmjt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali786425ac4e6", MAC:"4a:21:0d:bf:9e:e7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:03.615590 env[1923]: 2025-03-17 18:21:03.577 [INFO][4859] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337" Namespace="calico-apiserver" Pod="calico-apiserver-56f9f75749-5vmjt" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:03.633000 audit[4916]: NETFILTER_CFG table=filter:103 family=2 entries=38 op=nft_register_chain pid=4916 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:03.633000 audit[4916]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21516 a0=3 a1=fffff4ec72b0 a2=0 a3=ffff814a3fa8 items=0 ppid=4345 pid=4916 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:03.633000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:03.666016 systemd-networkd[1583]: cali9e9e4ce696b: Gained IPv6LL Mar 17 18:21:03.686585 kubelet[3162]: I0317 18:21:03.685622 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-krrrk" podStartSLOduration=38.685578516 podStartE2EDuration="38.685578516s" podCreationTimestamp="2025-03-17 18:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:21:03.65331865 +0000 UTC m=+51.689186042" watchObservedRunningTime="2025-03-17 18:21:03.685578516 +0000 UTC m=+51.721445908" Mar 17 18:21:03.693574 systemd-networkd[1583]: cali32ac00b841f: Link UP Mar 17 18:21:03.706710 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali32ac00b841f: link becomes ready Mar 17 18:21:03.707078 systemd-networkd[1583]: cali32ac00b841f: Gained carrier Mar 17 18:21:03.764000 audit[4930]: NETFILTER_CFG table=filter:104 family=2 entries=16 op=nft_register_rule pid=4930 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:03.764000 audit[4930]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5908 a0=3 a1=ffffce9e4c10 a2=0 a3=1 items=0 ppid=3332 pid=4930 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:03.764000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:03.773556 env[1923]: time="2025-03-17T18:21:03.769521201Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:21:03.773556 env[1923]: time="2025-03-17T18:21:03.769592806Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:21:03.773556 env[1923]: time="2025-03-17T18:21:03.769618451Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:21:03.773556 env[1923]: time="2025-03-17T18:21:03.769861527Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337 pid=4939 runtime=io.containerd.runc.v2 Mar 17 18:21:03.771000 audit[4930]: NETFILTER_CFG table=nat:105 family=2 entries=14 op=nft_register_rule pid=4930 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:03.771000 audit[4930]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffce9e4c10 a2=0 a3=1 items=0 ppid=3332 pid=4930 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:03.771000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.313 [INFO][4861] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0 csi-node-driver- calico-system 7af73a1a-8033-4ba4-ba19-078aeb2052b7 855 0 2025-03-17 18:20:35 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ip-172-31-21-220 csi-node-driver-pv4xb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali32ac00b841f [] []}} 
ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Namespace="calico-system" Pod="csi-node-driver-pv4xb" WorkloadEndpoint="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.313 [INFO][4861] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Namespace="calico-system" Pod="csi-node-driver-pv4xb" WorkloadEndpoint="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.458 [INFO][4895] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" HandleID="k8s-pod-network.5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.516 [INFO][4895] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" HandleID="k8s-pod-network.5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002a1510), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-21-220", "pod":"csi-node-driver-pv4xb", "timestamp":"2025-03-17 18:21:03.458480074 +0000 UTC"}, Hostname:"ip-172-31-21-220", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.516 [INFO][4895] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.516 [INFO][4895] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.516 [INFO][4895] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-21-220' Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.521 [INFO][4895] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" host="ip-172-31-21-220" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.550 [INFO][4895] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-21-220" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.594 [INFO][4895] ipam/ipam.go 489: Trying affinity for 192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.604 [INFO][4895] ipam/ipam.go 155: Attempting to load block cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.611 [INFO][4895] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.611 [INFO][4895] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.94.128/26 handle="k8s-pod-network.5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" host="ip-172-31-21-220" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.618 [INFO][4895] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.632 [INFO][4895] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.94.128/26 handle="k8s-pod-network.5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" host="ip-172-31-21-220" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.649 [INFO][4895] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.94.132/26] block=192.168.94.128/26 handle="k8s-pod-network.5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" host="ip-172-31-21-220" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.649 [INFO][4895] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.94.132/26] handle="k8s-pod-network.5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" host="ip-172-31-21-220" Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.650 [INFO][4895] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 18:21:03.783359 env[1923]: 2025-03-17 18:21:03.650 [INFO][4895] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.94.132/26] IPv6=[] ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" HandleID="k8s-pod-network.5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.784721 env[1923]: 2025-03-17 18:21:03.663 [INFO][4861] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Namespace="calico-system" Pod="csi-node-driver-pv4xb" WorkloadEndpoint="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7af73a1a-8033-4ba4-ba19-078aeb2052b7", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"", Pod:"csi-node-driver-pv4xb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.94.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali32ac00b841f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:03.784721 env[1923]: 2025-03-17 18:21:03.664 [INFO][4861] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.94.132/32] ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Namespace="calico-system" Pod="csi-node-driver-pv4xb" WorkloadEndpoint="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.784721 env[1923]: 2025-03-17 18:21:03.664 [INFO][4861] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali32ac00b841f ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Namespace="calico-system" Pod="csi-node-driver-pv4xb" WorkloadEndpoint="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.784721 env[1923]: 2025-03-17 18:21:03.711 [INFO][4861] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Namespace="calico-system" Pod="csi-node-driver-pv4xb" WorkloadEndpoint="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.784721 env[1923]: 2025-03-17 18:21:03.715 [INFO][4861] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Namespace="calico-system" Pod="csi-node-driver-pv4xb" 
WorkloadEndpoint="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7af73a1a-8033-4ba4-ba19-078aeb2052b7", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb", Pod:"csi-node-driver-pv4xb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.94.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali32ac00b841f", MAC:"5a:de:7b:bf:64:02", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:03.784721 env[1923]: 2025-03-17 18:21:03.767 [INFO][4861] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb" Namespace="calico-system" Pod="csi-node-driver-pv4xb" WorkloadEndpoint="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:03.831134 systemd[1]: run-netns-cni\x2df6c0118b\x2dde91\x2dcbee\x2d2360\x2daab69f2e16a4.mount: Deactivated successfully. 
Mar 17 18:21:03.854000 audit[4968]: NETFILTER_CFG table=filter:106 family=2 entries=13 op=nft_register_rule pid=4968 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:03.854000 audit[4968]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3676 a0=3 a1=fffff8f08460 a2=0 a3=1 items=0 ppid=3332 pid=4968 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:03.854000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:03.861000 audit[4968]: NETFILTER_CFG table=nat:107 family=2 entries=35 op=nft_register_chain pid=4968 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:03.861000 audit[4968]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=fffff8f08460 a2=0 a3=1 items=0 ppid=3332 pid=4968 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:03.861000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:03.903000 audit[4977]: NETFILTER_CFG table=filter:108 family=2 entries=46 op=nft_register_chain pid=4977 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:03.903000 audit[4977]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=22712 a0=3 a1=ffffc7471830 a2=0 a3=ffff9dfc0fa8 items=0 ppid=4345 pid=4977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:03.903000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:03.906912 systemd-networkd[1583]: cali260d29a9deb: Link UP Mar 17 18:21:03.916073 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali260d29a9deb: link becomes ready Mar 17 18:21:03.914771 systemd-networkd[1583]: cali260d29a9deb: Gained carrier Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.343 [INFO][4880] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0 coredns-7db6d8ff4d- kube-system 3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1 860 0 2025-03-17 18:20:25 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ip-172-31-21-220 coredns-7db6d8ff4d-8bbf2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali260d29a9deb [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8bbf2" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.344 [INFO][4880] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8bbf2" 
WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.517 [INFO][4901] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" HandleID="k8s-pod-network.b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.619 [INFO][4901] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" HandleID="k8s-pod-network.b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400011cdf0), Attrs:map[string]string{"namespace":"kube-system", "node":"ip-172-31-21-220", "pod":"coredns-7db6d8ff4d-8bbf2", "timestamp":"2025-03-17 18:21:03.517859782 +0000 UTC"}, Hostname:"ip-172-31-21-220", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.619 [INFO][4901] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.650 [INFO][4901] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.650 [INFO][4901] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-21-220' Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.669 [INFO][4901] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" host="ip-172-31-21-220" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.722 [INFO][4901] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-21-220" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.779 [INFO][4901] ipam/ipam.go 489: Trying affinity for 192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.787 [INFO][4901] ipam/ipam.go 155: Attempting to load block cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.793 [INFO][4901] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.793 [INFO][4901] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.94.128/26 handle="k8s-pod-network.b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" host="ip-172-31-21-220" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.796 [INFO][4901] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0 Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.815 [INFO][4901] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.94.128/26 handle="k8s-pod-network.b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" host="ip-172-31-21-220" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.869 [INFO][4901] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.94.133/26] block=192.168.94.128/26 
handle="k8s-pod-network.b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" host="ip-172-31-21-220" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.869 [INFO][4901] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.94.133/26] handle="k8s-pod-network.b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" host="ip-172-31-21-220" Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.871 [INFO][4901] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:03.961982 env[1923]: 2025-03-17 18:21:03.871 [INFO][4901] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.94.133/26] IPv6=[] ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" HandleID="k8s-pod-network.b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.963320 env[1923]: 2025-03-17 18:21:03.884 [INFO][4880] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8bbf2" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"", Pod:"coredns-7db6d8ff4d-8bbf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali260d29a9deb", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:03.963320 env[1923]: 2025-03-17 18:21:03.885 [INFO][4880] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.94.133/32] ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8bbf2" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.963320 env[1923]: 2025-03-17 18:21:03.885 [INFO][4880] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali260d29a9deb ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" 
Namespace="kube-system" Pod="coredns-7db6d8ff4d-8bbf2" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.963320 env[1923]: 2025-03-17 18:21:03.930 [INFO][4880] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8bbf2" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.963320 env[1923]: 2025-03-17 18:21:03.936 [INFO][4880] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8bbf2" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1", ResourceVersion:"860", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0", Pod:"coredns-7db6d8ff4d-8bbf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali260d29a9deb", MAC:"76:78:54:5b:56:43", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:03.963320 env[1923]: 2025-03-17 18:21:03.956 [INFO][4880] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0" Namespace="kube-system" Pod="coredns-7db6d8ff4d-8bbf2" WorkloadEndpoint="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:03.970475 env[1923]: time="2025-03-17T18:21:03.970400658Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-56f9f75749-5vmjt,Uid:3ad78572-b2b8-4d59-a3a2-ea0333361bca,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337\"" Mar 17 18:21:04.014163 env[1923]: time="2025-03-17T18:21:04.013978919Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:21:04.014405 env[1923]: time="2025-03-17T18:21:04.014188935Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:21:04.014481 env[1923]: time="2025-03-17T18:21:04.014320230Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:21:04.014000 audit[5007]: NETFILTER_CFG table=filter:109 family=2 entries=42 op=nft_register_chain pid=5007 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:04.014000 audit[5007]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=20596 a0=3 a1=fffff1beda40 a2=0 a3=ffff9b20ffa8 items=0 ppid=4345 pid=5007 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:04.014000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:04.047582 env[1923]: time="2025-03-17T18:21:04.019528988Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb pid=5008 runtime=io.containerd.runc.v2 Mar 17 18:21:04.052048 env[1923]: time="2025-03-17T18:21:04.049462434Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:21:04.052048 env[1923]: time="2025-03-17T18:21:04.049547215Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:21:04.052048 env[1923]: time="2025-03-17T18:21:04.049572800Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:21:04.052048 env[1923]: time="2025-03-17T18:21:04.049884385Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0 pid=5028 runtime=io.containerd.runc.v2 Mar 17 18:21:04.293573 env[1923]: time="2025-03-17T18:21:04.293515107Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-8bbf2,Uid:3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1,Namespace:kube-system,Attempt:1,} returns sandbox id \"b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0\"" Mar 17 18:21:04.296512 env[1923]: time="2025-03-17T18:21:04.296445970Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-pv4xb,Uid:7af73a1a-8033-4ba4-ba19-078aeb2052b7,Namespace:calico-system,Attempt:1,} returns sandbox id \"5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb\"" Mar 17 18:21:04.305988 env[1923]: time="2025-03-17T18:21:04.305844331Z" level=info msg="CreateContainer within sandbox \"b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 18:21:04.322251 env[1923]: time="2025-03-17T18:21:04.322192850Z" level=info msg="StopPodSandbox for \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\"" Mar 17 18:21:04.357829 env[1923]: time="2025-03-17T18:21:04.357763017Z" level=info msg="CreateContainer within sandbox \"b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fc8e7829705f9f208bc3a7442df74ee26d380ce9ada1a3f18a0476d66f050fef\"" Mar 17 18:21:04.359967 env[1923]: time="2025-03-17T18:21:04.359757755Z" level=info msg="StartContainer for \"fc8e7829705f9f208bc3a7442df74ee26d380ce9ada1a3f18a0476d66f050fef\"" Mar 17 18:21:04.549382 env[1923]: time="2025-03-17T18:21:04.545104354Z" level=info msg="StartContainer for \"fc8e7829705f9f208bc3a7442df74ee26d380ce9ada1a3f18a0476d66f050fef\" returns successfully" Mar 17 18:21:04.716916 kubelet[3162]: I0317 18:21:04.716549 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-8bbf2" podStartSLOduration=39.716502984 podStartE2EDuration="39.716502984s" podCreationTimestamp="2025-03-17 18:20:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:21:04.683779426 +0000 UTC m=+52.719646806" watchObservedRunningTime="2025-03-17 18:21:04.716502984 +0000 UTC m=+52.752370376" Mar 17 18:21:04.736000 audit[5149]: NETFILTER_CFG table=filter:110 family=2 entries=10 op=nft_register_rule pid=5149 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:04.736000 audit[5149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3676 a0=3 a1=ffffff1c5b60 a2=0 a3=1 items=0 ppid=3332 pid=5149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:04.736000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.526 [INFO][5101] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 
18:21:04.748488 env[1923]: 2025-03-17 18:21:04.526 [INFO][5101] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" iface="eth0" netns="/var/run/netns/cni-f8b2e3a6-c5dc-cb7e-ebb2-ffccb121a5c5" Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.526 [INFO][5101] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" iface="eth0" netns="/var/run/netns/cni-f8b2e3a6-c5dc-cb7e-ebb2-ffccb121a5c5" Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.527 [INFO][5101] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" iface="eth0" netns="/var/run/netns/cni-f8b2e3a6-c5dc-cb7e-ebb2-ffccb121a5c5" Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.527 [INFO][5101] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.527 [INFO][5101] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.678 [INFO][5136] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" HandleID="k8s-pod-network.8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.678 [INFO][5136] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.678 [INFO][5136] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.720 [WARNING][5136] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" HandleID="k8s-pod-network.8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.720 [INFO][5136] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" HandleID="k8s-pod-network.8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.741 [INFO][5136] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:04.748488 env[1923]: 2025-03-17 18:21:04.744 [INFO][5101] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:04.750258 env[1923]: time="2025-03-17T18:21:04.750202473Z" level=info msg="TearDown network for sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\" successfully" Mar 17 18:21:04.750462 env[1923]: time="2025-03-17T18:21:04.750427369Z" level=info msg="StopPodSandbox for \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\" returns successfully" Mar 17 18:21:04.753587 env[1923]: time="2025-03-17T18:21:04.753521087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f879dc54f-5nvvh,Uid:049cf128-4b21-4a4b-8889-b2f735eb419e,Namespace:calico-system,Attempt:1,}" Mar 17 18:21:04.755000 audit[5149]: NETFILTER_CFG table=nat:111 family=2 entries=44 op=nft_register_rule pid=5149 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:04.755000 audit[5149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffff1c5b60 a2=0 a3=1 items=0 ppid=3332 pid=5149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:04.755000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:04.817833 systemd[1]: run-containerd-runc-k8s.io-5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb-runc.BxYZTF.mount: Deactivated successfully. Mar 17 18:21:04.818581 systemd[1]: run-netns-cni\x2df8b2e3a6\x2dc5dc\x2dcb7e\x2debb2\x2dffccb121a5c5.mount: Deactivated successfully. Mar 17 18:21:04.881197 systemd-networkd[1583]: cali786425ac4e6: Gained IPv6LL Mar 17 18:21:04.949675 systemd-networkd[1583]: cali32ac00b841f: Gained IPv6LL Mar 17 18:21:05.049717 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 18:21:05.049893 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): calidc25a4043cd: link becomes ready Mar 17 18:21:05.054223 systemd-networkd[1583]: calidc25a4043cd: Link UP Mar 17 18:21:05.056610 systemd-networkd[1583]: calidc25a4043cd: Gained carrier Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.871 [INFO][5150] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0 calico-kube-controllers-6f879dc54f- calico-system 049cf128-4b21-4a4b-8889-b2f735eb419e 890 0 2025-03-17 18:20:35 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6f879dc54f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ip-172-31-21-220 calico-kube-controllers-6f879dc54f-5nvvh eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calidc25a4043cd [] []}} ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Namespace="calico-system" Pod="calico-kube-controllers-6f879dc54f-5nvvh" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.871 [INFO][5150] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Namespace="calico-system" Pod="calico-kube-controllers-6f879dc54f-5nvvh" 
WorkloadEndpoint="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.964 [INFO][5162] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" HandleID="k8s-pod-network.cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.981 [INFO][5162] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" HandleID="k8s-pod-network.cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400028c3b0), Attrs:map[string]string{"namespace":"calico-system", "node":"ip-172-31-21-220", "pod":"calico-kube-controllers-6f879dc54f-5nvvh", "timestamp":"2025-03-17 18:21:04.964050531 +0000 UTC"}, Hostname:"ip-172-31-21-220", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.981 [INFO][5162] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.981 [INFO][5162] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.981 [INFO][5162] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ip-172-31-21-220' Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.984 [INFO][5162] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" host="ip-172-31-21-220" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.992 [INFO][5162] ipam/ipam.go 372: Looking up existing affinities for host host="ip-172-31-21-220" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:04.999 [INFO][5162] ipam/ipam.go 489: Trying affinity for 192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:05.002 [INFO][5162] ipam/ipam.go 155: Attempting to load block cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:05.007 [INFO][5162] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.94.128/26 host="ip-172-31-21-220" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:05.007 [INFO][5162] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.94.128/26 handle="k8s-pod-network.cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" host="ip-172-31-21-220" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:05.009 [INFO][5162] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4 Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:05.016 [INFO][5162] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.94.128/26 handle="k8s-pod-network.cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" host="ip-172-31-21-220" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:05.032 [INFO][5162] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.94.134/26] block=192.168.94.128/26 
handle="k8s-pod-network.cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" host="ip-172-31-21-220" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:05.032 [INFO][5162] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.94.134/26] handle="k8s-pod-network.cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" host="ip-172-31-21-220" Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:05.033 [INFO][5162] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:05.115882 env[1923]: 2025-03-17 18:21:05.033 [INFO][5162] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.94.134/26] IPv6=[] ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" HandleID="k8s-pod-network.cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:05.117425 env[1923]: 2025-03-17 18:21:05.037 [INFO][5150] cni-plugin/k8s.go 386: Populated endpoint ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Namespace="calico-system" Pod="calico-kube-controllers-6f879dc54f-5nvvh" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0", GenerateName:"calico-kube-controllers-6f879dc54f-", Namespace:"calico-system", SelfLink:"", UID:"049cf128-4b21-4a4b-8889-b2f735eb419e", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f879dc54f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"", Pod:"calico-kube-controllers-6f879dc54f-5nvvh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.94.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidc25a4043cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:05.117425 env[1923]: 2025-03-17 18:21:05.037 [INFO][5150] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.94.134/32] ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Namespace="calico-system" Pod="calico-kube-controllers-6f879dc54f-5nvvh" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:05.117425 env[1923]: 2025-03-17 18:21:05.037 [INFO][5150] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidc25a4043cd ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Namespace="calico-system" Pod="calico-kube-controllers-6f879dc54f-5nvvh" 
WorkloadEndpoint="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:05.117425 env[1923]: 2025-03-17 18:21:05.051 [INFO][5150] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Namespace="calico-system" Pod="calico-kube-controllers-6f879dc54f-5nvvh" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:05.117425 env[1923]: 2025-03-17 18:21:05.059 [INFO][5150] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Namespace="calico-system" Pod="calico-kube-controllers-6f879dc54f-5nvvh" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0", GenerateName:"calico-kube-controllers-6f879dc54f-", Namespace:"calico-system", SelfLink:"", UID:"049cf128-4b21-4a4b-8889-b2f735eb419e", ResourceVersion:"890", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f879dc54f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4", Pod:"calico-kube-controllers-6f879dc54f-5nvvh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.94.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidc25a4043cd", MAC:"12:95:9e:b7:aa:a6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:05.117425 env[1923]: 2025-03-17 18:21:05.101 [INFO][5150] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4" Namespace="calico-system" Pod="calico-kube-controllers-6f879dc54f-5nvvh" WorkloadEndpoint="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:05.204594 env[1923]: time="2025-03-17T18:21:05.204486333Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:21:05.204878 env[1923]: time="2025-03-17T18:21:05.204812031Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:21:05.205123 env[1923]: time="2025-03-17T18:21:05.205056440Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:21:05.245114 env[1923]: time="2025-03-17T18:21:05.214248331Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4 pid=5188 runtime=io.containerd.runc.v2 Mar 17 18:21:05.264000 audit[5201]: NETFILTER_CFG table=filter:112 family=2 entries=50 op=nft_register_chain pid=5201 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:05.271671 kernel: kauditd_printk_skb: 557 callbacks suppressed Mar 17 18:21:05.271869 kernel: audit: type=1325 audit(1742235665.264:418): table=filter:112 family=2 entries=50 op=nft_register_chain pid=5201 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:21:05.264000 audit[5201]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23392 a0=3 a1=ffffca73fb80 a2=0 a3=ffff801d8fa8 items=0 ppid=4345 pid=5201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:05.290496 kernel: audit: type=1300 audit(1742235665.264:418): arch=c00000b7 syscall=211 success=yes exit=23392 a0=3 a1=ffffca73fb80 a2=0 a3=ffff801d8fa8 items=0 ppid=4345 pid=5201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:05.299408 kernel: audit: type=1327 audit(1742235665.264:418): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:05.264000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:21:05.326062 systemd[1]: run-containerd-runc-k8s.io-cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4-runc.9rVvmN.mount: Deactivated successfully. 
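The IPAM lines above walk through Calico claiming 192.168.94.134 out of the host-affine block 192.168.94.128/26 for the calico-kube-controllers pod on ip-172-31-21-220. As a minimal sketch of just the block arithmetic those messages imply (this is not Calico's ipam package; the helper below is made up purely for illustration):

```go
// Illustrative sketch only: mimics the block/affinity arithmetic visible in the
// IPAM log lines (block 192.168.94.128/26, claimed address 192.168.94.134).
// This is not Calico's actual IPAM implementation.
package main

import (
	"fmt"
	"net/netip"
)

// addressesInBlock walks every address of a small IPv4 block such as a /26.
func addressesInBlock(block netip.Prefix) []netip.Addr {
	var out []netip.Addr
	for a := block.Masked().Addr(); block.Contains(a); a = a.Next() {
		out = append(out, a)
	}
	return out
}

func main() {
	block := netip.MustParsePrefix("192.168.94.128/26") // affine block from the log
	claimed := netip.MustParseAddr("192.168.94.134")    // address assigned to the pod

	fmt.Println("block contains claimed address:", block.Contains(claimed)) // true
	fmt.Println("addresses in a /26 block:", len(addressesInBlock(block)))  // 64
}
```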
Mar 17 18:21:05.659672 env[1923]: time="2025-03-17T18:21:05.655997055Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6f879dc54f-5nvvh,Uid:049cf128-4b21-4a4b-8889-b2f735eb419e,Namespace:calico-system,Attempt:1,} returns sandbox id \"cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4\"" Mar 17 18:21:05.811961 kernel: audit: type=1325 audit(1742235665.791:419): table=filter:113 family=2 entries=10 op=nft_register_rule pid=5226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:05.812121 kernel: audit: type=1300 audit(1742235665.791:419): arch=c00000b7 syscall=211 success=yes exit=3676 a0=3 a1=ffffe93257d0 a2=0 a3=1 items=0 ppid=3332 pid=5226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:05.791000 audit[5226]: NETFILTER_CFG table=filter:113 family=2 entries=10 op=nft_register_rule pid=5226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:05.791000 audit[5226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3676 a0=3 a1=ffffe93257d0 a2=0 a3=1 items=0 ppid=3332 pid=5226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:05.834654 kernel: audit: type=1327 audit(1742235665.791:419): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:05.791000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:05.854068 kernel: audit: type=1325 audit(1742235665.836:420): table=nat:114 family=2 entries=56 op=nft_register_chain pid=5226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:05.854217 kernel: audit: type=1300 audit(1742235665.836:420): arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffe93257d0 a2=0 a3=1 items=0 ppid=3332 pid=5226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:05.836000 audit[5226]: NETFILTER_CFG table=nat:114 family=2 entries=56 op=nft_register_chain pid=5226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:05.836000 audit[5226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffe93257d0 a2=0 a3=1 items=0 ppid=3332 pid=5226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:05.836000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:05.872253 kernel: audit: type=1327 audit(1742235665.836:420): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:05.904893 systemd-networkd[1583]: cali260d29a9deb: Gained IPv6LL Mar 17 18:21:06.717310 env[1923]: time="2025-03-17T18:21:06.716888337Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: 
managed,},XXX_unrecognized:[],}" Mar 17 18:21:06.721852 env[1923]: time="2025-03-17T18:21:06.721776023Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:06.742205 env[1923]: time="2025-03-17T18:21:06.742140021Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:06.761419 env[1923]: time="2025-03-17T18:21:06.759627880Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:06.761419 env[1923]: time="2025-03-17T18:21:06.760324864Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\"" Mar 17 18:21:06.768032 env[1923]: time="2025-03-17T18:21:06.767172401Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Mar 17 18:21:06.779296 env[1923]: time="2025-03-17T18:21:06.776478120Z" level=info msg="CreateContainer within sandbox \"5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 18:21:06.806709 systemd-networkd[1583]: calidc25a4043cd: Gained IPv6LL Mar 17 18:21:06.810471 env[1923]: time="2025-03-17T18:21:06.810408993Z" level=info msg="CreateContainer within sandbox \"5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"8ad6ebdf16492a8b444bcd7e5f9f8f43cabaf5b081263a50203c98a44d9761a4\"" Mar 17 18:21:06.812561 env[1923]: time="2025-03-17T18:21:06.812508850Z" level=info msg="StartContainer for \"8ad6ebdf16492a8b444bcd7e5f9f8f43cabaf5b081263a50203c98a44d9761a4\"" Mar 17 18:21:06.867800 systemd[1]: Started sshd@8-172.31.21.220:22-139.178.89.65:44290.service. Mar 17 18:21:06.866000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-172.31.21.220:22-139.178.89.65:44290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:06.888416 kernel: audit: type=1130 audit(1742235666.866:421): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-172.31.21.220:22-139.178.89.65:44290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:06.980012 systemd[1]: run-containerd-runc-k8s.io-8ad6ebdf16492a8b444bcd7e5f9f8f43cabaf5b081263a50203c98a44d9761a4-runc.nISRNC.mount: Deactivated successfully. 
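The PullImage / CreateContainer / StartContainer messages above come from containerd's CRI path as driven by the kubelet. For orientation only, the same pull, create and start steps look roughly like this with containerd's public Go client; the socket path and the k8s.io namespace are taken from the task paths in the log, and everything else is a hedged sketch rather than the code that produced these lines:

```go
// Rough sketch of a pull -> create -> start sequence against containerd,
// based on the public containerd Go client; it is NOT the CRI code path
// that produced the log above, just an illustration of the same steps.
package main

import (
	"context"
	"log"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/cio"
	"github.com/containerd/containerd/namespaces"
	"github.com/containerd/containerd/oci"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// The tasks in the log run in the "k8s.io" namespace.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	// Pull the image seen in the log and unpack it into a snapshot.
	image, err := client.Pull(ctx, "ghcr.io/flatcar/calico/apiserver:v3.29.1", containerd.WithPullUnpack)
	if err != nil {
		log.Fatal(err)
	}

	// Create a container and a task for it, then start the task.
	container, err := client.NewContainer(ctx, "calico-apiserver-demo",
		containerd.WithImage(image),
		containerd.WithNewSnapshot("calico-apiserver-demo-snapshot", image),
		containerd.WithNewSpec(oci.WithImageConfig(image)),
	)
	if err != nil {
		log.Fatal(err)
	}
	defer container.Delete(ctx, containerd.WithSnapshotCleanup)

	task, err := container.NewTask(ctx, cio.NewCreator(cio.WithStdio))
	if err != nil {
		log.Fatal(err)
	}
	defer task.Delete(ctx)

	if err := task.Start(ctx); err != nil {
		log.Fatal(err)
	}
	log.Println("started container", container.ID())
}
```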
Mar 17 18:21:07.113000 audit[5241]: USER_ACCT pid=5241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:07.115568 sshd[5241]: Accepted publickey for core from 139.178.89.65 port 44290 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:07.116000 audit[5241]: CRED_ACQ pid=5241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:07.116000 audit[5241]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=fffffd680570 a2=3 a3=1 items=0 ppid=1 pid=5241 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:07.116000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:07.119026 sshd[5241]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:07.134829 systemd[1]: Started session-9.scope. Mar 17 18:21:07.137497 systemd-logind[1906]: New session 9 of user core. Mar 17 18:21:07.160419 env[1923]: time="2025-03-17T18:21:07.160319776Z" level=info msg="StartContainer for \"8ad6ebdf16492a8b444bcd7e5f9f8f43cabaf5b081263a50203c98a44d9761a4\" returns successfully" Mar 17 18:21:07.159000 audit[5241]: USER_START pid=5241 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:07.163000 audit[5269]: CRED_ACQ pid=5269 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:07.233264 env[1923]: time="2025-03-17T18:21:07.233118486Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:07.239691 env[1923]: time="2025-03-17T18:21:07.239633493Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:07.244753 env[1923]: time="2025-03-17T18:21:07.244704803Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:07.250717 env[1923]: time="2025-03-17T18:21:07.250659712Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:07.258781 env[1923]: time="2025-03-17T18:21:07.252461039Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:5451b31bd8d0784796fa1204c4ec22975a270e21feadf2c5095fe41a38524c6c\"" Mar 17 18:21:07.261366 env[1923]: 
time="2025-03-17T18:21:07.260738072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Mar 17 18:21:07.269382 env[1923]: time="2025-03-17T18:21:07.269142151Z" level=info msg="CreateContainer within sandbox \"da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 18:21:07.293209 env[1923]: time="2025-03-17T18:21:07.293129323Z" level=info msg="CreateContainer within sandbox \"da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"5c3cfe2258cd1b84bb2db7e47e71996b1ed8ca153d046f8b395da2e6dcdcca87\"" Mar 17 18:21:07.294302 env[1923]: time="2025-03-17T18:21:07.294239233Z" level=info msg="StartContainer for \"5c3cfe2258cd1b84bb2db7e47e71996b1ed8ca153d046f8b395da2e6dcdcca87\"" Mar 17 18:21:07.488074 sshd[5241]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:07.489000 audit[5241]: USER_END pid=5241 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:07.490000 audit[5241]: CRED_DISP pid=5241 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:07.494000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-172.31.21.220:22-139.178.89.65:44290 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:07.495203 systemd[1]: sshd@8-172.31.21.220:22-139.178.89.65:44290.service: Deactivated successfully. Mar 17 18:21:07.499150 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 18:21:07.502730 systemd-logind[1906]: Session 9 logged out. Waiting for processes to exit. Mar 17 18:21:07.509837 systemd-logind[1906]: Removed session 9. 
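The audit records in this log carry the triggering command line as a hex-encoded PROCTITLE field whose argv entries are separated by NUL bytes: 737368643A20636F7265205B707269765D above decodes to "sshd: core [priv]", and the recurring iptables value decodes to "iptables-restore -w 5 -W 100000 --noflush --counters". A short sketch of that decoding:

```go
// Decodes the hex PROCTITLE field from the audit records above: the value is
// the process's argv with NUL bytes between arguments.
package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	// argv entries are separated by NUL bytes; join them with spaces for display.
	return strings.ReplaceAll(string(raw), "\x00", " "), nil
}

func main() {
	for _, h := range []string{
		"737368643A20636F7265205B707269765D", // from the sshd audit records
		"69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273", // from the iptables-restore records
	} {
		s, err := decodeProctitle(h)
		if err != nil {
			panic(err)
		}
		fmt.Println(s)
	}
}
```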
Mar 17 18:21:07.576438 env[1923]: time="2025-03-17T18:21:07.576353507Z" level=info msg="StartContainer for \"5c3cfe2258cd1b84bb2db7e47e71996b1ed8ca153d046f8b395da2e6dcdcca87\" returns successfully" Mar 17 18:21:07.769153 kubelet[3162]: I0317 18:21:07.768944 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-56f9f75749-9hqth" podStartSLOduration=29.588252914 podStartE2EDuration="33.768890781s" podCreationTimestamp="2025-03-17 18:20:34 +0000 UTC" firstStartedPulling="2025-03-17 18:21:02.584971698 +0000 UTC m=+50.620839078" lastFinishedPulling="2025-03-17 18:21:06.765609553 +0000 UTC m=+54.801476945" observedRunningTime="2025-03-17 18:21:07.728974082 +0000 UTC m=+55.764841486" watchObservedRunningTime="2025-03-17 18:21:07.768890781 +0000 UTC m=+55.804758173" Mar 17 18:21:07.771671 kubelet[3162]: I0317 18:21:07.771570 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-56f9f75749-5vmjt" podStartSLOduration=30.485001577 podStartE2EDuration="33.771544518s" podCreationTimestamp="2025-03-17 18:20:34 +0000 UTC" firstStartedPulling="2025-03-17 18:21:03.973919774 +0000 UTC m=+52.009787142" lastFinishedPulling="2025-03-17 18:21:07.260462715 +0000 UTC m=+55.296330083" observedRunningTime="2025-03-17 18:21:07.764115228 +0000 UTC m=+55.799982620" watchObservedRunningTime="2025-03-17 18:21:07.771544518 +0000 UTC m=+55.807411970" Mar 17 18:21:07.843000 audit[5314]: NETFILTER_CFG table=filter:115 family=2 entries=10 op=nft_register_rule pid=5314 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:07.843000 audit[5314]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3676 a0=3 a1=ffffc8740630 a2=0 a3=1 items=0 ppid=3332 pid=5314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:07.843000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:07.852000 audit[5314]: NETFILTER_CFG table=nat:116 family=2 entries=20 op=nft_register_rule pid=5314 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:07.852000 audit[5314]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc8740630 a2=0 a3=1 items=0 ppid=3332 pid=5314 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:07.852000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:07.887000 audit[5320]: NETFILTER_CFG table=filter:117 family=2 entries=10 op=nft_register_rule pid=5320 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:07.887000 audit[5320]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3676 a0=3 a1=fffff9178f90 a2=0 a3=1 items=0 ppid=3332 pid=5320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:07.887000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:07.894000 audit[5320]: NETFILTER_CFG 
table=nat:118 family=2 entries=20 op=nft_register_rule pid=5320 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:07.894000 audit[5320]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff9178f90 a2=0 a3=1 items=0 ppid=3332 pid=5320 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:07.894000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:09.048220 env[1923]: time="2025-03-17T18:21:09.048145016Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:09.053177 env[1923]: time="2025-03-17T18:21:09.053123031Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:09.059196 env[1923]: time="2025-03-17T18:21:09.059137455Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/csi:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:09.062563 env[1923]: time="2025-03-17T18:21:09.062508860Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:09.066401 env[1923]: time="2025-03-17T18:21:09.065192811Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:3c11734f3001b7070e7e2b5e64938f89891cf8c44f8997e86aa23c5d5bf70163\"" Mar 17 18:21:09.069023 env[1923]: time="2025-03-17T18:21:09.068959647Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Mar 17 18:21:09.071872 env[1923]: time="2025-03-17T18:21:09.071815164Z" level=info msg="CreateContainer within sandbox \"5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 18:21:09.116833 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount239257313.mount: Deactivated successfully. 
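In the kubelet "Observed pod startup duration" lines, podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling): for calico-apiserver-56f9f75749-9hqth that gives 33.768890781s end to end and roughly 4.18s of pulling, reproducing the logged 29.588252914s to well under a microsecond, and for the coredns pod earlier, whose pull timestamps are the zero value, SLO and E2E durations are identical. A hedged sketch of that arithmetic (the field relationship is inferred from the logged values, not quoted from kubelet source):

```go
// Recomputes the startup durations reported by kubelet's
// pod_startup_latency_tracker lines above. The relationship between the
// fields (SLO duration = end-to-end duration minus the image-pull window)
// is inferred from the logged values, so treat it as an approximation.
package main

import (
	"fmt"
	"time"
)

const layout = "2006-01-02 15:04:05 -0700 MST" // fractional seconds are accepted when parsing

func mustParse(s string) time.Time {
	t, err := time.Parse(layout, s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the calico-apiserver-56f9f75749-9hqth entry.
	created := mustParse("2025-03-17 18:20:34 +0000 UTC")
	firstPull := mustParse("2025-03-17 18:21:02.584971698 +0000 UTC")
	lastPull := mustParse("2025-03-17 18:21:06.765609553 +0000 UTC")
	running := mustParse("2025-03-17 18:21:07.768890781 +0000 UTC")

	e2e := running.Sub(created)          // logged as podStartE2EDuration=33.768890781s
	slo := e2e - lastPull.Sub(firstPull) // logged as podStartSLOduration=29.588252914 (approx.)

	fmt.Println("end-to-end:", e2e)
	fmt.Println("excluding image pull:", slo)
}
```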
Mar 17 18:21:09.133750 env[1923]: time="2025-03-17T18:21:09.133683661Z" level=info msg="CreateContainer within sandbox \"5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"553cfde59219a6ceb99b0b619b8474c34e10185d3784629c58bbfd731986024a\"" Mar 17 18:21:09.135222 env[1923]: time="2025-03-17T18:21:09.135162769Z" level=info msg="StartContainer for \"553cfde59219a6ceb99b0b619b8474c34e10185d3784629c58bbfd731986024a\"" Mar 17 18:21:09.351538 env[1923]: time="2025-03-17T18:21:09.351400390Z" level=info msg="StartContainer for \"553cfde59219a6ceb99b0b619b8474c34e10185d3784629c58bbfd731986024a\" returns successfully" Mar 17 18:21:09.671000 audit[5357]: NETFILTER_CFG table=filter:119 family=2 entries=9 op=nft_register_rule pid=5357 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:09.671000 audit[5357]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2932 a0=3 a1=ffffd7c20560 a2=0 a3=1 items=0 ppid=3332 pid=5357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:09.671000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:09.678000 audit[5357]: NETFILTER_CFG table=nat:120 family=2 entries=27 op=nft_register_chain pid=5357 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:09.678000 audit[5357]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=9348 a0=3 a1=ffffd7c20560 a2=0 a3=1 items=0 ppid=3332 pid=5357 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:09.678000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:10.128000 audit[5359]: NETFILTER_CFG table=filter:121 family=2 entries=8 op=nft_register_rule pid=5359 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:10.128000 audit[5359]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2932 a0=3 a1=fffff7409960 a2=0 a3=1 items=0 ppid=3332 pid=5359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:10.128000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:10.138000 audit[5359]: NETFILTER_CFG table=nat:122 family=2 entries=34 op=nft_register_chain pid=5359 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:10.138000 audit[5359]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=11236 a0=3 a1=fffff7409960 a2=0 a3=1 items=0 ppid=3332 pid=5359 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:10.138000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:11.663680 systemd[1]: 
run-containerd-runc-k8s.io-3c8abc5333ad3e3319cf7dac17c6c0821f35b7609605dcae367cc29c9410e227-runc.oxvqlK.mount: Deactivated successfully. Mar 17 18:21:11.729606 env[1923]: time="2025-03-17T18:21:11.729537401Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:11.734950 env[1923]: time="2025-03-17T18:21:11.734891101Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:11.738635 env[1923]: time="2025-03-17T18:21:11.738567967Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:11.743278 env[1923]: time="2025-03-17T18:21:11.743223449Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:11.745385 env[1923]: time="2025-03-17T18:21:11.745217362Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:32c335fdb9d757e7ba6a76a9cfa8d292a5a229101ae7ea37b42f53c28adf2db1\"" Mar 17 18:21:11.752764 env[1923]: time="2025-03-17T18:21:11.752685206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Mar 17 18:21:11.790587 env[1923]: time="2025-03-17T18:21:11.790529353Z" level=info msg="CreateContainer within sandbox \"cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 17 18:21:11.806080 env[1923]: time="2025-03-17T18:21:11.806016676Z" level=info msg="CreateContainer within sandbox \"cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"1b04737a4a65ee8932224ea657faca70a2efdea0328c9f91a58ac6a24ff94483\"" Mar 17 18:21:11.808409 env[1923]: time="2025-03-17T18:21:11.808298378Z" level=info msg="StartContainer for \"1b04737a4a65ee8932224ea657faca70a2efdea0328c9f91a58ac6a24ff94483\"" Mar 17 18:21:11.997751 env[1923]: time="2025-03-17T18:21:11.997686627Z" level=info msg="StartContainer for \"1b04737a4a65ee8932224ea657faca70a2efdea0328c9f91a58ac6a24ff94483\" returns successfully" Mar 17 18:21:12.282145 env[1923]: time="2025-03-17T18:21:12.281851203Z" level=info msg="StopPodSandbox for \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\"" Mar 17 18:21:12.513000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-172.31.21.220:22-139.178.89.65:48646 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:12.514051 systemd[1]: Started sshd@9-172.31.21.220:22-139.178.89.65:48646.service. Mar 17 18:21:12.516496 kernel: kauditd_printk_skb: 34 callbacks suppressed Mar 17 18:21:12.516588 kernel: audit: type=1130 audit(1742235672.513:438): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-172.31.21.220:22-139.178.89.65:48646 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.361 [WARNING][5462] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0", GenerateName:"calico-kube-controllers-6f879dc54f-", Namespace:"calico-system", SelfLink:"", UID:"049cf128-4b21-4a4b-8889-b2f735eb419e", ResourceVersion:"906", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f879dc54f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4", Pod:"calico-kube-controllers-6f879dc54f-5nvvh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.94.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidc25a4043cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.361 [INFO][5462] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.361 [INFO][5462] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" iface="eth0" netns="" Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.361 [INFO][5462] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.362 [INFO][5462] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.504 [INFO][5470] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" HandleID="k8s-pod-network.8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.505 [INFO][5470] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.506 [INFO][5470] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.572 [WARNING][5470] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" HandleID="k8s-pod-network.8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.572 [INFO][5470] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" HandleID="k8s-pod-network.8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.582 [INFO][5470] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:12.593373 env[1923]: 2025-03-17 18:21:12.589 [INFO][5462] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:12.594426 env[1923]: time="2025-03-17T18:21:12.594374502Z" level=info msg="TearDown network for sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\" successfully" Mar 17 18:21:12.594573 env[1923]: time="2025-03-17T18:21:12.594536265Z" level=info msg="StopPodSandbox for \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\" returns successfully" Mar 17 18:21:12.598632 env[1923]: time="2025-03-17T18:21:12.598554343Z" level=info msg="RemovePodSandbox for \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\"" Mar 17 18:21:12.598939 env[1923]: time="2025-03-17T18:21:12.598847267Z" level=info msg="Forcibly stopping sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\"" Mar 17 18:21:12.755000 audit[5476]: USER_ACCT pid=5476 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:12.760461 sshd[5476]: Accepted publickey for core from 139.178.89.65 port 48646 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:12.762735 sshd[5476]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:12.760000 audit[5476]: CRED_ACQ pid=5476 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:12.776626 kubelet[3162]: I0317 18:21:12.774830 3162 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-6f879dc54f-5nvvh" podStartSLOduration=31.689492079 podStartE2EDuration="37.774786626s" podCreationTimestamp="2025-03-17 18:20:35 +0000 UTC" firstStartedPulling="2025-03-17 18:21:05.662564643 +0000 UTC m=+53.698432023" lastFinishedPulling="2025-03-17 18:21:11.74785919 +0000 UTC m=+59.783726570" observedRunningTime="2025-03-17 18:21:12.774584339 +0000 UTC m=+60.810451743" watchObservedRunningTime="2025-03-17 18:21:12.774786626 +0000 UTC m=+60.810654018" Mar 17 18:21:12.781613 kernel: audit: type=1101 audit(1742235672.755:439): pid=5476 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh 
res=success' Mar 17 18:21:12.781767 kernel: audit: type=1103 audit(1742235672.760:440): pid=5476 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:12.790237 systemd[1]: Started session-10.scope. Mar 17 18:21:12.791639 systemd-logind[1906]: New session 10 of user core. Mar 17 18:21:12.760000 audit[5476]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffd6d68130 a2=3 a3=1 items=0 ppid=1 pid=5476 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:12.811527 kernel: audit: type=1006 audit(1742235672.760:441): pid=5476 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Mar 17 18:21:12.811657 kernel: audit: type=1300 audit(1742235672.760:441): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffd6d68130 a2=3 a3=1 items=0 ppid=1 pid=5476 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:12.760000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:12.815810 kernel: audit: type=1327 audit(1742235672.760:441): proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:12.830000 audit[5476]: USER_START pid=5476 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:12.843000 audit[5499]: CRED_ACQ pid=5499 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:12.870886 kernel: audit: type=1105 audit(1742235672.830:442): pid=5476 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:12.871072 kernel: audit: type=1103 audit(1742235672.843:443): pid=5499 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:12.932988 systemd[1]: run-containerd-runc-k8s.io-1b04737a4a65ee8932224ea657faca70a2efdea0328c9f91a58ac6a24ff94483-runc.AmzEG3.mount: Deactivated successfully. Mar 17 18:21:13.243592 sshd[5476]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:13.244000 audit[5476]: USER_END pid=5476 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:13.256477 systemd[1]: sshd@9-172.31.21.220:22-139.178.89.65:48646.service: Deactivated successfully. 
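Each of the repeated StopPodSandbox / teardown passages above follows the same shape: take the host-wide IPAM lock, look up the handle, log "Asked to release address but it doesn't exist. Ignoring" when the allocation is already gone, then release the lock, so tearing the same sandbox down a second time is harmless. A toy sketch of that idempotent-release pattern, using hypothetical types that are not Calico's:

```go
// Toy illustration of the idempotent release pattern visible in the teardown
// logs: lock, look up the handle, treat "not found" as success, unlock.
// Types and names here are hypothetical, not Calico's.
package main

import (
	"fmt"
	"sync"
)

type ipamStore struct {
	mu      sync.Mutex // stands in for the "host-wide IPAM lock" in the log
	handles map[string]string
}

// release removes the allocation for handleID; releasing a handle that no
// longer exists is not an error, mirroring the WARNING-and-ignore in the log.
func (s *ipamStore) release(handleID string) {
	s.mu.Lock()
	defer s.mu.Unlock()

	if _, ok := s.handles[handleID]; !ok {
		fmt.Printf("asked to release %s but it doesn't exist, ignoring\n", handleID)
		return
	}
	delete(s.handles, handleID)
	fmt.Printf("released %s\n", handleID)
}

func main() {
	s := &ipamStore{handles: map[string]string{
		"k8s-pod-network.example-handle": "192.168.94.134",
	}}
	s.release("k8s-pod-network.example-handle") // first teardown releases the address
	s.release("k8s-pod-network.example-handle") // second teardown is a no-op
}
```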
Mar 17 18:21:13.259201 systemd[1]: session-10.scope: Deactivated successfully. Mar 17 18:21:13.259243 systemd-logind[1906]: Session 10 logged out. Waiting for processes to exit. Mar 17 18:21:13.261725 systemd-logind[1906]: Removed session 10. Mar 17 18:21:13.244000 audit[5476]: CRED_DISP pid=5476 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:13.278542 kernel: audit: type=1106 audit(1742235673.244:444): pid=5476 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:13.278684 kernel: audit: type=1104 audit(1742235673.244:445): pid=5476 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:13.255000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-172.31.21.220:22-139.178.89.65:48646 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:12.884 [WARNING][5491] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0", GenerateName:"calico-kube-controllers-6f879dc54f-", Namespace:"calico-system", SelfLink:"", UID:"049cf128-4b21-4a4b-8889-b2f735eb419e", ResourceVersion:"979", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6f879dc54f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"cb66af12c9b4e880a813c2476d884155ad0fffc002f525ec1017f29ad10fc2c4", Pod:"calico-kube-controllers-6f879dc54f-5nvvh", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.94.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidc25a4043cd", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:12.885 [INFO][5491] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:12.885 [INFO][5491] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" iface="eth0" netns="" Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:12.885 [INFO][5491] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:12.885 [INFO][5491] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:13.270 [INFO][5505] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" HandleID="k8s-pod-network.8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:13.270 [INFO][5505] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:13.270 [INFO][5505] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:13.290 [WARNING][5505] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" HandleID="k8s-pod-network.8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:13.290 [INFO][5505] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" HandleID="k8s-pod-network.8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Workload="ip--172--31--21--220-k8s-calico--kube--controllers--6f879dc54f--5nvvh-eth0" Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:13.292 [INFO][5505] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:13.298397 env[1923]: 2025-03-17 18:21:13.295 [INFO][5491] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d" Mar 17 18:21:13.299899 env[1923]: time="2025-03-17T18:21:13.299831298Z" level=info msg="TearDown network for sandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\" successfully" Mar 17 18:21:13.314041 env[1923]: time="2025-03-17T18:21:13.313906751Z" level=info msg="RemovePodSandbox \"8cb506c4bf0bca22e8160986883759fe69e84b03e836a9843060d6045cd8314d\" returns successfully" Mar 17 18:21:13.315920 env[1923]: time="2025-03-17T18:21:13.315832573Z" level=info msg="StopPodSandbox for \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\"" Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.530 [WARNING][5547] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0", Pod:"coredns-7db6d8ff4d-8bbf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali260d29a9deb", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.531 [INFO][5547] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.531 [INFO][5547] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" iface="eth0" netns="" Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.531 [INFO][5547] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.531 [INFO][5547] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.670 [INFO][5553] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" HandleID="k8s-pod-network.7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.672 [INFO][5553] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.672 [INFO][5553] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.690 [WARNING][5553] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" HandleID="k8s-pod-network.7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.691 [INFO][5553] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" HandleID="k8s-pod-network.7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.694 [INFO][5553] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:13.711400 env[1923]: 2025-03-17 18:21:13.706 [INFO][5547] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:13.711400 env[1923]: time="2025-03-17T18:21:13.709840797Z" level=info msg="TearDown network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\" successfully" Mar 17 18:21:13.711400 env[1923]: time="2025-03-17T18:21:13.709888065Z" level=info msg="StopPodSandbox for \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\" returns successfully" Mar 17 18:21:13.718803 env[1923]: time="2025-03-17T18:21:13.718736689Z" level=info msg="RemovePodSandbox for \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\"" Mar 17 18:21:13.723030 env[1923]: time="2025-03-17T18:21:13.722924279Z" level=info msg="Forcibly stopping sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\"" Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.886 [WARNING][5573] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"3e4c0f7a-d7dd-4bcd-96fa-a56665c299b1", ResourceVersion:"895", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"b60c8468af9cb61302c9c926be8b09003e97762daf33b2e491cf70e35a0118d0", Pod:"coredns-7db6d8ff4d-8bbf2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali260d29a9deb", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.887 [INFO][5573] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.887 [INFO][5573] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" iface="eth0" netns="" Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.887 [INFO][5573] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.887 [INFO][5573] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.953 [INFO][5579] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" HandleID="k8s-pod-network.7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.959 [INFO][5579] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.959 [INFO][5579] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.979 [WARNING][5579] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" HandleID="k8s-pod-network.7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.979 [INFO][5579] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" HandleID="k8s-pod-network.7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--8bbf2-eth0" Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.982 [INFO][5579] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:13.987927 env[1923]: 2025-03-17 18:21:13.984 [INFO][5573] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988" Mar 17 18:21:13.987927 env[1923]: time="2025-03-17T18:21:13.987969477Z" level=info msg="TearDown network for sandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\" successfully" Mar 17 18:21:13.999268 env[1923]: time="2025-03-17T18:21:13.999197289Z" level=info msg="RemovePodSandbox \"7c6170a780d8c4f2194ff7795ce2da18bbbd570c9d20acb41d40390c440d1988\" returns successfully" Mar 17 18:21:14.000181 env[1923]: time="2025-03-17T18:21:14.000104406Z" level=info msg="StopPodSandbox for \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\"" Mar 17 18:21:14.156954 env[1923]: time="2025-03-17T18:21:14.156896660Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:14.163584 env[1923]: time="2025-03-17T18:21:14.163515625Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:14.168127 env[1923]: time="2025-03-17T18:21:14.168074799Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:14.173718 env[1923]: time="2025-03-17T18:21:14.173652774Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:21:14.175532 env[1923]: time="2025-03-17T18:21:14.175435914Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:3eb557f7694f230afd24a75a691bcda4c0a7bfe87a981386dcd4ecf2b0701349\"" Mar 17 18:21:14.183121 env[1923]: time="2025-03-17T18:21:14.183047245Z" level=info msg="CreateContainer within sandbox \"5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 18:21:14.226999 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2857182885.mount: Deactivated successfully. 
Mar 17 18:21:14.230394 env[1923]: time="2025-03-17T18:21:14.229522479Z" level=info msg="CreateContainer within sandbox \"5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"4922262a8d262c7353d0a5886574a7846af737a7f75470d480c9f778102f31d3\"" Mar 17 18:21:14.234997 env[1923]: time="2025-03-17T18:21:14.234145602Z" level=info msg="StartContainer for \"4922262a8d262c7353d0a5886574a7846af737a7f75470d480c9f778102f31d3\"" Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.153 [WARNING][5599] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0", GenerateName:"calico-apiserver-56f9f75749-", Namespace:"calico-apiserver", SelfLink:"", UID:"3ad78572-b2b8-4d59-a3a2-ea0333361bca", ResourceVersion:"955", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56f9f75749", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337", Pod:"calico-apiserver-56f9f75749-5vmjt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali786425ac4e6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.153 [INFO][5599] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.153 [INFO][5599] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" iface="eth0" netns="" Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.154 [INFO][5599] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.154 [INFO][5599] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.243 [INFO][5605] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" HandleID="k8s-pod-network.bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.243 [INFO][5605] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.243 [INFO][5605] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.283 [WARNING][5605] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" HandleID="k8s-pod-network.bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.283 [INFO][5605] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" HandleID="k8s-pod-network.bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.285 [INFO][5605] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:14.291183 env[1923]: 2025-03-17 18:21:14.288 [INFO][5599] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:14.294301 env[1923]: time="2025-03-17T18:21:14.291113550Z" level=info msg="TearDown network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\" successfully" Mar 17 18:21:14.294301 env[1923]: time="2025-03-17T18:21:14.292622498Z" level=info msg="StopPodSandbox for \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\" returns successfully" Mar 17 18:21:14.294301 env[1923]: time="2025-03-17T18:21:14.293491766Z" level=info msg="RemovePodSandbox for \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\"" Mar 17 18:21:14.294301 env[1923]: time="2025-03-17T18:21:14.293550506Z" level=info msg="Forcibly stopping sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\"" Mar 17 18:21:14.353268 systemd[1]: run-containerd-runc-k8s.io-4922262a8d262c7353d0a5886574a7846af737a7f75470d480c9f778102f31d3-runc.mZDFK9.mount: Deactivated successfully. 
Mar 17 18:21:14.458385 env[1923]: time="2025-03-17T18:21:14.458300043Z" level=info msg="StartContainer for \"4922262a8d262c7353d0a5886574a7846af737a7f75470d480c9f778102f31d3\" returns successfully" Mar 17 18:21:14.507167 kubelet[3162]: I0317 18:21:14.506444 3162 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 18:21:14.507167 kubelet[3162]: I0317 18:21:14.506531 3162 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.436 [WARNING][5641] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0", GenerateName:"calico-apiserver-56f9f75749-", Namespace:"calico-apiserver", SelfLink:"", UID:"3ad78572-b2b8-4d59-a3a2-ea0333361bca", ResourceVersion:"955", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56f9f75749", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"da0095efb0d886314bd8d3aa3c4b45ed3ef92dfee43b3672ed33dfa6f287c337", Pod:"calico-apiserver-56f9f75749-5vmjt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali786425ac4e6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.437 [INFO][5641] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.437 [INFO][5641] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" iface="eth0" netns="" Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.437 [INFO][5641] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.437 [INFO][5641] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.491 [INFO][5660] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" HandleID="k8s-pod-network.bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.491 [INFO][5660] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.491 [INFO][5660] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.518 [WARNING][5660] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" HandleID="k8s-pod-network.bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.518 [INFO][5660] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" HandleID="k8s-pod-network.bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--5vmjt-eth0" Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.526 [INFO][5660] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:14.538380 env[1923]: 2025-03-17 18:21:14.529 [INFO][5641] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b" Mar 17 18:21:14.538380 env[1923]: time="2025-03-17T18:21:14.537011612Z" level=info msg="TearDown network for sandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\" successfully" Mar 17 18:21:14.548391 env[1923]: time="2025-03-17T18:21:14.548125538Z" level=info msg="RemovePodSandbox \"bd09e43e84f0dd5fc92e48f179461bcad66be4061480f9dc7451f36a0fc8ea7b\" returns successfully" Mar 17 18:21:14.548933 env[1923]: time="2025-03-17T18:21:14.548888160Z" level=info msg="StopPodSandbox for \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\"" Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.616 [WARNING][5687] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7af73a1a-8033-4ba4-ba19-078aeb2052b7", ResourceVersion:"878", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb", Pod:"csi-node-driver-pv4xb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.94.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali32ac00b841f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.616 [INFO][5687] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.616 [INFO][5687] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" iface="eth0" netns="" Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.616 [INFO][5687] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.616 [INFO][5687] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.652 [INFO][5694] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" HandleID="k8s-pod-network.8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.652 [INFO][5694] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.653 [INFO][5694] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.665 [WARNING][5694] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" HandleID="k8s-pod-network.8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.665 [INFO][5694] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" HandleID="k8s-pod-network.8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.670 [INFO][5694] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:14.676317 env[1923]: 2025-03-17 18:21:14.673 [INFO][5687] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:14.677638 env[1923]: time="2025-03-17T18:21:14.677588911Z" level=info msg="TearDown network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\" successfully" Mar 17 18:21:14.677769 env[1923]: time="2025-03-17T18:21:14.677737209Z" level=info msg="StopPodSandbox for \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\" returns successfully" Mar 17 18:21:14.678974 env[1923]: time="2025-03-17T18:21:14.678638685Z" level=info msg="RemovePodSandbox for \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\"" Mar 17 18:21:14.679128 env[1923]: time="2025-03-17T18:21:14.678982214Z" level=info msg="Forcibly stopping sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\"" Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.831 [WARNING][5714] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7af73a1a-8033-4ba4-ba19-078aeb2052b7", ResourceVersion:"1002", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 35, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"5680ab97c91c7fefc7b1c7cdcd4982502f706613086b44faf8c64c9a4417b1fb", Pod:"csi-node-driver-pv4xb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.94.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali32ac00b841f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.831 [INFO][5714] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.832 [INFO][5714] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" iface="eth0" netns="" Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.832 [INFO][5714] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.832 [INFO][5714] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.899 [INFO][5720] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" HandleID="k8s-pod-network.8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.899 [INFO][5720] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.899 [INFO][5720] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.913 [WARNING][5720] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" HandleID="k8s-pod-network.8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.913 [INFO][5720] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" HandleID="k8s-pod-network.8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Workload="ip--172--31--21--220-k8s-csi--node--driver--pv4xb-eth0" Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.918 [INFO][5720] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:14.925094 env[1923]: 2025-03-17 18:21:14.921 [INFO][5714] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52" Mar 17 18:21:14.925094 env[1923]: time="2025-03-17T18:21:14.925050631Z" level=info msg="TearDown network for sandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\" successfully" Mar 17 18:21:14.932686 env[1923]: time="2025-03-17T18:21:14.931680120Z" level=info msg="RemovePodSandbox \"8a4f0afb716b309f878285e576aa9507d19b364af54aa9948f3e9b5a722eec52\" returns successfully" Mar 17 18:21:14.934740 env[1923]: time="2025-03-17T18:21:14.934683449Z" level=info msg="StopPodSandbox for \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\"" Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.026 [WARNING][5740] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"cd45a415-9af8-4b7c-ac94-5ad5f9e3b710", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4", Pod:"coredns-7db6d8ff4d-krrrk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4322127d881", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, 
AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.027 [INFO][5740] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.027 [INFO][5740] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" iface="eth0" netns="" Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.027 [INFO][5740] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.027 [INFO][5740] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.067 [INFO][5746] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" HandleID="k8s-pod-network.c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.067 [INFO][5746] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.067 [INFO][5746] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.082 [WARNING][5746] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" HandleID="k8s-pod-network.c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.082 [INFO][5746] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" HandleID="k8s-pod-network.c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.084 [INFO][5746] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:15.090499 env[1923]: 2025-03-17 18:21:15.087 [INFO][5740] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:15.091651 env[1923]: time="2025-03-17T18:21:15.091591056Z" level=info msg="TearDown network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\" successfully" Mar 17 18:21:15.091795 env[1923]: time="2025-03-17T18:21:15.091762382Z" level=info msg="StopPodSandbox for \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\" returns successfully" Mar 17 18:21:15.092641 env[1923]: time="2025-03-17T18:21:15.092585209Z" level=info msg="RemovePodSandbox for \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\"" Mar 17 18:21:15.092774 env[1923]: time="2025-03-17T18:21:15.092644622Z" level=info msg="Forcibly stopping sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\"" Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.169 [WARNING][5766] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"cd45a415-9af8-4b7c-ac94-5ad5f9e3b710", ResourceVersion:"873", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"fef35381b1dc0b0bc61a681cdbb3c454d002fa40aa9cd9a234efe01a148d0ef4", Pod:"coredns-7db6d8ff4d-krrrk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.94.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4322127d881", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.170 [INFO][5766] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.170 [INFO][5766] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" iface="eth0" netns="" Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.170 [INFO][5766] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.170 [INFO][5766] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.244 [INFO][5772] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" HandleID="k8s-pod-network.c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.244 [INFO][5772] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.244 [INFO][5772] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.257 [WARNING][5772] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" HandleID="k8s-pod-network.c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.257 [INFO][5772] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" HandleID="k8s-pod-network.c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Workload="ip--172--31--21--220-k8s-coredns--7db6d8ff4d--krrrk-eth0" Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.259 [INFO][5772] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:15.266110 env[1923]: 2025-03-17 18:21:15.262 [INFO][5766] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4" Mar 17 18:21:15.267597 env[1923]: time="2025-03-17T18:21:15.267547358Z" level=info msg="TearDown network for sandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\" successfully" Mar 17 18:21:15.274314 env[1923]: time="2025-03-17T18:21:15.274259598Z" level=info msg="RemovePodSandbox \"c4567f78758afee4b3d4ce0d1e7a3ff6854ba776c03d18bdbcd5e5bb94aeb7d4\" returns successfully" Mar 17 18:21:15.275513 env[1923]: time="2025-03-17T18:21:15.275466417Z" level=info msg="StopPodSandbox for \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\"" Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.349 [WARNING][5790] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0", GenerateName:"calico-apiserver-56f9f75749-", Namespace:"calico-apiserver", SelfLink:"", UID:"bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56f9f75749", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0", Pod:"calico-apiserver-56f9f75749-9hqth", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9e9e4ce696b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.349 [INFO][5790] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.349 [INFO][5790] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" iface="eth0" netns="" Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.349 [INFO][5790] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.349 [INFO][5790] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.388 [INFO][5798] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" HandleID="k8s-pod-network.d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.388 [INFO][5798] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.389 [INFO][5798] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.401 [WARNING][5798] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" HandleID="k8s-pod-network.d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.401 [INFO][5798] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" HandleID="k8s-pod-network.d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.404 [INFO][5798] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:15.410451 env[1923]: 2025-03-17 18:21:15.407 [INFO][5790] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:15.411481 env[1923]: time="2025-03-17T18:21:15.411431118Z" level=info msg="TearDown network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\" successfully" Mar 17 18:21:15.411615 env[1923]: time="2025-03-17T18:21:15.411580832Z" level=info msg="StopPodSandbox for \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\" returns successfully" Mar 17 18:21:15.412426 env[1923]: time="2025-03-17T18:21:15.412304909Z" level=info msg="RemovePodSandbox for \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\"" Mar 17 18:21:15.412789 env[1923]: time="2025-03-17T18:21:15.412721459Z" level=info msg="Forcibly stopping sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\"" Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.481 [WARNING][5817] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0", GenerateName:"calico-apiserver-56f9f75749-", Namespace:"calico-apiserver", SelfLink:"", UID:"bab3167d-c9fd-4c15-a13a-1fa8a2ac7c07", ResourceVersion:"946", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 20, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"56f9f75749", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ip-172-31-21-220", ContainerID:"5e797ac7ab2b362110146b6cb00b05633768b3d2ed39a427f9326d5463461ea0", Pod:"calico-apiserver-56f9f75749-9hqth", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.94.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali9e9e4ce696b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.481 [INFO][5817] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.481 [INFO][5817] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" iface="eth0" netns="" Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.481 [INFO][5817] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.482 [INFO][5817] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.530 [INFO][5823] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" HandleID="k8s-pod-network.d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.530 [INFO][5823] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.530 [INFO][5823] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.543 [WARNING][5823] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" HandleID="k8s-pod-network.d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.543 [INFO][5823] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" HandleID="k8s-pod-network.d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Workload="ip--172--31--21--220-k8s-calico--apiserver--56f9f75749--9hqth-eth0" Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.546 [INFO][5823] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:21:15.552059 env[1923]: 2025-03-17 18:21:15.548 [INFO][5817] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210" Mar 17 18:21:15.553548 env[1923]: time="2025-03-17T18:21:15.553493098Z" level=info msg="TearDown network for sandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\" successfully" Mar 17 18:21:15.560758 env[1923]: time="2025-03-17T18:21:15.560640979Z" level=info msg="RemovePodSandbox \"d54deaffbbf9c21baa3cd5d76f8507dfcfa460f11b5f86275b2e16f14da93210\" returns successfully" Mar 17 18:21:18.266000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-172.31.21.220:22-139.178.89.65:48654 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:18.267367 systemd[1]: Started sshd@10-172.31.21.220:22-139.178.89.65:48654.service. Mar 17 18:21:18.270144 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:21:18.270248 kernel: audit: type=1130 audit(1742235678.266:447): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-172.31.21.220:22-139.178.89.65:48654 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:21:18.449000 audit[5829]: USER_ACCT pid=5829 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.461695 kernel: audit: type=1101 audit(1742235678.449:448): pid=5829 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.462091 sshd[5829]: Accepted publickey for core from 139.178.89.65 port 48654 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:18.461000 audit[5829]: CRED_ACQ pid=5829 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.464702 sshd[5829]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:18.477978 kernel: audit: type=1103 audit(1742235678.461:449): pid=5829 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.478110 kernel: audit: type=1006 audit(1742235678.462:450): pid=5829 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Mar 17 18:21:18.462000 audit[5829]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffeffefe30 a2=3 a3=1 items=0 ppid=1 pid=5829 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:18.486033 systemd[1]: Started session-11.scope. Mar 17 18:21:18.488274 kernel: audit: type=1300 audit(1742235678.462:450): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffeffefe30 a2=3 a3=1 items=0 ppid=1 pid=5829 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:18.462000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:18.488497 systemd-logind[1906]: New session 11 of user core. 
Mar 17 18:21:18.492164 kernel: audit: type=1327 audit(1742235678.462:450): proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:18.500000 audit[5829]: USER_START pid=5829 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.516598 kernel: audit: type=1105 audit(1742235678.500:451): pid=5829 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.517784 kernel: audit: type=1103 audit(1742235678.503:452): pid=5851 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.503000 audit[5851]: CRED_ACQ pid=5851 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.776509 sshd[5829]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:18.777000 audit[5829]: USER_END pid=5829 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.782801 systemd-logind[1906]: Session 11 logged out. Waiting for processes to exit. Mar 17 18:21:18.785587 systemd[1]: sshd@10-172.31.21.220:22-139.178.89.65:48654.service: Deactivated successfully. Mar 17 18:21:18.787140 systemd[1]: session-11.scope: Deactivated successfully. Mar 17 18:21:18.777000 audit[5829]: CRED_DISP pid=5829 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.798637 kernel: audit: type=1106 audit(1742235678.777:453): pid=5829 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.798811 kernel: audit: type=1104 audit(1742235678.777:454): pid=5829 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.790542 systemd-logind[1906]: Removed session 11. Mar 17 18:21:18.783000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-172.31.21.220:22-139.178.89.65:48654 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:18.806243 systemd[1]: Started sshd@11-172.31.21.220:22-139.178.89.65:48666.service. 
Mar 17 18:21:18.805000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-172.31.21.220:22-139.178.89.65:48666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:18.973000 audit[5862]: USER_ACCT pid=5862 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.975556 sshd[5862]: Accepted publickey for core from 139.178.89.65 port 48666 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:18.976000 audit[5862]: CRED_ACQ pid=5862 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:18.976000 audit[5862]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=fffff8d12b40 a2=3 a3=1 items=0 ppid=1 pid=5862 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:18.976000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:18.979543 sshd[5862]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:18.987117 systemd-logind[1906]: New session 12 of user core. Mar 17 18:21:18.988708 systemd[1]: Started session-12.scope. Mar 17 18:21:19.001000 audit[5862]: USER_START pid=5862 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.004000 audit[5865]: CRED_ACQ pid=5865 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.315082 sshd[5862]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:19.317000 audit[5862]: USER_END pid=5862 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.317000 audit[5862]: CRED_DISP pid=5862 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.321000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-172.31.21.220:22-139.178.89.65:48666 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:19.322201 systemd[1]: sshd@11-172.31.21.220:22-139.178.89.65:48666.service: Deactivated successfully. Mar 17 18:21:19.325784 systemd[1]: session-12.scope: Deactivated successfully. Mar 17 18:21:19.327043 systemd-logind[1906]: Session 12 logged out. Waiting for processes to exit. 
Mar 17 18:21:19.330987 systemd-logind[1906]: Removed session 12. Mar 17 18:21:19.356765 systemd[1]: Started sshd@12-172.31.21.220:22-139.178.89.65:48676.service. Mar 17 18:21:19.358000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-172.31.21.220:22-139.178.89.65:48676 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:19.531000 audit[5872]: USER_ACCT pid=5872 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.534581 sshd[5872]: Accepted publickey for core from 139.178.89.65 port 48676 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:19.534000 audit[5872]: CRED_ACQ pid=5872 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.534000 audit[5872]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffd5133ce0 a2=3 a3=1 items=0 ppid=1 pid=5872 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:19.534000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:19.537083 sshd[5872]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:19.545035 systemd-logind[1906]: New session 13 of user core. Mar 17 18:21:19.545977 systemd[1]: Started session-13.scope. Mar 17 18:21:19.555000 audit[5872]: USER_START pid=5872 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.558000 audit[5875]: CRED_ACQ pid=5875 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.790691 sshd[5872]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:19.791000 audit[5872]: USER_END pid=5872 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.792000 audit[5872]: CRED_DISP pid=5872 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:19.795000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-172.31.21.220:22-139.178.89.65:48676 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:19.796019 systemd[1]: sshd@12-172.31.21.220:22-139.178.89.65:48676.service: Deactivated successfully. 
Mar 17 18:21:19.797491 systemd[1]: session-13.scope: Deactivated successfully. Mar 17 18:21:19.801558 systemd-logind[1906]: Session 13 logged out. Waiting for processes to exit. Mar 17 18:21:19.803676 systemd-logind[1906]: Removed session 13. Mar 17 18:21:24.819655 systemd[1]: Started sshd@13-172.31.21.220:22-139.178.89.65:34678.service. Mar 17 18:21:24.824398 kernel: kauditd_printk_skb: 23 callbacks suppressed Mar 17 18:21:24.824524 kernel: audit: type=1130 audit(1742235684.820:474): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-172.31.21.220:22-139.178.89.65:34678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:24.820000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-172.31.21.220:22-139.178.89.65:34678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:25.004000 audit[5894]: USER_ACCT pid=5894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.014550 sshd[5894]: Accepted publickey for core from 139.178.89.65 port 34678 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:25.017407 kernel: audit: type=1101 audit(1742235685.004:475): pid=5894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.018136 sshd[5894]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:25.015000 audit[5894]: CRED_ACQ pid=5894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.032804 kernel: audit: type=1103 audit(1742235685.015:476): pid=5894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.032933 kernel: audit: type=1006 audit(1742235685.015:477): pid=5894 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Mar 17 18:21:25.015000 audit[5894]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=fffff85c66c0 a2=3 a3=1 items=0 ppid=1 pid=5894 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:25.015000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:25.045673 kernel: audit: type=1300 audit(1742235685.015:477): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=fffff85c66c0 a2=3 a3=1 items=0 ppid=1 pid=5894 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:25.045813 kernel: audit: type=1327 audit(1742235685.015:477): proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:25.051523 
systemd-logind[1906]: New session 14 of user core. Mar 17 18:21:25.053803 systemd[1]: Started session-14.scope. Mar 17 18:21:25.063000 audit[5894]: USER_START pid=5894 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.076000 audit[5898]: CRED_ACQ pid=5898 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.086213 kernel: audit: type=1105 audit(1742235685.063:478): pid=5894 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.086380 kernel: audit: type=1103 audit(1742235685.076:479): pid=5898 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.378570 sshd[5894]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:25.380000 audit[5894]: USER_END pid=5894 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.394583 systemd[1]: sshd@13-172.31.21.220:22-139.178.89.65:34678.service: Deactivated successfully. Mar 17 18:21:25.396863 systemd[1]: session-14.scope: Deactivated successfully. Mar 17 18:21:25.397202 systemd-logind[1906]: Session 14 logged out. Waiting for processes to exit. Mar 17 18:21:25.401192 systemd-logind[1906]: Removed session 14. Mar 17 18:21:25.380000 audit[5894]: CRED_DISP pid=5894 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.410936 kernel: audit: type=1106 audit(1742235685.380:480): pid=5894 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.411064 kernel: audit: type=1104 audit(1742235685.380:481): pid=5894 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:25.393000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-172.31.21.220:22-139.178.89.65:34678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:30.406875 systemd[1]: Started sshd@14-172.31.21.220:22-139.178.89.65:34686.service. 
Mar 17 18:21:30.417461 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:21:30.417622 kernel: audit: type=1130 audit(1742235690.406:483): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-172.31.21.220:22-139.178.89.65:34686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:30.406000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-172.31.21.220:22-139.178.89.65:34686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:30.582000 audit[5912]: USER_ACCT pid=5912 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.584998 sshd[5912]: Accepted publickey for core from 139.178.89.65 port 34686 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:30.596402 kernel: audit: type=1101 audit(1742235690.582:484): pid=5912 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.595000 audit[5912]: CRED_ACQ pid=5912 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.603792 sshd[5912]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:30.611843 kernel: audit: type=1103 audit(1742235690.595:485): pid=5912 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.611979 kernel: audit: type=1006 audit(1742235690.595:486): pid=5912 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Mar 17 18:21:30.595000 audit[5912]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe9152030 a2=3 a3=1 items=0 ppid=1 pid=5912 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:30.595000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:30.629191 kernel: audit: type=1300 audit(1742235690.595:486): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe9152030 a2=3 a3=1 items=0 ppid=1 pid=5912 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:30.629294 kernel: audit: type=1327 audit(1742235690.595:486): proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:30.634151 systemd[1]: Started session-15.scope. Mar 17 18:21:30.634254 systemd-logind[1906]: New session 15 of user core. 
Mar 17 18:21:30.656000 audit[5912]: USER_START pid=5912 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.667000 audit[5915]: CRED_ACQ pid=5915 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.677807 kernel: audit: type=1105 audit(1742235690.656:487): pid=5912 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.677930 kernel: audit: type=1103 audit(1742235690.667:488): pid=5915 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.917119 sshd[5912]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:30.918000 audit[5912]: USER_END pid=5912 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.918000 audit[5912]: CRED_DISP pid=5912 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.930820 systemd[1]: sshd@14-172.31.21.220:22-139.178.89.65:34686.service: Deactivated successfully. Mar 17 18:21:30.938766 kernel: audit: type=1106 audit(1742235690.918:489): pid=5912 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.938903 kernel: audit: type=1104 audit(1742235690.918:490): pid=5912 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:30.929000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-172.31.21.220:22-139.178.89.65:34686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:30.939731 systemd[1]: session-15.scope: Deactivated successfully. Mar 17 18:21:30.940515 systemd-logind[1906]: Session 15 logged out. Waiting for processes to exit. Mar 17 18:21:30.943170 systemd-logind[1906]: Removed session 15. Mar 17 18:21:35.945521 systemd[1]: Started sshd@15-172.31.21.220:22-139.178.89.65:34060.service. 
Mar 17 18:21:35.945000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-172.31.21.220:22-139.178.89.65:34060 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:35.949925 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:21:35.950073 kernel: audit: type=1130 audit(1742235695.945:492): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-172.31.21.220:22-139.178.89.65:34060 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:36.129000 audit[5944]: USER_ACCT pid=5944 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.131060 sshd[5944]: Accepted publickey for core from 139.178.89.65 port 34060 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:36.140000 audit[5944]: CRED_ACQ pid=5944 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.143796 sshd[5944]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:36.151736 kernel: audit: type=1101 audit(1742235696.129:493): pid=5944 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.151876 kernel: audit: type=1103 audit(1742235696.140:494): pid=5944 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.154999 systemd[1]: Started session-16.scope. Mar 17 18:21:36.158146 kernel: audit: type=1006 audit(1742235696.140:495): pid=5944 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Mar 17 18:21:36.158302 kernel: audit: type=1300 audit(1742235696.140:495): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe3514c00 a2=3 a3=1 items=0 ppid=1 pid=5944 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:36.140000 audit[5944]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe3514c00 a2=3 a3=1 items=0 ppid=1 pid=5944 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:36.157468 systemd-logind[1906]: New session 16 of user core. 
Mar 17 18:21:36.140000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:36.181440 kernel: audit: type=1327 audit(1742235696.140:495): proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:36.170000 audit[5944]: USER_START pid=5944 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.193220 kernel: audit: type=1105 audit(1742235696.170:496): pid=5944 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.173000 audit[5947]: CRED_ACQ pid=5947 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.205081 kernel: audit: type=1103 audit(1742235696.173:497): pid=5947 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.446688 sshd[5944]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:36.448000 audit[5944]: USER_END pid=5944 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.460203 systemd[1]: sshd@15-172.31.21.220:22-139.178.89.65:34060.service: Deactivated successfully. Mar 17 18:21:36.462270 systemd[1]: session-16.scope: Deactivated successfully. Mar 17 18:21:36.448000 audit[5944]: CRED_DISP pid=5944 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.473246 kernel: audit: type=1106 audit(1742235696.448:498): pid=5944 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.473396 kernel: audit: type=1104 audit(1742235696.448:499): pid=5944 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:36.473690 systemd-logind[1906]: Session 16 logged out. Waiting for processes to exit. Mar 17 18:21:36.459000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-172.31.21.220:22-139.178.89.65:34060 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:36.478306 systemd-logind[1906]: Removed session 16. 
Mar 17 18:21:41.474000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-172.31.21.220:22-139.178.89.65:38770 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:41.475168 systemd[1]: Started sshd@16-172.31.21.220:22-139.178.89.65:38770.service. Mar 17 18:21:41.479360 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:21:41.479501 kernel: audit: type=1130 audit(1742235701.474:501): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-172.31.21.220:22-139.178.89.65:38770 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:41.636113 systemd[1]: run-containerd-runc-k8s.io-3c8abc5333ad3e3319cf7dac17c6c0821f35b7609605dcae367cc29c9410e227-runc.ZGCsP4.mount: Deactivated successfully. Mar 17 18:21:41.679000 audit[5957]: USER_ACCT pid=5957 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:41.684578 sshd[5957]: Accepted publickey for core from 139.178.89.65 port 38770 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:41.692753 sshd[5957]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:41.690000 audit[5957]: CRED_ACQ pid=5957 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:41.704251 kernel: audit: type=1101 audit(1742235701.679:502): pid=5957 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:41.704418 kernel: audit: type=1103 audit(1742235701.690:503): pid=5957 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:41.710115 kernel: audit: type=1006 audit(1742235701.690:504): pid=5957 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Mar 17 18:21:41.690000 audit[5957]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffc6736570 a2=3 a3=1 items=0 ppid=1 pid=5957 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:41.719966 kernel: audit: type=1300 audit(1742235701.690:504): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffc6736570 a2=3 a3=1 items=0 ppid=1 pid=5957 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:41.690000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:41.722842 systemd[1]: Started session-17.scope. Mar 17 18:21:41.723370 systemd-logind[1906]: New session 17 of user core. 
Mar 17 18:21:41.725385 kernel: audit: type=1327 audit(1742235701.690:504): proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:41.742000 audit[5957]: USER_START pid=5957 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:41.756000 audit[5987]: CRED_ACQ pid=5987 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:41.768181 kernel: audit: type=1105 audit(1742235701.742:505): pid=5957 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:41.768408 kernel: audit: type=1103 audit(1742235701.756:506): pid=5987 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:42.040083 sshd[5957]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:42.042000 audit[5957]: USER_END pid=5957 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:42.047727 systemd[1]: sshd@16-172.31.21.220:22-139.178.89.65:38770.service: Deactivated successfully. Mar 17 18:21:42.049221 systemd[1]: session-17.scope: Deactivated successfully. Mar 17 18:21:42.055380 kernel: audit: type=1106 audit(1742235702.042:507): pid=5957 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:42.042000 audit[5957]: CRED_DISP pid=5957 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:42.055840 systemd-logind[1906]: Session 17 logged out. Waiting for processes to exit. Mar 17 18:21:42.066961 kernel: audit: type=1104 audit(1742235702.042:508): pid=5957 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:42.042000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-172.31.21.220:22-139.178.89.65:38770 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:42.067064 systemd-logind[1906]: Removed session 17. 
Mar 17 18:21:47.062000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-172.31.21.220:22-139.178.89.65:38776 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:47.063745 systemd[1]: Started sshd@17-172.31.21.220:22-139.178.89.65:38776.service. Mar 17 18:21:47.065875 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:21:47.066031 kernel: audit: type=1130 audit(1742235707.062:510): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-172.31.21.220:22-139.178.89.65:38776 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:47.248000 audit[5999]: USER_ACCT pid=5999 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.251932 sshd[5999]: Accepted publickey for core from 139.178.89.65 port 38776 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:47.255190 sshd[5999]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:47.248000 audit[5999]: CRED_ACQ pid=5999 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.265480 systemd[1]: Started session-18.scope. Mar 17 18:21:47.267454 kernel: audit: type=1101 audit(1742235707.248:511): pid=5999 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.267571 kernel: audit: type=1103 audit(1742235707.248:512): pid=5999 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.267165 systemd-logind[1906]: New session 18 of user core. 
Mar 17 18:21:47.273313 kernel: audit: type=1006 audit(1742235707.248:513): pid=5999 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Mar 17 18:21:47.248000 audit[5999]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=fffffb491440 a2=3 a3=1 items=0 ppid=1 pid=5999 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:47.287038 kernel: audit: type=1300 audit(1742235707.248:513): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=fffffb491440 a2=3 a3=1 items=0 ppid=1 pid=5999 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:47.288700 kernel: audit: type=1327 audit(1742235707.248:513): proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:47.248000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:47.276000 audit[5999]: USER_START pid=5999 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.302582 kernel: audit: type=1105 audit(1742235707.276:514): pid=5999 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.302787 kernel: audit: type=1103 audit(1742235707.290:515): pid=6002 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.290000 audit[6002]: CRED_ACQ pid=6002 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.531264 sshd[5999]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:47.532000 audit[5999]: USER_END pid=5999 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.537411 systemd-logind[1906]: Session 18 logged out. Waiting for processes to exit. Mar 17 18:21:47.540842 systemd[1]: sshd@17-172.31.21.220:22-139.178.89.65:38776.service: Deactivated successfully. Mar 17 18:21:47.542368 systemd[1]: session-18.scope: Deactivated successfully. Mar 17 18:21:47.533000 audit[5999]: CRED_DISP pid=5999 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.545669 systemd-logind[1906]: Removed session 18. 
Mar 17 18:21:47.553947 kernel: audit: type=1106 audit(1742235707.532:516): pid=5999 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.554084 kernel: audit: type=1104 audit(1742235707.533:517): pid=5999 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.540000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-172.31.21.220:22-139.178.89.65:38776 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:47.558000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-172.31.21.220:22-139.178.89.65:38778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:47.559790 systemd[1]: Started sshd@18-172.31.21.220:22-139.178.89.65:38778.service. Mar 17 18:21:47.730000 audit[6012]: USER_ACCT pid=6012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.731847 sshd[6012]: Accepted publickey for core from 139.178.89.65 port 38778 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:47.732000 audit[6012]: CRED_ACQ pid=6012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.732000 audit[6012]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=fffff30052e0 a2=3 a3=1 items=0 ppid=1 pid=6012 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:47.732000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:47.734566 sshd[6012]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:47.742669 systemd-logind[1906]: New session 19 of user core. Mar 17 18:21:47.743948 systemd[1]: Started session-19.scope. 
Mar 17 18:21:47.754000 audit[6012]: USER_START pid=6012 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:47.757000 audit[6015]: CRED_ACQ pid=6015 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:48.242907 sshd[6012]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:48.243000 audit[6012]: USER_END pid=6012 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:48.244000 audit[6012]: CRED_DISP pid=6012 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:48.247000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-172.31.21.220:22-139.178.89.65:38778 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:48.248453 systemd[1]: sshd@18-172.31.21.220:22-139.178.89.65:38778.service: Deactivated successfully. Mar 17 18:21:48.252265 systemd[1]: session-19.scope: Deactivated successfully. Mar 17 18:21:48.253300 systemd-logind[1906]: Session 19 logged out. Waiting for processes to exit. Mar 17 18:21:48.255251 systemd-logind[1906]: Removed session 19. Mar 17 18:21:48.269990 systemd[1]: Started sshd@19-172.31.21.220:22-139.178.89.65:38788.service. Mar 17 18:21:48.269000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-172.31.21.220:22-139.178.89.65:38788 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:48.396006 systemd[1]: run-containerd-runc-k8s.io-1b04737a4a65ee8932224ea657faca70a2efdea0328c9f91a58ac6a24ff94483-runc.x2QJ84.mount: Deactivated successfully. 
Mar 17 18:21:48.463000 audit[6023]: USER_ACCT pid=6023 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:48.465506 sshd[6023]: Accepted publickey for core from 139.178.89.65 port 38788 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:48.465000 audit[6023]: CRED_ACQ pid=6023 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:48.466000 audit[6023]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffd1fdc430 a2=3 a3=1 items=0 ppid=1 pid=6023 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:48.466000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:48.468147 sshd[6023]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:48.477477 systemd[1]: Started session-20.scope. Mar 17 18:21:48.478168 systemd-logind[1906]: New session 20 of user core. Mar 17 18:21:48.488000 audit[6023]: USER_START pid=6023 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:48.493000 audit[6046]: CRED_ACQ pid=6046 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:51.751000 audit[6056]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=6056 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:51.751000 audit[6056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=11860 a0=3 a1=fffffe086e20 a2=0 a3=1 items=0 ppid=3332 pid=6056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:51.751000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:51.757000 audit[6056]: NETFILTER_CFG table=nat:124 family=2 entries=22 op=nft_register_rule pid=6056 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:51.757000 audit[6056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6540 a0=3 a1=fffffe086e20 a2=0 a3=1 items=0 ppid=3332 pid=6056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:51.757000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:51.774232 sshd[6023]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:51.776000 audit[6023]: USER_END pid=6023 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:51.777000 audit[6023]: CRED_DISP pid=6023 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:51.781000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-172.31.21.220:22-139.178.89.65:38788 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:51.782023 systemd[1]: sshd@19-172.31.21.220:22-139.178.89.65:38788.service: Deactivated successfully. Mar 17 18:21:51.785472 systemd[1]: session-20.scope: Deactivated successfully. Mar 17 18:21:51.786875 systemd-logind[1906]: Session 20 logged out. Waiting for processes to exit. Mar 17 18:21:51.790014 systemd-logind[1906]: Removed session 20. Mar 17 18:21:51.800000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-172.31.21.220:22-139.178.89.65:33376 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:51.801548 systemd[1]: Started sshd@20-172.31.21.220:22-139.178.89.65:33376.service. Mar 17 18:21:51.853000 audit[6062]: NETFILTER_CFG table=filter:125 family=2 entries=32 op=nft_register_rule pid=6062 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:51.853000 audit[6062]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=11860 a0=3 a1=ffffc8ddcbb0 a2=0 a3=1 items=0 ppid=3332 pid=6062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:51.853000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:51.856000 audit[6062]: NETFILTER_CFG table=nat:126 family=2 entries=22 op=nft_register_rule pid=6062 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:51.856000 audit[6062]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6540 a0=3 a1=ffffc8ddcbb0 a2=0 a3=1 items=0 ppid=3332 pid=6062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:51.856000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:51.998000 audit[6060]: USER_ACCT pid=6060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.000098 sshd[6060]: Accepted publickey for core from 139.178.89.65 port 33376 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:52.001000 audit[6060]: CRED_ACQ pid=6060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 
addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.001000 audit[6060]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe0faf2c0 a2=3 a3=1 items=0 ppid=1 pid=6060 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:52.001000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:52.003898 sshd[6060]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:52.014542 systemd[1]: Started session-21.scope. Mar 17 18:21:52.015422 systemd-logind[1906]: New session 21 of user core. Mar 17 18:21:52.024000 audit[6060]: USER_START pid=6060 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.028000 audit[6064]: CRED_ACQ pid=6064 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.584727 sshd[6060]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:52.589313 kernel: kauditd_printk_skb: 43 callbacks suppressed Mar 17 18:21:52.589491 kernel: audit: type=1106 audit(1742235712.585:547): pid=6060 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.585000 audit[6060]: USER_END pid=6060 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.590524 systemd[1]: sshd@20-172.31.21.220:22-139.178.89.65:33376.service: Deactivated successfully. Mar 17 18:21:52.592008 systemd[1]: session-21.scope: Deactivated successfully. Mar 17 18:21:52.600574 systemd-logind[1906]: Session 21 logged out. Waiting for processes to exit. Mar 17 18:21:52.586000 audit[6060]: CRED_DISP pid=6060 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.610248 systemd[1]: Started sshd@21-172.31.21.220:22-139.178.89.65:33384.service. Mar 17 18:21:52.610511 kernel: audit: type=1104 audit(1742235712.586:548): pid=6060 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.617580 systemd-logind[1906]: Removed session 21. Mar 17 18:21:52.589000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-172.31.21.220:22-139.178.89.65:33376 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:21:52.635282 kernel: audit: type=1131 audit(1742235712.589:549): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-172.31.21.220:22-139.178.89.65:33376 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:52.644903 kernel: audit: type=1130 audit(1742235712.609:550): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-172.31.21.220:22-139.178.89.65:33384 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:52.609000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-172.31.21.220:22-139.178.89.65:33384 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:52.787000 audit[6072]: USER_ACCT pid=6072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.788656 sshd[6072]: Accepted publickey for core from 139.178.89.65 port 33384 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:52.791131 sshd[6072]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:52.789000 audit[6072]: CRED_ACQ pid=6072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.808112 kernel: audit: type=1101 audit(1742235712.787:551): pid=6072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.808269 kernel: audit: type=1103 audit(1742235712.789:552): pid=6072 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.813551 kernel: audit: type=1006 audit(1742235712.789:553): pid=6072 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Mar 17 18:21:52.789000 audit[6072]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe9e56c40 a2=3 a3=1 items=0 ppid=1 pid=6072 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:52.823295 kernel: audit: type=1300 audit(1742235712.789:553): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe9e56c40 a2=3 a3=1 items=0 ppid=1 pid=6072 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:52.789000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:52.826803 kernel: audit: type=1327 audit(1742235712.789:553): proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:52.827695 systemd-logind[1906]: New session 22 of user core. Mar 17 18:21:52.830452 systemd[1]: Started session-22.scope. 
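Annotation: the PROCTITLE records above store the audited process's command line as hex-encoded bytes, with NUL bytes separating the argv entries (proctitle=737368643A20636F7265205B707269765D here, and the longer iptables-restore value a few entries back). A minimal decoding sketch, assuming only Python 3 on the analysis host; the two sample values are copied verbatim from this log:

def decode_proctitle(hex_value: str) -> str:
    """Decode an audit PROCTITLE hex string; NUL bytes separate argv entries."""
    raw = bytes.fromhex(hex_value)
    return " ".join(part.decode("utf-8", errors="replace") for part in raw.split(b"\x00") if part)

# Values taken from the audit records in this log.
print(decode_proctitle("737368643A20636F7265205B707269765D"))
# -> sshd: core [priv]
print(decode_proctitle(
    "69707461626C65732D726573746F7265002D770035002D5700313030303030"
    "002D2D6E6F666C757368002D2D636F756E74657273"
))
# -> iptables-restore -w 5 -W 100000 --noflush --counters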
Mar 17 18:21:52.841000 audit[6072]: USER_START pid=6072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.854000 audit[6075]: CRED_ACQ pid=6075 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:52.857431 kernel: audit: type=1105 audit(1742235712.841:554): pid=6072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:53.091722 sshd[6072]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:53.092000 audit[6072]: USER_END pid=6072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:53.093000 audit[6072]: CRED_DISP pid=6072 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:53.097000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-172.31.21.220:22-139.178.89.65:33384 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:53.098024 systemd[1]: sshd@21-172.31.21.220:22-139.178.89.65:33384.service: Deactivated successfully. Mar 17 18:21:53.099753 systemd[1]: session-22.scope: Deactivated successfully. Mar 17 18:21:53.100237 systemd-logind[1906]: Session 22 logged out. Waiting for processes to exit. Mar 17 18:21:53.102143 systemd-logind[1906]: Removed session 22. Mar 17 18:21:58.117000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-172.31.21.220:22-139.178.89.65:33398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:21:58.118656 systemd[1]: Started sshd@22-172.31.21.220:22-139.178.89.65:33398.service. Mar 17 18:21:58.121142 kernel: kauditd_printk_skb: 4 callbacks suppressed Mar 17 18:21:58.121211 kernel: audit: type=1130 audit(1742235718.117:559): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-172.31.21.220:22-139.178.89.65:33398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:21:58.293000 audit[6087]: USER_ACCT pid=6087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.295130 sshd[6087]: Accepted publickey for core from 139.178.89.65 port 33398 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:21:58.305379 kernel: audit: type=1101 audit(1742235718.293:560): pid=6087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.304000 audit[6087]: CRED_ACQ pid=6087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.307236 sshd[6087]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:21:58.315739 kernel: audit: type=1103 audit(1742235718.304:561): pid=6087 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.315881 kernel: audit: type=1006 audit(1742235718.304:562): pid=6087 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Mar 17 18:21:58.304000 audit[6087]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe164eb90 a2=3 a3=1 items=0 ppid=1 pid=6087 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:58.329603 kernel: audit: type=1300 audit(1742235718.304:562): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe164eb90 a2=3 a3=1 items=0 ppid=1 pid=6087 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:58.304000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:58.333503 kernel: audit: type=1327 audit(1742235718.304:562): proctitle=737368643A20636F7265205B707269765D Mar 17 18:21:58.338442 systemd-logind[1906]: New session 23 of user core. Mar 17 18:21:58.339935 systemd[1]: Started session-23.scope. 
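Annotation: the recurring "kauditd_printk_skb: N callbacks suppressed" lines mean the kernel rate-limited its console echo of audit records ("audit: type=..."); the records themselves still arrive via the audit/journal path, which is why the plain "audit[PID]: USER_END ..." style entries keep appearing, and the echoed serials jump from :554 to :559 above, consistent with the 4 suppressed callbacks. A rough sketch for totalling the suppressed count over a saved copy of this log; the filename journal.txt is an assumption:

import re

# Assumption: the console log shown here has been saved to "journal.txt".
pattern = re.compile(r"kauditd_printk_skb: (\d+) callbacks suppressed")

total = 0
with open("journal.txt", encoding="utf-8", errors="replace") as fh:
    for line in fh:
        for count in pattern.findall(line):
            total += int(count)

print(f"audit printk callbacks suppressed: {total}")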
Mar 17 18:21:58.352000 audit[6087]: USER_START pid=6087 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.365000 audit[6090]: CRED_ACQ pid=6090 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.375287 kernel: audit: type=1105 audit(1742235718.352:563): pid=6087 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.375450 kernel: audit: type=1103 audit(1742235718.365:564): pid=6090 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.604735 sshd[6087]: pam_unix(sshd:session): session closed for user core Mar 17 18:21:58.605000 audit[6087]: USER_END pid=6087 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.609824 systemd-logind[1906]: Session 23 logged out. Waiting for processes to exit. Mar 17 18:21:58.612699 systemd[1]: sshd@22-172.31.21.220:22-139.178.89.65:33398.service: Deactivated successfully. Mar 17 18:21:58.614235 systemd[1]: session-23.scope: Deactivated successfully. Mar 17 18:21:58.616941 systemd-logind[1906]: Removed session 23. Mar 17 18:21:58.605000 audit[6087]: CRED_DISP pid=6087 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.626576 kernel: audit: type=1106 audit(1742235718.605:565): pid=6087 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.626712 kernel: audit: type=1104 audit(1742235718.605:566): pid=6087 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:21:58.611000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-172.31.21.220:22-139.178.89.65:33398 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:21:59.286000 audit[6101]: NETFILTER_CFG table=filter:127 family=2 entries=20 op=nft_register_rule pid=6101 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:59.286000 audit[6101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2932 a0=3 a1=fffffc77f390 a2=0 a3=1 items=0 ppid=3332 pid=6101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:59.286000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:21:59.299000 audit[6101]: NETFILTER_CFG table=nat:128 family=2 entries=106 op=nft_register_chain pid=6101 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:21:59.299000 audit[6101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=49452 a0=3 a1=fffffc77f390 a2=0 a3=1 items=0 ppid=3332 pid=6101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:21:59.299000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:22:03.631093 systemd[1]: Started sshd@23-172.31.21.220:22-139.178.89.65:47640.service. Mar 17 18:22:03.642409 kernel: kauditd_printk_skb: 7 callbacks suppressed Mar 17 18:22:03.642465 kernel: audit: type=1130 audit(1742235723.630:570): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-172.31.21.220:22-139.178.89.65:47640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:03.630000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-172.31.21.220:22-139.178.89.65:47640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:03.804000 audit[6103]: USER_ACCT pid=6103 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:03.806033 sshd[6103]: Accepted publickey for core from 139.178.89.65 port 47640 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:22:03.809178 sshd[6103]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:22:03.807000 audit[6103]: CRED_ACQ pid=6103 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:03.818569 systemd[1]: Started session-24.scope. Mar 17 18:22:03.820869 systemd-logind[1906]: New session 24 of user core. 
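Annotation: the SYSCALL records above carry raw numbers: arch=c00000b7 and syscall=211 for the iptables-restore/NETFILTER_CFG entries, syscall=64 for the sshd entries. As an assumption based on the upstream AUDIT_ARCH_* values and the arm64 asm-generic table (the log does not name them), 0xc00000b7 is AUDIT_ARCH_AARCH64, 211 is sendmsg (the netlink message carrying the nftables update) and 64 is write, likely sshd's three-byte write of the login UID given exit=3 and auid=500. A small lookup sketch limited to the values seen here:

# Assumption: numbers follow the upstream AUDIT_ARCH_* and arm64 asm-generic tables.
AUDIT_ARCHES = {0xC00000B7: "AUDIT_ARCH_AARCH64"}
AARCH64_SYSCALLS = {64: "write", 211: "sendmsg"}

def describe(arch_hex: str, syscall_nr: int) -> str:
    arch = AUDIT_ARCHES.get(int(arch_hex, 16), "unknown arch")
    name = AARCH64_SYSCALLS.get(syscall_nr, "unknown syscall")
    return f"{arch}: syscall {syscall_nr} ({name})"

print(describe("c00000b7", 211))  # iptables-restore pushing rules over netlink
print(describe("c00000b7", 64))   # sshd write recorded alongside the login-UID change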
Mar 17 18:22:03.827399 kernel: audit: type=1101 audit(1742235723.804:571): pid=6103 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:03.827567 kernel: audit: type=1103 audit(1742235723.807:572): pid=6103 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:03.827627 kernel: audit: type=1006 audit(1742235723.807:573): pid=6103 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Mar 17 18:22:03.807000 audit[6103]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffc64de530 a2=3 a3=1 items=0 ppid=1 pid=6103 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:03.840199 kernel: audit: type=1300 audit(1742235723.807:573): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffc64de530 a2=3 a3=1 items=0 ppid=1 pid=6103 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:03.807000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:03.846547 kernel: audit: type=1327 audit(1742235723.807:573): proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:03.840000 audit[6103]: USER_START pid=6103 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:03.858168 kernel: audit: type=1105 audit(1742235723.840:574): pid=6103 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:03.844000 audit[6106]: CRED_ACQ pid=6106 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:03.866614 kernel: audit: type=1103 audit(1742235723.844:575): pid=6106 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:04.092381 sshd[6103]: pam_unix(sshd:session): session closed for user core Mar 17 18:22:04.092000 audit[6103]: USER_END pid=6103 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:04.097466 systemd-logind[1906]: Session 24 logged out. Waiting for processes to exit. 
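Annotation: every kernel-echoed record carries an audit(SECONDS.MILLIS:SERIAL) identifier, e.g. audit(1742235723.807:573) above; the first field is a Unix epoch timestamp, the second a per-boot serial. A quick conversion sketch using a value taken from this log; the UTC result lines up with the Mar 17 18:22:03 wall-clock prefix:

from datetime import datetime, timezone

def parse_audit_id(audit_id: str) -> tuple[datetime, int]:
    """Split 'audit(1742235723.807:573)' into (UTC timestamp, serial)."""
    inner = audit_id[audit_id.index("(") + 1 : audit_id.rindex(")")]
    stamp, serial = inner.split(":")
    return datetime.fromtimestamp(float(stamp), tz=timezone.utc), int(serial)

when, serial = parse_audit_id("audit(1742235723.807:573)")
print(when.isoformat(), serial)  # 2025-03-17T18:22:03.807000+00:00 573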
Mar 17 18:22:04.093000 audit[6103]: CRED_DISP pid=6103 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:04.114316 kernel: audit: type=1106 audit(1742235724.092:576): pid=6103 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:04.114531 kernel: audit: type=1104 audit(1742235724.093:577): pid=6103 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:04.112203 systemd[1]: sshd@23-172.31.21.220:22-139.178.89.65:47640.service: Deactivated successfully. Mar 17 18:22:04.111000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-172.31.21.220:22-139.178.89.65:47640 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:04.113662 systemd[1]: session-24.scope: Deactivated successfully. Mar 17 18:22:04.116806 systemd-logind[1906]: Removed session 24. Mar 17 18:22:09.114946 systemd[1]: Started sshd@24-172.31.21.220:22-139.178.89.65:47650.service. Mar 17 18:22:09.114000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-172.31.21.220:22-139.178.89.65:47650 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:09.118928 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:22:09.119054 kernel: audit: type=1130 audit(1742235729.114:579): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-172.31.21.220:22-139.178.89.65:47650 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:22:09.292000 audit[6116]: USER_ACCT pid=6116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.295467 sshd[6116]: Accepted publickey for core from 139.178.89.65 port 47650 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:22:09.298799 sshd[6116]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:22:09.293000 audit[6116]: CRED_ACQ pid=6116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.313272 kernel: audit: type=1101 audit(1742235729.292:580): pid=6116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.313378 kernel: audit: type=1103 audit(1742235729.293:581): pid=6116 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.313461 kernel: audit: type=1006 audit(1742235729.293:582): pid=6116 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Mar 17 18:22:09.293000 audit[6116]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe271a4b0 a2=3 a3=1 items=0 ppid=1 pid=6116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:09.330168 kernel: audit: type=1300 audit(1742235729.293:582): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe271a4b0 a2=3 a3=1 items=0 ppid=1 pid=6116 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:09.327790 systemd[1]: Started session-25.scope. Mar 17 18:22:09.328176 systemd-logind[1906]: New session 25 of user core. 
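Annotation: the kernel echoes most records twice, once as "audit: type=NNNN audit(...)" and once with the symbolic name (type=1101 next to USER_ACCT above, 1103 next to CRED_ACQ, and so on). The pairs visible in this section are enough for a small translation table; the one entry this log never names, 1006 with its old-auid/auid fields, is labelled LOGIN here as an assumption from the upstream headers:

# Mapping recovered from the paired lines in this log
# (e.g. "type=1101" printed next to the USER_ACCT record).
AUDIT_RECORD_TYPES = {
    1006: "LOGIN",          # assumption: not named in this log, only type=1006
    1101: "USER_ACCT",
    1103: "CRED_ACQ",
    1104: "CRED_DISP",
    1105: "USER_START",
    1106: "USER_END",
    1130: "SERVICE_START",
    1131: "SERVICE_STOP",
    1300: "SYSCALL",
    1327: "PROCTITLE",
}

for type_nr in (1101, 1103, 1006, 1300, 1327, 1105):
    print(type_nr, AUDIT_RECORD_TYPES[type_nr])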
Mar 17 18:22:09.293000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:09.334185 kernel: audit: type=1327 audit(1742235729.293:582): proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:09.337000 audit[6116]: USER_START pid=6116 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.350439 kernel: audit: type=1105 audit(1742235729.337:583): pid=6116 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.350537 kernel: audit: type=1103 audit(1742235729.337:584): pid=6119 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.337000 audit[6119]: CRED_ACQ pid=6119 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.579660 sshd[6116]: pam_unix(sshd:session): session closed for user core Mar 17 18:22:09.580000 audit[6116]: USER_END pid=6116 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.584435 systemd-logind[1906]: Session 25 logged out. Waiting for processes to exit. Mar 17 18:22:09.586956 systemd[1]: sshd@24-172.31.21.220:22-139.178.89.65:47650.service: Deactivated successfully. Mar 17 18:22:09.588358 systemd[1]: session-25.scope: Deactivated successfully. Mar 17 18:22:09.591362 systemd-logind[1906]: Removed session 25. Mar 17 18:22:09.580000 audit[6116]: CRED_DISP pid=6116 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.602313 kernel: audit: type=1106 audit(1742235729.580:585): pid=6116 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.602481 kernel: audit: type=1104 audit(1742235729.580:586): pid=6116 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:09.586000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-172.31.21.220:22-139.178.89.65:47650 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:22:11.625100 systemd[1]: run-containerd-runc-k8s.io-3c8abc5333ad3e3319cf7dac17c6c0821f35b7609605dcae367cc29c9410e227-runc.PTa13j.mount: Deactivated successfully. Mar 17 18:22:14.606389 systemd[1]: Started sshd@25-172.31.21.220:22-139.178.89.65:38158.service. Mar 17 18:22:14.606000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-172.31.21.220:22-139.178.89.65:38158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:14.609370 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:22:14.609501 kernel: audit: type=1130 audit(1742235734.606:588): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-172.31.21.220:22-139.178.89.65:38158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:14.792000 audit[6151]: USER_ACCT pid=6151 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:14.793994 sshd[6151]: Accepted publickey for core from 139.178.89.65 port 38158 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:22:14.804390 kernel: audit: type=1101 audit(1742235734.792:589): pid=6151 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:14.803000 audit[6151]: CRED_ACQ pid=6151 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:14.814817 sshd[6151]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:22:14.820346 kernel: audit: type=1103 audit(1742235734.803:590): pid=6151 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:14.820466 kernel: audit: type=1006 audit(1742235734.803:591): pid=6151 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Mar 17 18:22:14.820548 kernel: audit: type=1300 audit(1742235734.803:591): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffd1ef54e0 a2=3 a3=1 items=0 ppid=1 pid=6151 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:14.803000 audit[6151]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffd1ef54e0 a2=3 a3=1 items=0 ppid=1 pid=6151 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:14.803000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:14.832885 kernel: audit: type=1327 audit(1742235734.803:591): proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:14.840935 systemd[1]: Started session-26.scope. 
Mar 17 18:22:14.841595 systemd-logind[1906]: New session 26 of user core. Mar 17 18:22:14.851000 audit[6151]: USER_START pid=6151 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:14.864000 audit[6154]: CRED_ACQ pid=6154 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:14.877800 kernel: audit: type=1105 audit(1742235734.851:592): pid=6151 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:14.877944 kernel: audit: type=1103 audit(1742235734.864:593): pid=6154 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:15.103324 sshd[6151]: pam_unix(sshd:session): session closed for user core Mar 17 18:22:15.104000 audit[6151]: USER_END pid=6151 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:15.108456 systemd[1]: sshd@25-172.31.21.220:22-139.178.89.65:38158.service: Deactivated successfully. Mar 17 18:22:15.109831 systemd[1]: session-26.scope: Deactivated successfully. Mar 17 18:22:15.104000 audit[6151]: CRED_DISP pid=6151 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:15.119112 systemd-logind[1906]: Session 26 logged out. Waiting for processes to exit. Mar 17 18:22:15.120892 systemd-logind[1906]: Removed session 26. Mar 17 18:22:15.125681 kernel: audit: type=1106 audit(1742235735.104:594): pid=6151 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:15.125842 kernel: audit: type=1104 audit(1742235735.104:595): pid=6151 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:15.105000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-172.31.21.220:22-139.178.89.65:38158 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:18.394638 systemd[1]: run-containerd-runc-k8s.io-1b04737a4a65ee8932224ea657faca70a2efdea0328c9f91a58ac6a24ff94483-runc.Gd2T68.mount: Deactivated successfully. 
Mar 17 18:22:20.127000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-172.31.21.220:22-139.178.89.65:38170 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:20.131732 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:22:20.131834 kernel: audit: type=1130 audit(1742235740.127:597): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-172.31.21.220:22-139.178.89.65:38170 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:20.128547 systemd[1]: Started sshd@26-172.31.21.220:22-139.178.89.65:38170.service. Mar 17 18:22:20.300000 audit[6185]: USER_ACCT pid=6185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.302193 sshd[6185]: Accepted publickey for core from 139.178.89.65 port 38170 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:22:20.305820 sshd[6185]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:22:20.303000 audit[6185]: CRED_ACQ pid=6185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.320461 kernel: audit: type=1101 audit(1742235740.300:598): pid=6185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.320615 kernel: audit: type=1103 audit(1742235740.303:599): pid=6185 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.328664 kernel: audit: type=1006 audit(1742235740.303:600): pid=6185 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Mar 17 18:22:20.303000 audit[6185]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffea3e9370 a2=3 a3=1 items=0 ppid=1 pid=6185 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:20.338379 kernel: audit: type=1300 audit(1742235740.303:600): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffea3e9370 a2=3 a3=1 items=0 ppid=1 pid=6185 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:20.303000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:20.344089 kernel: audit: type=1327 audit(1742235740.303:600): proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:20.343681 systemd-logind[1906]: New session 27 of user core. Mar 17 18:22:20.344534 systemd[1]: Started session-27.scope. 
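Annotation: sessions 20 through 29 in this stretch all follow the same shape, Accepted publickey, pam_unix session opened, Started session-N.scope / New session N of user core, then session closed and Removed session N within a second or two. A sketch that pairs the systemd-logind open/close lines and reports each session's duration; the filename journal.txt is an assumption, and the year 2025 is assumed from the audit() epoch values since the syslog prefix omits it:

import re
from datetime import datetime

OPEN = re.compile(r"(\w{3} \d{2} [\d:.]+) systemd-logind\[\d+\]: New session (\d+) of user core")
CLOSE = re.compile(r"(\w{3} \d{2} [\d:.]+) systemd-logind\[\d+\]: Removed session (\d+)\.")

def ts(prefix: str) -> datetime:
    # Syslog-style prefix has no year; assume 2025 per the audit() epoch values.
    return datetime.strptime(f"2025 {prefix}", "%Y %b %d %H:%M:%S.%f")

opened: dict[str, datetime] = {}
with open("journal.txt", encoding="utf-8", errors="replace") as fh:
    text = fh.read()
for when, session in OPEN.findall(text):
    opened[session] = ts(when)
for when, session in CLOSE.findall(text):
    if session in opened:
        print(f"session {session}: {(ts(when) - opened[session]).total_seconds():.1f}s")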
Mar 17 18:22:20.354000 audit[6185]: USER_START pid=6185 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.357000 audit[6188]: CRED_ACQ pid=6188 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.377582 kernel: audit: type=1105 audit(1742235740.354:601): pid=6185 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.377768 kernel: audit: type=1103 audit(1742235740.357:602): pid=6188 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.602833 sshd[6185]: pam_unix(sshd:session): session closed for user core Mar 17 18:22:20.604000 audit[6185]: USER_END pid=6185 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.608224 systemd[1]: sshd@26-172.31.21.220:22-139.178.89.65:38170.service: Deactivated successfully. Mar 17 18:22:20.610000 systemd[1]: session-27.scope: Deactivated successfully. Mar 17 18:22:20.604000 audit[6185]: CRED_DISP pid=6185 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.620232 kernel: audit: type=1106 audit(1742235740.604:603): pid=6185 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:20.619419 systemd-logind[1906]: Session 27 logged out. Waiting for processes to exit. Mar 17 18:22:20.630044 systemd-logind[1906]: Removed session 27. Mar 17 18:22:20.607000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-172.31.21.220:22-139.178.89.65:38170 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:20.631362 kernel: audit: type=1104 audit(1742235740.604:604): pid=6185 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:25.626000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-172.31.21.220:22-139.178.89.65:60570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:22:25.630880 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:22:25.630968 kernel: audit: type=1130 audit(1742235745.626:606): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-172.31.21.220:22-139.178.89.65:60570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:25.627173 systemd[1]: Started sshd@27-172.31.21.220:22-139.178.89.65:60570.service. Mar 17 18:22:25.801000 audit[6206]: USER_ACCT pid=6206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:25.806113 sshd[6206]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:22:25.813692 sshd[6206]: Accepted publickey for core from 139.178.89.65 port 60570 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:22:25.814927 kernel: audit: type=1101 audit(1742235745.801:607): pid=6206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:25.803000 audit[6206]: CRED_ACQ pid=6206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:25.822162 systemd-logind[1906]: New session 28 of user core. Mar 17 18:22:25.824592 systemd[1]: Started session-28.scope. 
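Annotation: auid=4294967295 and ses=4294967295 in the pre-authentication records above are not real IDs; 4294967295 is (uint32)-1, the kernel's "unset" sentinel, which is why the type=1006 records log old-auid=4294967295 auid=500 once sshd assigns the login UID for user core. A one-line normalisation sketch for post-processing:

AUDIT_UNSET = 0xFFFFFFFF  # 4294967295, i.e. (uint32)-1

def normalise_id(value: str) -> int | None:
    """Return None for the kernel's 'unset' sentinel, else the numeric ID."""
    number = int(value)
    return None if number == AUDIT_UNSET else number

print(normalise_id("4294967295"))  # None (no login UID assigned yet)
print(normalise_id("500"))         # 500  (the 'core' user after the type=1006 record)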
Mar 17 18:22:25.842132 kernel: audit: type=1103 audit(1742235745.803:608): pid=6206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:25.842277 kernel: audit: type=1006 audit(1742235745.804:609): pid=6206 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1 Mar 17 18:22:25.804000 audit[6206]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe8f511f0 a2=3 a3=1 items=0 ppid=1 pid=6206 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:25.852177 kernel: audit: type=1300 audit(1742235745.804:609): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffe8f511f0 a2=3 a3=1 items=0 ppid=1 pid=6206 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:25.804000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:25.855995 kernel: audit: type=1327 audit(1742235745.804:609): proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:25.860000 audit[6206]: USER_START pid=6206 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:25.874519 kernel: audit: type=1105 audit(1742235745.860:610): pid=6206 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:25.874688 kernel: audit: type=1103 audit(1742235745.872:611): pid=6209 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:25.872000 audit[6209]: CRED_ACQ pid=6209 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:26.106695 sshd[6206]: pam_unix(sshd:session): session closed for user core Mar 17 18:22:26.107000 audit[6206]: USER_END pid=6206 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:26.120655 systemd[1]: sshd@27-172.31.21.220:22-139.178.89.65:60570.service: Deactivated successfully. 
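Annotation: the unit names systemd starts and stops here, e.g. sshd@27-172.31.21.220:22-139.178.89.65:60570.service, appear to be per-connection instances of a templated, socket-activated sshd@.service, with the instance string encoding a counter plus the local and remote endpoints; that layout is inferred from the names in this log rather than stated in it. A parsing sketch under that assumption:

import re

UNIT = re.compile(
    r"sshd@(?P<n>\d+)-(?P<local_ip>[\d.]+):(?P<local_port>\d+)-"
    r"(?P<peer_ip>[\d.]+):(?P<peer_port>\d+)\.service"
)

match = UNIT.search("sshd@27-172.31.21.220:22-139.178.89.65:60570.service")
if match:
    print(match.groupdict())
    # {'n': '27', 'local_ip': '172.31.21.220', 'local_port': '22',
    #  'peer_ip': '139.178.89.65', 'peer_port': '60570'}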
Mar 17 18:22:26.109000 audit[6206]: CRED_DISP pid=6206 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:26.130443 kernel: audit: type=1106 audit(1742235746.107:612): pid=6206 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:26.130658 kernel: audit: type=1104 audit(1742235746.109:613): pid=6206 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:26.122170 systemd[1]: session-28.scope: Deactivated successfully. Mar 17 18:22:26.124826 systemd-logind[1906]: Session 28 logged out. Waiting for processes to exit. Mar 17 18:22:26.127043 systemd-logind[1906]: Removed session 28. Mar 17 18:22:26.119000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-172.31.21.220:22-139.178.89.65:60570 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:31.134000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-172.31.21.220:22-139.178.89.65:35810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:31.135959 systemd[1]: Started sshd@28-172.31.21.220:22-139.178.89.65:35810.service. Mar 17 18:22:31.138199 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:22:31.138257 kernel: audit: type=1130 audit(1742235751.134:615): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-172.31.21.220:22-139.178.89.65:35810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:22:31.313000 audit[6222]: USER_ACCT pid=6222 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.315474 sshd[6222]: Accepted publickey for core from 139.178.89.65 port 35810 ssh2: RSA SHA256:azelU3G0DadBCmAXuAehsKOCz630heU8UfFnUiqM6ac Mar 17 18:22:31.324000 audit[6222]: CRED_ACQ pid=6222 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.327291 sshd[6222]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:22:31.334719 kernel: audit: type=1101 audit(1742235751.313:616): pid=6222 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.334903 kernel: audit: type=1103 audit(1742235751.324:617): pid=6222 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.340838 kernel: audit: type=1006 audit(1742235751.325:618): pid=6222 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=29 res=1 Mar 17 18:22:31.325000 audit[6222]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffd147e620 a2=3 a3=1 items=0 ppid=1 pid=6222 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:31.347837 systemd[1]: Started session-29.scope. Mar 17 18:22:31.349415 systemd-logind[1906]: New session 29 of user core. 
Mar 17 18:22:31.353561 kernel: audit: type=1300 audit(1742235751.325:618): arch=c00000b7 syscall=64 success=yes exit=3 a0=5 a1=ffffd147e620 a2=3 a3=1 items=0 ppid=1 pid=6222 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:22:31.353705 kernel: audit: type=1327 audit(1742235751.325:618): proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:31.325000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:22:31.364000 audit[6222]: USER_START pid=6222 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.367000 audit[6225]: CRED_ACQ pid=6225 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.385083 kernel: audit: type=1105 audit(1742235751.364:619): pid=6222 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.385231 kernel: audit: type=1103 audit(1742235751.367:620): pid=6225 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.613144 sshd[6222]: pam_unix(sshd:session): session closed for user core Mar 17 18:22:31.613000 audit[6222]: USER_END pid=6222 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.617860 systemd[1]: sshd@28-172.31.21.220:22-139.178.89.65:35810.service: Deactivated successfully. Mar 17 18:22:31.619304 systemd[1]: session-29.scope: Deactivated successfully. Mar 17 18:22:31.613000 audit[6222]: CRED_DISP pid=6222 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.627484 kernel: audit: type=1106 audit(1742235751.613:621): pid=6222 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Mar 17 18:22:31.627325 systemd-logind[1906]: Session 29 logged out. Waiting for processes to exit. Mar 17 18:22:31.616000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-172.31.21.220:22-139.178.89.65:35810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:22:31.637788 systemd-logind[1906]: Removed session 29. 
Mar 17 18:22:31.638382 kernel: audit: type=1104 audit(1742235751.613:622): pid=6222 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'