Mar 17 19:39:27.892882 kernel: Linux version 5.15.179-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 11.3.1_p20221209 p3) 11.3.1 20221209, GNU ld (Gentoo 2.39 p5) 2.39.0) #1 SMP Mon Mar 17 17:12:34 -00 2025 Mar 17 19:39:27.892905 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=249ccd113f901380672c0d31e18f792e8e0344094c0e39eedc449f039418b31a Mar 17 19:39:27.892916 kernel: BIOS-provided physical RAM map: Mar 17 19:39:27.892926 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable Mar 17 19:39:27.892933 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved Mar 17 19:39:27.892940 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved Mar 17 19:39:27.892983 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdcfff] usable Mar 17 19:39:27.892991 kernel: BIOS-e820: [mem 0x00000000bffdd000-0x00000000bfffffff] reserved Mar 17 19:39:27.892998 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved Mar 17 19:39:27.893005 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved Mar 17 19:39:27.893013 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000013fffffff] usable Mar 17 19:39:27.893019 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved Mar 17 19:39:27.893029 kernel: NX (Execute Disable) protection: active Mar 17 19:39:27.893036 kernel: SMBIOS 3.0.0 present. Mar 17 19:39:27.893045 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.16.3-debian-1.16.3-2 04/01/2014 Mar 17 19:39:27.893053 kernel: Hypervisor detected: KVM Mar 17 19:39:27.893060 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Mar 17 19:39:27.893067 kernel: kvm-clock: cpu 0, msr 1719a001, primary cpu clock Mar 17 19:39:27.893077 kernel: kvm-clock: using sched offset of 4317423071 cycles Mar 17 19:39:27.893085 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Mar 17 19:39:27.893093 kernel: tsc: Detected 1996.249 MHz processor Mar 17 19:39:27.893101 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Mar 17 19:39:27.893109 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Mar 17 19:39:27.893117 kernel: last_pfn = 0x140000 max_arch_pfn = 0x400000000 Mar 17 19:39:27.893124 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Mar 17 19:39:27.893132 kernel: last_pfn = 0xbffdd max_arch_pfn = 0x400000000 Mar 17 19:39:27.893140 kernel: ACPI: Early table checksum verification disabled Mar 17 19:39:27.893149 kernel: ACPI: RSDP 0x00000000000F51E0 000014 (v00 BOCHS ) Mar 17 19:39:27.893157 kernel: ACPI: RSDT 0x00000000BFFE1B65 000030 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 19:39:27.893165 kernel: ACPI: FACP 0x00000000BFFE1A49 000074 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 19:39:27.893173 kernel: ACPI: DSDT 0x00000000BFFE0040 001A09 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 19:39:27.893180 kernel: ACPI: FACS 0x00000000BFFE0000 000040 Mar 17 19:39:27.893188 kernel: ACPI: APIC 0x00000000BFFE1ABD 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 19:39:27.893196 kernel: ACPI: WAET 0x00000000BFFE1B3D 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 19:39:27.893203 kernel: ACPI: Reserving FACP table memory at [mem 
0xbffe1a49-0xbffe1abc] Mar 17 19:39:27.893213 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffe0040-0xbffe1a48] Mar 17 19:39:27.893221 kernel: ACPI: Reserving FACS table memory at [mem 0xbffe0000-0xbffe003f] Mar 17 19:39:27.893229 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe1abd-0xbffe1b3c] Mar 17 19:39:27.893236 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1b3d-0xbffe1b64] Mar 17 19:39:27.893244 kernel: No NUMA configuration found Mar 17 19:39:27.893255 kernel: Faking a node at [mem 0x0000000000000000-0x000000013fffffff] Mar 17 19:39:27.893263 kernel: NODE_DATA(0) allocated [mem 0x13fff7000-0x13fffcfff] Mar 17 19:39:27.893273 kernel: Zone ranges: Mar 17 19:39:27.893281 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Mar 17 19:39:27.893289 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Mar 17 19:39:27.893297 kernel: Normal [mem 0x0000000100000000-0x000000013fffffff] Mar 17 19:39:27.893305 kernel: Movable zone start for each node Mar 17 19:39:27.893313 kernel: Early memory node ranges Mar 17 19:39:27.893321 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] Mar 17 19:39:27.893329 kernel: node 0: [mem 0x0000000000100000-0x00000000bffdcfff] Mar 17 19:39:27.893339 kernel: node 0: [mem 0x0000000100000000-0x000000013fffffff] Mar 17 19:39:27.893347 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000013fffffff] Mar 17 19:39:27.893355 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Mar 17 19:39:27.893363 kernel: On node 0, zone DMA: 97 pages in unavailable ranges Mar 17 19:39:27.893371 kernel: On node 0, zone Normal: 35 pages in unavailable ranges Mar 17 19:39:27.893379 kernel: ACPI: PM-Timer IO Port: 0x608 Mar 17 19:39:27.893387 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Mar 17 19:39:27.893395 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Mar 17 19:39:27.893403 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Mar 17 19:39:27.893413 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Mar 17 19:39:27.893421 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Mar 17 19:39:27.893429 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Mar 17 19:39:27.893437 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Mar 17 19:39:27.893445 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Mar 17 19:39:27.893453 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs Mar 17 19:39:27.893461 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices Mar 17 19:39:27.893469 kernel: Booting paravirtualized kernel on KVM Mar 17 19:39:27.893477 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Mar 17 19:39:27.893487 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:512 nr_cpu_ids:2 nr_node_ids:1 Mar 17 19:39:27.893495 kernel: percpu: Embedded 56 pages/cpu s188696 r8192 d32488 u1048576 Mar 17 19:39:27.893503 kernel: pcpu-alloc: s188696 r8192 d32488 u1048576 alloc=1*2097152 Mar 17 19:39:27.893511 kernel: pcpu-alloc: [0] 0 1 Mar 17 19:39:27.893519 kernel: kvm-guest: stealtime: cpu 0, msr 13bc1c0c0 Mar 17 19:39:27.893527 kernel: kvm-guest: PV spinlocks disabled, no host support Mar 17 19:39:27.893535 kernel: Built 1 zonelists, mobility grouping on. 
Total pages: 1031901 Mar 17 19:39:27.893543 kernel: Policy zone: Normal Mar 17 19:39:27.893552 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=249ccd113f901380672c0d31e18f792e8e0344094c0e39eedc449f039418b31a Mar 17 19:39:27.893562 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 17 19:39:27.893570 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 17 19:39:27.893578 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 17 19:39:27.893587 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 19:39:27.893595 kernel: Memory: 3968276K/4193772K available (12294K kernel code, 2278K rwdata, 13724K rodata, 47472K init, 4108K bss, 225236K reserved, 0K cma-reserved) Mar 17 19:39:27.893603 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 17 19:39:27.893612 kernel: ftrace: allocating 34580 entries in 136 pages Mar 17 19:39:27.893620 kernel: ftrace: allocated 136 pages with 2 groups Mar 17 19:39:27.893629 kernel: rcu: Hierarchical RCU implementation. Mar 17 19:39:27.893639 kernel: rcu: RCU event tracing is enabled. Mar 17 19:39:27.893647 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 17 19:39:27.893655 kernel: Rude variant of Tasks RCU enabled. Mar 17 19:39:27.893664 kernel: Tracing variant of Tasks RCU enabled. Mar 17 19:39:27.893672 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Mar 17 19:39:27.893680 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 17 19:39:27.893688 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 16 Mar 17 19:39:27.893696 kernel: Console: colour VGA+ 80x25 Mar 17 19:39:27.893706 kernel: printk: console [tty0] enabled Mar 17 19:39:27.893714 kernel: printk: console [ttyS0] enabled Mar 17 19:39:27.893722 kernel: ACPI: Core revision 20210730 Mar 17 19:39:27.893730 kernel: APIC: Switch to symmetric I/O mode setup Mar 17 19:39:27.893738 kernel: x2apic enabled Mar 17 19:39:27.893746 kernel: Switched APIC routing to physical x2apic. Mar 17 19:39:27.893755 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Mar 17 19:39:27.893763 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized Mar 17 19:39:27.893771 kernel: Calibrating delay loop (skipped) preset value.. 
3992.49 BogoMIPS (lpj=1996249) Mar 17 19:39:27.893781 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 Mar 17 19:39:27.893789 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0 Mar 17 19:39:27.893797 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Mar 17 19:39:27.893805 kernel: Spectre V2 : Mitigation: Retpolines Mar 17 19:39:27.893813 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Mar 17 19:39:27.893821 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT Mar 17 19:39:27.893830 kernel: Speculative Store Bypass: Vulnerable Mar 17 19:39:27.893838 kernel: x86/fpu: x87 FPU will use FXSAVE Mar 17 19:39:27.893846 kernel: Freeing SMP alternatives memory: 32K Mar 17 19:39:27.893855 kernel: pid_max: default: 32768 minimum: 301 Mar 17 19:39:27.893863 kernel: LSM: Security Framework initializing Mar 17 19:39:27.893871 kernel: SELinux: Initializing. Mar 17 19:39:27.893879 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 19:39:27.893887 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 19:39:27.893896 kernel: smpboot: CPU0: AMD Intel Core i7 9xx (Nehalem Class Core i7) (family: 0x6, model: 0x1a, stepping: 0x3) Mar 17 19:39:27.893910 kernel: Performance Events: AMD PMU driver. Mar 17 19:39:27.893920 kernel: ... version: 0 Mar 17 19:39:27.893928 kernel: ... bit width: 48 Mar 17 19:39:27.893936 kernel: ... generic registers: 4 Mar 17 19:39:27.894982 kernel: ... value mask: 0000ffffffffffff Mar 17 19:39:27.894998 kernel: ... max period: 00007fffffffffff Mar 17 19:39:27.895010 kernel: ... fixed-purpose events: 0 Mar 17 19:39:27.895019 kernel: ... event mask: 000000000000000f Mar 17 19:39:27.895027 kernel: signal: max sigframe size: 1440 Mar 17 19:39:27.895036 kernel: rcu: Hierarchical SRCU implementation. Mar 17 19:39:27.895044 kernel: smp: Bringing up secondary CPUs ... Mar 17 19:39:27.895054 kernel: x86: Booting SMP configuration: Mar 17 19:39:27.895062 kernel: .... 
node #0, CPUs: #1 Mar 17 19:39:27.895071 kernel: kvm-clock: cpu 1, msr 1719a041, secondary cpu clock Mar 17 19:39:27.895079 kernel: kvm-guest: stealtime: cpu 1, msr 13bd1c0c0 Mar 17 19:39:27.895088 kernel: smp: Brought up 1 node, 2 CPUs Mar 17 19:39:27.895096 kernel: smpboot: Max logical packages: 2 Mar 17 19:39:27.895105 kernel: smpboot: Total of 2 processors activated (7984.99 BogoMIPS) Mar 17 19:39:27.895113 kernel: devtmpfs: initialized Mar 17 19:39:27.895121 kernel: x86/mm: Memory block size: 128MB Mar 17 19:39:27.895131 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 19:39:27.895140 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 17 19:39:27.895149 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 19:39:27.895157 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 19:39:27.895165 kernel: audit: initializing netlink subsys (disabled) Mar 17 19:39:27.895174 kernel: audit: type=2000 audit(1742240366.639:1): state=initialized audit_enabled=0 res=1 Mar 17 19:39:27.895182 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 19:39:27.895191 kernel: thermal_sys: Registered thermal governor 'user_space' Mar 17 19:39:27.895199 kernel: cpuidle: using governor menu Mar 17 19:39:27.895210 kernel: ACPI: bus type PCI registered Mar 17 19:39:27.895219 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 19:39:27.895227 kernel: dca service started, version 1.12.1 Mar 17 19:39:27.895235 kernel: PCI: Using configuration type 1 for base access Mar 17 19:39:27.895244 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. Mar 17 19:39:27.895252 kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 19:39:27.895261 kernel: ACPI: Added _OSI(Module Device) Mar 17 19:39:27.895269 kernel: ACPI: Added _OSI(Processor Device) Mar 17 19:39:27.895277 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 19:39:27.895287 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 19:39:27.895295 kernel: ACPI: Added _OSI(Linux-Dell-Video) Mar 17 19:39:27.895304 kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio) Mar 17 19:39:27.895312 kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics) Mar 17 19:39:27.895321 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 19:39:27.895329 kernel: ACPI: Interpreter enabled Mar 17 19:39:27.895337 kernel: ACPI: PM: (supports S0 S3 S5) Mar 17 19:39:27.895345 kernel: ACPI: Using IOAPIC for interrupt routing Mar 17 19:39:27.895354 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Mar 17 19:39:27.895364 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F Mar 17 19:39:27.895372 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Mar 17 19:39:27.895513 kernel: acpi PNP0A03:00: _OSC: OS supports [ASPM ClockPM Segments MSI HPX-Type3] Mar 17 19:39:27.895606 kernel: acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. 
Mar 17 19:39:27.895620 kernel: acpiphp: Slot [3] registered Mar 17 19:39:27.895628 kernel: acpiphp: Slot [4] registered Mar 17 19:39:27.895637 kernel: acpiphp: Slot [5] registered Mar 17 19:39:27.895645 kernel: acpiphp: Slot [6] registered Mar 17 19:39:27.895658 kernel: acpiphp: Slot [7] registered Mar 17 19:39:27.895666 kernel: acpiphp: Slot [8] registered Mar 17 19:39:27.895674 kernel: acpiphp: Slot [9] registered Mar 17 19:39:27.895683 kernel: acpiphp: Slot [10] registered Mar 17 19:39:27.895691 kernel: acpiphp: Slot [11] registered Mar 17 19:39:27.895699 kernel: acpiphp: Slot [12] registered Mar 17 19:39:27.895708 kernel: acpiphp: Slot [13] registered Mar 17 19:39:27.895716 kernel: acpiphp: Slot [14] registered Mar 17 19:39:27.895724 kernel: acpiphp: Slot [15] registered Mar 17 19:39:27.895734 kernel: acpiphp: Slot [16] registered Mar 17 19:39:27.895743 kernel: acpiphp: Slot [17] registered Mar 17 19:39:27.895751 kernel: acpiphp: Slot [18] registered Mar 17 19:39:27.895759 kernel: acpiphp: Slot [19] registered Mar 17 19:39:27.895767 kernel: acpiphp: Slot [20] registered Mar 17 19:39:27.895775 kernel: acpiphp: Slot [21] registered Mar 17 19:39:27.895784 kernel: acpiphp: Slot [22] registered Mar 17 19:39:27.895792 kernel: acpiphp: Slot [23] registered Mar 17 19:39:27.895800 kernel: acpiphp: Slot [24] registered Mar 17 19:39:27.895809 kernel: acpiphp: Slot [25] registered Mar 17 19:39:27.895818 kernel: acpiphp: Slot [26] registered Mar 17 19:39:27.895827 kernel: acpiphp: Slot [27] registered Mar 17 19:39:27.895835 kernel: acpiphp: Slot [28] registered Mar 17 19:39:27.895843 kernel: acpiphp: Slot [29] registered Mar 17 19:39:27.895852 kernel: acpiphp: Slot [30] registered Mar 17 19:39:27.895860 kernel: acpiphp: Slot [31] registered Mar 17 19:39:27.895868 kernel: PCI host bridge to bus 0000:00 Mar 17 19:39:27.895978 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Mar 17 19:39:27.896067 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Mar 17 19:39:27.896146 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Mar 17 19:39:27.896225 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Mar 17 19:39:27.896301 kernel: pci_bus 0000:00: root bus resource [mem 0xc000000000-0xc07fffffff window] Mar 17 19:39:27.896378 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Mar 17 19:39:27.896471 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 Mar 17 19:39:27.896565 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 Mar 17 19:39:27.896660 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 Mar 17 19:39:27.896742 kernel: pci 0000:00:01.1: reg 0x20: [io 0xc120-0xc12f] Mar 17 19:39:27.896826 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x10: [io 0x01f0-0x01f7] Mar 17 19:39:27.896906 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x14: [io 0x03f6] Mar 17 19:39:27.899043 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x18: [io 0x0170-0x0177] Mar 17 19:39:27.899141 kernel: pci 0000:00:01.1: legacy IDE quirk: reg 0x1c: [io 0x0376] Mar 17 19:39:27.899244 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 Mar 17 19:39:27.899335 kernel: pci 0000:00:01.3: quirk: [io 0x0600-0x063f] claimed by PIIX4 ACPI Mar 17 19:39:27.899425 kernel: pci 0000:00:01.3: quirk: [io 0x0700-0x070f] claimed by PIIX4 SMB Mar 17 19:39:27.899522 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 Mar 17 19:39:27.899611 kernel: pci 0000:00:02.0: reg 0x10: [mem 0xfe000000-0xfe7fffff pref] Mar 17 
19:39:27.899704 kernel: pci 0000:00:02.0: reg 0x18: [mem 0xc000000000-0xc000003fff 64bit pref] Mar 17 19:39:27.899793 kernel: pci 0000:00:02.0: reg 0x20: [mem 0xfeb90000-0xfeb90fff] Mar 17 19:39:27.899886 kernel: pci 0000:00:02.0: reg 0x30: [mem 0xfeb80000-0xfeb8ffff pref] Mar 17 19:39:27.902084 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Mar 17 19:39:27.902196 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 Mar 17 19:39:27.902289 kernel: pci 0000:00:03.0: reg 0x10: [io 0xc080-0xc0bf] Mar 17 19:39:27.902379 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xfeb91000-0xfeb91fff] Mar 17 19:39:27.902480 kernel: pci 0000:00:03.0: reg 0x20: [mem 0xc000004000-0xc000007fff 64bit pref] Mar 17 19:39:27.902572 kernel: pci 0000:00:03.0: reg 0x30: [mem 0xfeb00000-0xfeb7ffff pref] Mar 17 19:39:27.902673 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 Mar 17 19:39:27.902763 kernel: pci 0000:00:04.0: reg 0x10: [io 0xc000-0xc07f] Mar 17 19:39:27.902852 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xfeb92000-0xfeb92fff] Mar 17 19:39:27.902941 kernel: pci 0000:00:04.0: reg 0x20: [mem 0xc000008000-0xc00000bfff 64bit pref] Mar 17 19:39:27.903073 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 Mar 17 19:39:27.903164 kernel: pci 0000:00:05.0: reg 0x10: [io 0xc0c0-0xc0ff] Mar 17 19:39:27.903254 kernel: pci 0000:00:05.0: reg 0x20: [mem 0xc00000c000-0xc00000ffff 64bit pref] Mar 17 19:39:27.903354 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 Mar 17 19:39:27.903445 kernel: pci 0000:00:06.0: reg 0x10: [io 0xc100-0xc11f] Mar 17 19:39:27.903534 kernel: pci 0000:00:06.0: reg 0x14: [mem 0xfeb93000-0xfeb93fff] Mar 17 19:39:27.903624 kernel: pci 0000:00:06.0: reg 0x20: [mem 0xc000010000-0xc000013fff 64bit pref] Mar 17 19:39:27.903637 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Mar 17 19:39:27.903646 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Mar 17 19:39:27.903655 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Mar 17 19:39:27.903667 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Mar 17 19:39:27.903678 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9 Mar 17 19:39:27.903687 kernel: iommu: Default domain type: Translated Mar 17 19:39:27.903696 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Mar 17 19:39:27.903783 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device Mar 17 19:39:27.903871 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Mar 17 19:39:27.904834 kernel: pci 0000:00:02.0: vgaarb: bridge control possible Mar 17 19:39:27.904850 kernel: vgaarb: loaded Mar 17 19:39:27.904859 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 17 19:39:27.904870 kernel: pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 17 19:39:27.904879 kernel: PTP clock support registered Mar 17 19:39:27.904887 kernel: PCI: Using ACPI for IRQ routing Mar 17 19:39:27.904895 kernel: PCI: pci_cache_line_size set to 64 bytes Mar 17 19:39:27.904903 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] Mar 17 19:39:27.904911 kernel: e820: reserve RAM buffer [mem 0xbffdd000-0xbfffffff] Mar 17 19:39:27.904919 kernel: clocksource: Switched to clocksource kvm-clock Mar 17 19:39:27.904928 kernel: VFS: Disk quotas dquot_6.6.0 Mar 17 19:39:27.904936 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 17 19:39:27.907968 kernel: pnp: PnP ACPI init Mar 17 19:39:27.908076 kernel: pnp 00:03: [dma 2] Mar 17 19:39:27.908091 kernel: pnp: PnP ACPI: found 5 devices Mar 17 19:39:27.908101 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 17 19:39:27.908110 kernel: NET: Registered PF_INET protocol family Mar 17 19:39:27.908119 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 17 19:39:27.908128 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 17 19:39:27.908137 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 17 19:39:27.908149 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 17 19:39:27.908158 kernel: TCP bind hash table entries: 32768 (order: 7, 524288 bytes, linear) Mar 17 19:39:27.908166 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 17 19:39:27.908175 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 19:39:27.908184 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 19:39:27.908192 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 17 19:39:27.908201 kernel: NET: Registered PF_XDP protocol family Mar 17 19:39:27.908283 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 17 19:39:27.908363 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 17 19:39:27.908446 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 17 19:39:27.908518 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window] Mar 17 19:39:27.908589 kernel: pci_bus 0000:00: resource 8 [mem 0xc000000000-0xc07fffffff window] Mar 17 19:39:27.908676 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release Mar 17 19:39:27.908762 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers Mar 17 19:39:27.908848 kernel: pci 0000:00:01.0: Activating ISA DMA hang workarounds Mar 17 19:39:27.908860 kernel: PCI: CLS 0 bytes, default 64 Mar 17 19:39:27.908869 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 17 19:39:27.908880 kernel: software IO TLB: mapped [mem 0x00000000bbfdd000-0x00000000bffdd000] (64MB) Mar 17 19:39:27.908888 kernel: Initialise system trusted keyrings Mar 17 19:39:27.908897 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 17 19:39:27.908905 kernel: Key type asymmetric registered Mar 17 19:39:27.908913 kernel: Asymmetric key parser 'x509' registered Mar 17 19:39:27.908921 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Mar 17 19:39:27.908929 kernel: io scheduler mq-deadline registered Mar 17 19:39:27.908937 kernel: io scheduler kyber registered Mar 17 19:39:27.908959 kernel: io scheduler bfq registered Mar 17 19:39:27.908969 kernel: ioatdma: Intel(R) QuickData Technology Driver 
5.00 Mar 17 19:39:27.908978 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10 Mar 17 19:39:27.908986 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11 Mar 17 19:39:27.908994 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11 Mar 17 19:39:27.909002 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10 Mar 17 19:39:27.909011 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 19:39:27.909019 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 17 19:39:27.909027 kernel: random: crng init done Mar 17 19:39:27.909035 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 17 19:39:27.909045 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 17 19:39:27.909053 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 17 19:39:27.909061 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 17 19:39:27.909148 kernel: rtc_cmos 00:04: RTC can wake from S4 Mar 17 19:39:27.909227 kernel: rtc_cmos 00:04: registered as rtc0 Mar 17 19:39:27.909307 kernel: rtc_cmos 00:04: setting system clock to 2025-03-17T19:39:27 UTC (1742240367) Mar 17 19:39:27.909383 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram Mar 17 19:39:27.909394 kernel: NET: Registered PF_INET6 protocol family Mar 17 19:39:27.909405 kernel: Segment Routing with IPv6 Mar 17 19:39:27.909413 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 19:39:27.909422 kernel: NET: Registered PF_PACKET protocol family Mar 17 19:39:27.909430 kernel: Key type dns_resolver registered Mar 17 19:39:27.909438 kernel: IPI shorthand broadcast: enabled Mar 17 19:39:27.909446 kernel: sched_clock: Marking stable (847054445, 158471712)->(1109159950, -103633793) Mar 17 19:39:27.909454 kernel: registered taskstats version 1 Mar 17 19:39:27.909462 kernel: Loading compiled-in X.509 certificates Mar 17 19:39:27.909471 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 5.15.179-flatcar: d5b956bbabb2d386c0246a969032c0de9eaa8220' Mar 17 19:39:27.909481 kernel: Key type .fscrypt registered Mar 17 19:39:27.909489 kernel: Key type fscrypt-provisioning registered Mar 17 19:39:27.909497 kernel: ima: No TPM chip found, activating TPM-bypass! Mar 17 19:39:27.909505 kernel: ima: Allocated hash algorithm: sha1 Mar 17 19:39:27.909513 kernel: ima: No architecture policies found Mar 17 19:39:27.909521 kernel: clk: Disabling unused clocks Mar 17 19:39:27.909529 kernel: Freeing unused kernel image (initmem) memory: 47472K Mar 17 19:39:27.909538 kernel: Write protecting the kernel read-only data: 28672k Mar 17 19:39:27.909547 kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K Mar 17 19:39:27.909556 kernel: Freeing unused kernel image (rodata/data gap) memory: 612K Mar 17 19:39:27.909564 kernel: Run /init as init process Mar 17 19:39:27.909572 kernel: with arguments: Mar 17 19:39:27.909580 kernel: /init Mar 17 19:39:27.909588 kernel: with environment: Mar 17 19:39:27.909595 kernel: HOME=/ Mar 17 19:39:27.909603 kernel: TERM=linux Mar 17 19:39:27.909611 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 19:39:27.909623 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Mar 17 19:39:27.909635 systemd[1]: Detected virtualization kvm. 
Mar 17 19:39:27.909644 systemd[1]: Detected architecture x86-64. Mar 17 19:39:27.909653 systemd[1]: Running in initrd. Mar 17 19:39:27.909661 systemd[1]: No hostname configured, using default hostname. Mar 17 19:39:27.909670 systemd[1]: Hostname set to . Mar 17 19:39:27.909679 systemd[1]: Initializing machine ID from VM UUID. Mar 17 19:39:27.909689 systemd[1]: Queued start job for default target initrd.target. Mar 17 19:39:27.909698 systemd[1]: Started systemd-ask-password-console.path. Mar 17 19:39:27.909707 systemd[1]: Reached target cryptsetup.target. Mar 17 19:39:27.909715 systemd[1]: Reached target paths.target. Mar 17 19:39:27.909724 systemd[1]: Reached target slices.target. Mar 17 19:39:27.909732 systemd[1]: Reached target swap.target. Mar 17 19:39:27.909741 systemd[1]: Reached target timers.target. Mar 17 19:39:27.909750 systemd[1]: Listening on iscsid.socket. Mar 17 19:39:27.909761 systemd[1]: Listening on iscsiuio.socket. Mar 17 19:39:27.909776 systemd[1]: Listening on systemd-journald-audit.socket. Mar 17 19:39:27.909787 systemd[1]: Listening on systemd-journald-dev-log.socket. Mar 17 19:39:27.909796 systemd[1]: Listening on systemd-journald.socket. Mar 17 19:39:27.909805 systemd[1]: Listening on systemd-networkd.socket. Mar 17 19:39:27.909814 systemd[1]: Listening on systemd-udevd-control.socket. Mar 17 19:39:27.909824 systemd[1]: Listening on systemd-udevd-kernel.socket. Mar 17 19:39:27.909833 systemd[1]: Reached target sockets.target. Mar 17 19:39:27.909842 systemd[1]: Starting kmod-static-nodes.service... Mar 17 19:39:27.909851 systemd[1]: Finished network-cleanup.service. Mar 17 19:39:27.909860 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 19:39:27.909869 systemd[1]: Starting systemd-journald.service... Mar 17 19:39:27.909878 systemd[1]: Starting systemd-modules-load.service... Mar 17 19:39:27.909887 systemd[1]: Starting systemd-resolved.service... Mar 17 19:39:27.909895 systemd[1]: Starting systemd-vconsole-setup.service... Mar 17 19:39:27.909906 systemd[1]: Finished kmod-static-nodes.service. Mar 17 19:39:27.909918 systemd-journald[186]: Journal started Mar 17 19:39:27.909976 systemd-journald[186]: Runtime Journal (/run/log/journal/da037cdc4bea43f7b2105935d8de093c) is 8.0M, max 78.4M, 70.4M free. Mar 17 19:39:27.894320 systemd-modules-load[187]: Inserted module 'overlay' Mar 17 19:39:27.898172 systemd-resolved[188]: Positive Trust Anchors: Mar 17 19:39:27.927754 systemd[1]: Started systemd-resolved.service. Mar 17 19:39:27.927775 kernel: audit: type=1130 audit(1742240367.918:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.918000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.898183 systemd-resolved[188]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 19:39:27.898221 systemd-resolved[188]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Mar 17 19:39:27.901012 systemd-resolved[188]: Defaulting to hostname 'linux'. Mar 17 19:39:27.940489 systemd[1]: Started systemd-journald.service. Mar 17 19:39:27.940517 kernel: audit: type=1130 audit(1742240367.931:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.931000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.940512 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 19:39:27.954728 kernel: audit: type=1130 audit(1742240367.934:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.954746 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 17 19:39:27.954758 kernel: audit: type=1130 audit(1742240367.949:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.934000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.949000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.949673 systemd[1]: Finished systemd-vconsole-setup.service. Mar 17 19:39:27.960501 kernel: audit: type=1130 audit(1742240367.954:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.954000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.955301 systemd[1]: Reached target nss-lookup.target. Mar 17 19:39:27.961771 systemd[1]: Starting dracut-cmdline-ask.service... Mar 17 19:39:27.966900 kernel: Bridge firewalling registered Mar 17 19:39:27.962465 systemd-modules-load[187]: Inserted module 'br_netfilter' Mar 17 19:39:27.965308 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Mar 17 19:39:27.973000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:39:27.973989 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Mar 17 19:39:27.981334 kernel: audit: type=1130 audit(1742240367.973:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:27.990980 kernel: SCSI subsystem initialized Mar 17 19:39:27.993643 systemd[1]: Finished dracut-cmdline-ask.service. Mar 17 19:39:27.995887 systemd[1]: Starting dracut-cmdline.service... Mar 17 19:39:27.994000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.003966 kernel: audit: type=1130 audit(1742240367.994:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.004067 dracut-cmdline[203]: dracut-dracut-053 Mar 17 19:39:28.005810 dracut-cmdline[203]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200n8 console=tty0 flatcar.first_boot=detected flatcar.oem.id=openstack verity.usrhash=249ccd113f901380672c0d31e18f792e8e0344094c0e39eedc449f039418b31a Mar 17 19:39:28.018644 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 17 19:39:28.018671 kernel: device-mapper: uevent: version 1.0.3 Mar 17 19:39:28.021051 kernel: device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com Mar 17 19:39:28.024276 systemd-modules-load[187]: Inserted module 'dm_multipath' Mar 17 19:39:28.026000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.025680 systemd[1]: Finished systemd-modules-load.service. Mar 17 19:39:28.033095 kernel: audit: type=1130 audit(1742240368.026:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.026903 systemd[1]: Starting systemd-sysctl.service... Mar 17 19:39:28.038186 systemd[1]: Finished systemd-sysctl.service. Mar 17 19:39:28.037000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.044983 kernel: audit: type=1130 audit(1742240368.037:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.066994 kernel: Loading iSCSI transport class v2.0-870. Mar 17 19:39:28.087976 kernel: iscsi: registered transport (tcp) Mar 17 19:39:28.114926 kernel: iscsi: registered transport (qla4xxx) Mar 17 19:39:28.115008 kernel: QLogic iSCSI HBA Driver Mar 17 19:39:28.162255 systemd[1]: Finished dracut-cmdline.service. Mar 17 19:39:28.163707 systemd[1]: Starting dracut-pre-udev.service... 
Mar 17 19:39:28.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.244056 kernel: raid6: sse2x4 gen() 6692 MB/s Mar 17 19:39:28.262048 kernel: raid6: sse2x4 xor() 4930 MB/s Mar 17 19:39:28.280049 kernel: raid6: sse2x2 gen() 13478 MB/s Mar 17 19:39:28.298051 kernel: raid6: sse2x2 xor() 8282 MB/s Mar 17 19:39:28.316048 kernel: raid6: sse2x1 gen() 10668 MB/s Mar 17 19:39:28.338403 kernel: raid6: sse2x1 xor() 6661 MB/s Mar 17 19:39:28.338493 kernel: raid6: using algorithm sse2x2 gen() 13478 MB/s Mar 17 19:39:28.338522 kernel: raid6: .... xor() 8282 MB/s, rmw enabled Mar 17 19:39:28.339668 kernel: raid6: using ssse3x2 recovery algorithm Mar 17 19:39:28.356798 kernel: xor: measuring software checksum speed Mar 17 19:39:28.356860 kernel: prefetch64-sse : 17033 MB/sec Mar 17 19:39:28.358061 kernel: generic_sse : 16760 MB/sec Mar 17 19:39:28.358108 kernel: xor: using function: prefetch64-sse (17033 MB/sec) Mar 17 19:39:28.477006 kernel: Btrfs loaded, crc32c=crc32c-intel, zoned=no, fsverity=no Mar 17 19:39:28.493841 systemd[1]: Finished dracut-pre-udev.service. Mar 17 19:39:28.494000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.495000 audit: BPF prog-id=7 op=LOAD Mar 17 19:39:28.495000 audit: BPF prog-id=8 op=LOAD Mar 17 19:39:28.497180 systemd[1]: Starting systemd-udevd.service... Mar 17 19:39:28.511127 systemd-udevd[385]: Using default interface naming scheme 'v252'. Mar 17 19:39:28.515829 systemd[1]: Started systemd-udevd.service. Mar 17 19:39:28.521000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.524125 systemd[1]: Starting dracut-pre-trigger.service... Mar 17 19:39:28.544365 dracut-pre-trigger[406]: rd.md=0: removing MD RAID activation Mar 17 19:39:28.600231 systemd[1]: Finished dracut-pre-trigger.service. Mar 17 19:39:28.599000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.601883 systemd[1]: Starting systemd-udev-trigger.service... Mar 17 19:39:28.644000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:28.644219 systemd[1]: Finished systemd-udev-trigger.service. Mar 17 19:39:28.708179 kernel: virtio_blk virtio2: [vda] 20971520 512-byte logical blocks (10.7 GB/10.0 GiB) Mar 17 19:39:28.723652 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 17 19:39:28.723670 kernel: GPT:17805311 != 20971519 Mar 17 19:39:28.723682 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 17 19:39:28.723693 kernel: GPT:17805311 != 20971519 Mar 17 19:39:28.723703 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 17 19:39:28.723718 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 19:39:28.751915 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. 
Mar 17 19:39:28.817996 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 scanned by (udev-worker) (452) Mar 17 19:39:28.818025 kernel: libata version 3.00 loaded. Mar 17 19:39:28.818037 kernel: ata_piix 0000:00:01.1: version 2.13 Mar 17 19:39:28.818226 kernel: scsi host0: ata_piix Mar 17 19:39:28.818353 kernel: scsi host1: ata_piix Mar 17 19:39:28.818454 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc120 irq 14 Mar 17 19:39:28.818477 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc128 irq 15 Mar 17 19:39:28.821456 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device. Mar 17 19:39:28.822023 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device. Mar 17 19:39:28.832460 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device. Mar 17 19:39:28.836584 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Mar 17 19:39:28.839200 systemd[1]: Starting disk-uuid.service... Mar 17 19:39:28.883865 disk-uuid[471]: Primary Header is updated. Mar 17 19:39:28.883865 disk-uuid[471]: Secondary Entries is updated. Mar 17 19:39:28.883865 disk-uuid[471]: Secondary Header is updated. Mar 17 19:39:28.900999 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 19:39:28.916001 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 19:39:29.930009 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 19:39:29.930108 disk-uuid[472]: The operation has completed successfully. Mar 17 19:39:30.009171 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 17 19:39:30.010420 systemd[1]: Finished disk-uuid.service. Mar 17 19:39:30.011000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.011000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.023462 systemd[1]: Starting verity-setup.service... Mar 17 19:39:30.062018 kernel: device-mapper: verity: sha256 using implementation "sha256-ssse3" Mar 17 19:39:30.164930 systemd[1]: Found device dev-mapper-usr.device. Mar 17 19:39:30.167509 systemd[1]: Mounting sysusr-usr.mount... Mar 17 19:39:30.173264 systemd[1]: Finished verity-setup.service. Mar 17 19:39:30.173000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.355003 kernel: EXT4-fs (dm-0): mounted filesystem without journal. Opts: norecovery. Quota mode: none. Mar 17 19:39:30.354881 systemd[1]: Mounted sysusr-usr.mount. Mar 17 19:39:30.355541 systemd[1]: afterburn-network-kargs.service was skipped because no trigger condition checks were met. Mar 17 19:39:30.356254 systemd[1]: Starting ignition-setup.service... Mar 17 19:39:30.359582 systemd[1]: Starting parse-ip-for-networkd.service... Mar 17 19:39:30.405921 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 19:39:30.405987 kernel: BTRFS info (device vda6): using free space tree Mar 17 19:39:30.406002 kernel: BTRFS info (device vda6): has skinny extents Mar 17 19:39:30.438255 systemd[1]: mnt-oem.mount: Deactivated successfully. 
Mar 17 19:39:30.481000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.482164 systemd[1]: Finished parse-ip-for-networkd.service. Mar 17 19:39:30.482000 audit: BPF prog-id=9 op=LOAD Mar 17 19:39:30.484305 systemd[1]: Starting systemd-networkd.service... Mar 17 19:39:30.497000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.496808 systemd[1]: Finished ignition-setup.service. Mar 17 19:39:30.501794 systemd[1]: Starting ignition-fetch-offline.service... Mar 17 19:39:30.508035 systemd-networkd[640]: lo: Link UP Mar 17 19:39:30.508040 systemd-networkd[640]: lo: Gained carrier Mar 17 19:39:30.515000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.508486 systemd-networkd[640]: Enumeration completed Mar 17 19:39:30.508721 systemd-networkd[640]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 19:39:30.510040 systemd-networkd[640]: eth0: Link UP Mar 17 19:39:30.510044 systemd-networkd[640]: eth0: Gained carrier Mar 17 19:39:30.511509 systemd[1]: Started systemd-networkd.service. Mar 17 19:39:30.516574 systemd[1]: Reached target network.target. Mar 17 19:39:30.521731 systemd[1]: Starting iscsiuio.service... Mar 17 19:39:30.523040 systemd-networkd[640]: eth0: DHCPv4 address 172.24.4.218/24, gateway 172.24.4.1 acquired from 172.24.4.1 Mar 17 19:39:30.533000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.534077 systemd[1]: Started iscsiuio.service. Mar 17 19:39:30.535343 systemd[1]: Starting iscsid.service... Mar 17 19:39:30.539755 iscsid[647]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Mar 17 19:39:30.539755 iscsid[647]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.[:identifier]. Mar 17 19:39:30.539755 iscsid[647]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Mar 17 19:39:30.539755 iscsid[647]: If using hardware iscsi like qla4xxx this message can be ignored. Mar 17 19:39:30.539755 iscsid[647]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Mar 17 19:39:30.543000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.546782 iscsid[647]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Mar 17 19:39:30.542193 systemd[1]: Started iscsid.service. Mar 17 19:39:30.545229 systemd[1]: Starting dracut-initqueue.service... 
Mar 17 19:39:30.560000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.560943 systemd[1]: Finished dracut-initqueue.service. Mar 17 19:39:30.561529 systemd[1]: Reached target remote-fs-pre.target. Mar 17 19:39:30.561991 systemd[1]: Reached target remote-cryptsetup.target. Mar 17 19:39:30.562433 systemd[1]: Reached target remote-fs.target. Mar 17 19:39:30.563584 systemd[1]: Starting dracut-pre-mount.service... Mar 17 19:39:30.572000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.572923 systemd[1]: Finished dracut-pre-mount.service. Mar 17 19:39:30.846885 ignition[642]: Ignition 2.14.0 Mar 17 19:39:30.846908 ignition[642]: Stage: fetch-offline Mar 17 19:39:30.847116 ignition[642]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 19:39:30.847168 ignition[642]: parsing config with SHA512: ce918cf8568bff1426dda9ea05b778568a1626fcf4c1bded9ebe13fee104bc1b92fac5f7093a3bfc7d99777c3793d01249c863845c2ca48413d9477d40af178a Mar 17 19:39:30.849441 ignition[642]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 17 19:39:30.850366 ignition[642]: parsed url from cmdline: "" Mar 17 19:39:30.852661 systemd[1]: Finished ignition-fetch-offline.service. Mar 17 19:39:30.853000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:30.850376 ignition[642]: no config URL provided Mar 17 19:39:30.850389 ignition[642]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 19:39:30.856748 systemd[1]: Starting ignition-fetch.service... Mar 17 19:39:30.850407 ignition[642]: no config at "/usr/lib/ignition/user.ign" Mar 17 19:39:30.850417 ignition[642]: failed to fetch config: resource requires networking Mar 17 19:39:30.850891 ignition[642]: Ignition finished successfully Mar 17 19:39:30.877555 ignition[665]: Ignition 2.14.0 Mar 17 19:39:30.877583 ignition[665]: Stage: fetch Mar 17 19:39:30.877835 ignition[665]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 19:39:30.877880 ignition[665]: parsing config with SHA512: ce918cf8568bff1426dda9ea05b778568a1626fcf4c1bded9ebe13fee104bc1b92fac5f7093a3bfc7d99777c3793d01249c863845c2ca48413d9477d40af178a Mar 17 19:39:30.880144 ignition[665]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 17 19:39:30.880358 ignition[665]: parsed url from cmdline: "" Mar 17 19:39:30.880368 ignition[665]: no config URL provided Mar 17 19:39:30.880383 ignition[665]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 19:39:30.880403 ignition[665]: no config at "/usr/lib/ignition/user.ign" Mar 17 19:39:30.886515 ignition[665]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Mar 17 19:39:30.886590 ignition[665]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... 
Mar 17 19:39:30.886761 ignition[665]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Mar 17 19:39:31.143885 ignition[665]: GET result: OK Mar 17 19:39:31.144147 ignition[665]: parsing config with SHA512: abae912ad6dc9aada699d904ce72b06c58317b2f177544a4b8b801ee399e537791e6311e6894acfc72182ce83c6c777f1dd3314256f782db75a4d1319c483f48 Mar 17 19:39:31.161569 unknown[665]: fetched base config from "system" Mar 17 19:39:31.163090 unknown[665]: fetched base config from "system" Mar 17 19:39:31.164457 unknown[665]: fetched user config from "openstack" Mar 17 19:39:31.167101 ignition[665]: fetch: fetch complete Mar 17 19:39:31.168122 ignition[665]: fetch: fetch passed Mar 17 19:39:31.177000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:31.170852 systemd[1]: Finished ignition-fetch.service. Mar 17 19:39:31.168233 ignition[665]: Ignition finished successfully Mar 17 19:39:31.180902 systemd[1]: Starting ignition-kargs.service... Mar 17 19:39:31.202074 ignition[671]: Ignition 2.14.0 Mar 17 19:39:31.202103 ignition[671]: Stage: kargs Mar 17 19:39:31.202392 ignition[671]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 19:39:31.202439 ignition[671]: parsing config with SHA512: ce918cf8568bff1426dda9ea05b778568a1626fcf4c1bded9ebe13fee104bc1b92fac5f7093a3bfc7d99777c3793d01249c863845c2ca48413d9477d40af178a Mar 17 19:39:31.204718 ignition[671]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 17 19:39:31.207526 ignition[671]: kargs: kargs passed Mar 17 19:39:31.210000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:31.209359 systemd[1]: Finished ignition-kargs.service. Mar 17 19:39:31.207620 ignition[671]: Ignition finished successfully Mar 17 19:39:31.212818 systemd[1]: Starting ignition-disks.service... Mar 17 19:39:31.230162 ignition[676]: Ignition 2.14.0 Mar 17 19:39:31.230189 ignition[676]: Stage: disks Mar 17 19:39:31.230428 ignition[676]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 19:39:31.230495 ignition[676]: parsing config with SHA512: ce918cf8568bff1426dda9ea05b778568a1626fcf4c1bded9ebe13fee104bc1b92fac5f7093a3bfc7d99777c3793d01249c863845c2ca48413d9477d40af178a Mar 17 19:39:31.232814 ignition[676]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 17 19:39:31.235709 ignition[676]: disks: disks passed Mar 17 19:39:31.238000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:31.237767 systemd[1]: Finished ignition-disks.service. Mar 17 19:39:31.235797 ignition[676]: Ignition finished successfully Mar 17 19:39:31.239254 systemd[1]: Reached target initrd-root-device.target. Mar 17 19:39:31.241467 systemd[1]: Reached target local-fs-pre.target. Mar 17 19:39:31.243909 systemd[1]: Reached target local-fs.target. Mar 17 19:39:31.246299 systemd[1]: Reached target sysinit.target. Mar 17 19:39:31.248732 systemd[1]: Reached target basic.target. Mar 17 19:39:31.253037 systemd[1]: Starting systemd-fsck-root.service... 
Mar 17 19:39:31.289302 systemd-fsck[683]: ROOT: clean, 623/1628000 files, 124059/1617920 blocks Mar 17 19:39:31.304504 systemd[1]: Finished systemd-fsck-root.service. Mar 17 19:39:31.304000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:31.307580 systemd[1]: Mounting sysroot.mount... Mar 17 19:39:31.330012 kernel: EXT4-fs (vda9): mounted filesystem with ordered data mode. Opts: (null). Quota mode: none. Mar 17 19:39:31.330811 systemd[1]: Mounted sysroot.mount. Mar 17 19:39:31.333327 systemd[1]: Reached target initrd-root-fs.target. Mar 17 19:39:31.337052 systemd[1]: Mounting sysroot-usr.mount... Mar 17 19:39:31.339097 systemd[1]: flatcar-metadata-hostname.service was skipped because no trigger condition checks were met. Mar 17 19:39:31.340529 systemd[1]: Starting flatcar-openstack-hostname.service... Mar 17 19:39:31.342415 systemd[1]: ignition-remount-sysroot.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 17 19:39:31.342503 systemd[1]: Reached target ignition-diskful.target. Mar 17 19:39:31.353307 systemd[1]: Mounted sysroot-usr.mount. Mar 17 19:39:31.363229 systemd[1]: Mounting sysroot-usr-share-oem.mount... Mar 17 19:39:31.373387 systemd[1]: Starting initrd-setup-root.service... Mar 17 19:39:31.385930 initrd-setup-root[695]: cut: /sysroot/etc/passwd: No such file or directory Mar 17 19:39:31.403338 initrd-setup-root[703]: cut: /sysroot/etc/group: No such file or directory Mar 17 19:39:31.408973 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 scanned by mount (690) Mar 17 19:39:31.411082 initrd-setup-root[711]: cut: /sysroot/etc/shadow: No such file or directory Mar 17 19:39:31.422295 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 19:39:31.422331 kernel: BTRFS info (device vda6): using free space tree Mar 17 19:39:31.422345 kernel: BTRFS info (device vda6): has skinny extents Mar 17 19:39:31.422716 initrd-setup-root[719]: cut: /sysroot/etc/gshadow: No such file or directory Mar 17 19:39:31.442653 systemd[1]: Mounted sysroot-usr-share-oem.mount. Mar 17 19:39:31.490000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:31.490339 systemd[1]: Finished initrd-setup-root.service. Mar 17 19:39:31.491871 systemd[1]: Starting ignition-mount.service... Mar 17 19:39:31.494474 systemd[1]: Starting sysroot-boot.service... Mar 17 19:39:31.501615 systemd[1]: sysusr-usr-share-oem.mount: Deactivated successfully. Mar 17 19:39:31.503279 systemd[1]: sysroot-usr-share-oem.mount: Deactivated successfully. 
Mar 17 19:39:31.516705 ignition[757]: INFO : Ignition 2.14.0 Mar 17 19:39:31.518273 ignition[757]: INFO : Stage: mount Mar 17 19:39:31.518986 ignition[757]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 19:39:31.519784 ignition[757]: DEBUG : parsing config with SHA512: ce918cf8568bff1426dda9ea05b778568a1626fcf4c1bded9ebe13fee104bc1b92fac5f7093a3bfc7d99777c3793d01249c863845c2ca48413d9477d40af178a Mar 17 19:39:31.521884 ignition[757]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 17 19:39:31.523819 ignition[757]: INFO : mount: mount passed Mar 17 19:39:31.524462 ignition[757]: INFO : Ignition finished successfully Mar 17 19:39:31.528000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:31.528681 systemd[1]: Finished ignition-mount.service. Mar 17 19:39:31.540000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:31.540561 systemd[1]: Finished sysroot-boot.service. Mar 17 19:39:31.567429 coreos-metadata[689]: Mar 17 19:39:31.567 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Mar 17 19:39:31.587233 coreos-metadata[689]: Mar 17 19:39:31.587 INFO Fetch successful Mar 17 19:39:31.587969 coreos-metadata[689]: Mar 17 19:39:31.587 INFO wrote hostname ci-3510-3-7-8-c8b8528301.novalocal to /sysroot/etc/hostname Mar 17 19:39:31.594281 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Mar 17 19:39:31.594527 systemd[1]: Finished flatcar-openstack-hostname.service. Mar 17 19:39:31.595000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:31.595000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:31.598580 systemd[1]: Starting ignition-files.service... Mar 17 19:39:31.612635 systemd[1]: Mounting sysroot-usr-share-oem.mount... Mar 17 19:39:31.628018 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/vda6 scanned by mount (766) Mar 17 19:39:31.638001 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 19:39:31.638059 kernel: BTRFS info (device vda6): using free space tree Mar 17 19:39:31.638093 kernel: BTRFS info (device vda6): has skinny extents Mar 17 19:39:31.651341 systemd[1]: Mounted sysroot-usr-share-oem.mount. 
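Note: the flatcar-openstack-hostname step above fetches the instance hostname from the EC2-style metadata endpoint and writes it into the not-yet-switched-to root at /sysroot/etc/hostname. A minimal sketch of the same two operations, assuming standard-library Python (coreos-metadata is the real implementation; this is only an illustration):

    import urllib.request

    # Endpoint and target path taken from the coreos-metadata[689] entries above.
    HOSTNAME_URL = "http://169.254.169.254/latest/meta-data/hostname"

    with urllib.request.urlopen(HOSTNAME_URL, timeout=5) as resp:
        hostname = resp.read().decode().strip()

    with open("/sysroot/etc/hostname", "w") as f:
        f.write(hostname + "\n")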
Mar 17 19:39:31.670447 ignition[785]: INFO : Ignition 2.14.0 Mar 17 19:39:31.670447 ignition[785]: INFO : Stage: files Mar 17 19:39:31.673534 ignition[785]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 19:39:31.673534 ignition[785]: DEBUG : parsing config with SHA512: ce918cf8568bff1426dda9ea05b778568a1626fcf4c1bded9ebe13fee104bc1b92fac5f7093a3bfc7d99777c3793d01249c863845c2ca48413d9477d40af178a Mar 17 19:39:31.673534 ignition[785]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 17 19:39:31.679812 ignition[785]: DEBUG : files: compiled without relabeling support, skipping Mar 17 19:39:31.681639 ignition[785]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 17 19:39:31.681639 ignition[785]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 17 19:39:31.685828 ignition[785]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 17 19:39:31.687768 ignition[785]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 17 19:39:31.690357 ignition[785]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 17 19:39:31.689554 unknown[785]: wrote ssh authorized keys file for user: core Mar 17 19:39:31.695851 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1" Mar 17 19:39:31.695851 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1" Mar 17 19:39:31.695851 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 17 19:39:31.695851 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Mar 17 19:39:31.759441 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK Mar 17 19:39:32.089332 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 17 19:39:32.089332 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file 
"/sysroot/etc/flatcar/update.conf" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 19:39:32.094166 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Mar 17 19:39:32.275277 systemd-networkd[640]: eth0: Gained IPv6LL Mar 17 19:39:32.643482 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK Mar 17 19:39:34.334809 ignition[785]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 19:39:34.336257 ignition[785]: INFO : files: op(c): [started] processing unit "coreos-metadata-sshkeys@.service" Mar 17 19:39:34.337037 ignition[785]: INFO : files: op(c): [finished] processing unit "coreos-metadata-sshkeys@.service" Mar 17 19:39:34.337796 ignition[785]: INFO : files: op(d): [started] processing unit "containerd.service" Mar 17 19:39:34.339555 ignition[785]: INFO : files: op(d): op(e): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Mar 17 19:39:34.340618 ignition[785]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Mar 17 19:39:34.340618 ignition[785]: INFO : files: op(d): [finished] processing unit "containerd.service" Mar 17 19:39:34.340618 ignition[785]: INFO : files: op(f): [started] processing unit "prepare-helm.service" Mar 17 19:39:34.340618 ignition[785]: INFO : files: op(f): op(10): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 19:39:34.344138 ignition[785]: INFO : files: op(f): op(10): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 19:39:34.344138 ignition[785]: INFO : files: op(f): [finished] processing unit "prepare-helm.service" Mar 17 19:39:34.344138 ignition[785]: INFO : files: op(11): [started] setting preset to enabled for "coreos-metadata-sshkeys@.service " Mar 17 19:39:34.344138 ignition[785]: INFO : files: op(11): [finished] setting preset to enabled for "coreos-metadata-sshkeys@.service " Mar 17 19:39:34.344138 ignition[785]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Mar 17 19:39:34.344138 ignition[785]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Mar 17 19:39:34.355153 ignition[785]: INFO : files: createResultFile: createFiles: op(13): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 17 19:39:34.356002 ignition[785]: INFO : files: createResultFile: createFiles: 
op(13): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 17 19:39:34.356002 ignition[785]: INFO : files: files passed Mar 17 19:39:34.356002 ignition[785]: INFO : Ignition finished successfully Mar 17 19:39:34.357651 systemd[1]: Finished ignition-files.service. Mar 17 19:39:34.370060 kernel: kauditd_printk_skb: 28 callbacks suppressed Mar 17 19:39:34.370082 kernel: audit: type=1130 audit(1742240374.360:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.360000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.365281 systemd[1]: Starting initrd-setup-root-after-ignition.service... Mar 17 19:39:34.367913 systemd[1]: torcx-profile-populate.service was skipped because of an unmet condition check (ConditionPathExists=/sysroot/etc/torcx/next-profile). Mar 17 19:39:34.372000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.372000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.368802 systemd[1]: Starting ignition-quench.service... Mar 17 19:39:34.383165 kernel: audit: type=1130 audit(1742240374.372:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.383196 kernel: audit: type=1131 audit(1742240374.372:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.372814 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 17 19:39:34.372897 systemd[1]: Finished ignition-quench.service. Mar 17 19:39:34.384997 initrd-setup-root-after-ignition[810]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 19:39:34.386487 systemd[1]: Finished initrd-setup-root-after-ignition.service. Mar 17 19:39:34.392354 kernel: audit: type=1130 audit(1742240374.386:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.386000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.387115 systemd[1]: Reached target ignition-complete.target. Mar 17 19:39:34.393564 systemd[1]: Starting initrd-parse-etc.service... Mar 17 19:39:34.407297 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 17 19:39:34.407395 systemd[1]: Finished initrd-parse-etc.service. Mar 17 19:39:34.420306 kernel: audit: type=1130 audit(1742240374.407:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:39:34.420332 kernel: audit: type=1131 audit(1742240374.407:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.407000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.407000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.408563 systemd[1]: Reached target initrd-fs.target. Mar 17 19:39:34.409118 systemd[1]: Reached target initrd.target. Mar 17 19:39:34.409570 systemd[1]: dracut-mount.service was skipped because no trigger condition checks were met. Mar 17 19:39:34.410365 systemd[1]: Starting dracut-pre-pivot.service... Mar 17 19:39:34.427271 systemd[1]: Finished dracut-pre-pivot.service. Mar 17 19:39:34.426000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.428720 systemd[1]: Starting initrd-cleanup.service... Mar 17 19:39:34.434369 kernel: audit: type=1130 audit(1742240374.426:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.439833 systemd[1]: Stopped target nss-lookup.target. Mar 17 19:39:34.440457 systemd[1]: Stopped target remote-cryptsetup.target. Mar 17 19:39:34.441500 systemd[1]: Stopped target timers.target. Mar 17 19:39:34.442498 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 17 19:39:34.448559 kernel: audit: type=1131 audit(1742240374.442:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.442000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.442607 systemd[1]: Stopped dracut-pre-pivot.service. Mar 17 19:39:34.443569 systemd[1]: Stopped target initrd.target. Mar 17 19:39:34.449167 systemd[1]: Stopped target basic.target. Mar 17 19:39:34.450141 systemd[1]: Stopped target ignition-complete.target. Mar 17 19:39:34.451092 systemd[1]: Stopped target ignition-diskful.target. Mar 17 19:39:34.452070 systemd[1]: Stopped target initrd-root-device.target. Mar 17 19:39:34.452997 systemd[1]: Stopped target remote-fs.target. Mar 17 19:39:34.453888 systemd[1]: Stopped target remote-fs-pre.target. Mar 17 19:39:34.454835 systemd[1]: Stopped target sysinit.target. Mar 17 19:39:34.455747 systemd[1]: Stopped target local-fs.target. Mar 17 19:39:34.456628 systemd[1]: Stopped target local-fs-pre.target. Mar 17 19:39:34.457512 systemd[1]: Stopped target swap.target. Mar 17 19:39:34.464176 kernel: audit: type=1131 audit(1742240374.458:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:39:34.458000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.458337 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 17 19:39:34.458493 systemd[1]: Stopped dracut-pre-mount.service. Mar 17 19:39:34.459405 systemd[1]: Stopped target cryptsetup.target. Mar 17 19:39:34.471107 kernel: audit: type=1131 audit(1742240374.465:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.465000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.464940 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 17 19:39:34.470000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.465197 systemd[1]: Stopped dracut-initqueue.service. Mar 17 19:39:34.466547 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 17 19:39:34.472000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.466727 systemd[1]: Stopped initrd-setup-root-after-ignition.service. Mar 17 19:39:34.472028 systemd[1]: ignition-files.service: Deactivated successfully. Mar 17 19:39:34.472212 systemd[1]: Stopped ignition-files.service. Mar 17 19:39:34.474495 systemd[1]: Stopping ignition-mount.service... Mar 17 19:39:34.480129 iscsid[647]: iscsid shutting down. Mar 17 19:39:34.487000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.488000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.490000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.475734 systemd[1]: Stopping iscsid.service... Mar 17 19:39:34.494000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.494000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.487317 systemd[1]: Stopping sysroot-boot.service... Mar 17 19:39:34.487801 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 17 19:39:34.487924 systemd[1]: Stopped systemd-udev-trigger.service. Mar 17 19:39:34.497000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:39:34.488522 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 17 19:39:34.488623 systemd[1]: Stopped dracut-pre-trigger.service. Mar 17 19:39:34.490497 systemd[1]: iscsid.service: Deactivated successfully. Mar 17 19:39:34.490590 systemd[1]: Stopped iscsid.service. Mar 17 19:39:34.491595 systemd[1]: Stopping iscsiuio.service... Mar 17 19:39:34.492203 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 17 19:39:34.492275 systemd[1]: Finished initrd-cleanup.service. Mar 17 19:39:34.495378 systemd[1]: iscsiuio.service: Deactivated successfully. Mar 17 19:39:34.495453 systemd[1]: Stopped iscsiuio.service. Mar 17 19:39:34.509260 ignition[823]: INFO : Ignition 2.14.0 Mar 17 19:39:34.509260 ignition[823]: INFO : Stage: umount Mar 17 19:39:34.511338 ignition[823]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 19:39:34.511338 ignition[823]: DEBUG : parsing config with SHA512: ce918cf8568bff1426dda9ea05b778568a1626fcf4c1bded9ebe13fee104bc1b92fac5f7093a3bfc7d99777c3793d01249c863845c2ca48413d9477d40af178a Mar 17 19:39:34.514483 ignition[823]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Mar 17 19:39:34.515000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.516000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.516000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.514155 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 17 19:39:34.517000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.518000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.520298 ignition[823]: INFO : umount: umount passed Mar 17 19:39:34.520298 ignition[823]: INFO : Ignition finished successfully Mar 17 19:39:34.515896 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 17 19:39:34.515997 systemd[1]: Stopped ignition-mount.service. Mar 17 19:39:34.516585 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 17 19:39:34.516624 systemd[1]: Stopped ignition-disks.service. Mar 17 19:39:34.517112 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 17 19:39:34.524000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.517146 systemd[1]: Stopped ignition-kargs.service. Mar 17 19:39:34.517658 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 17 19:39:34.517691 systemd[1]: Stopped ignition-fetch.service. Mar 17 19:39:34.527000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:39:34.518149 systemd[1]: Stopped target network.target. Mar 17 19:39:34.528000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.518570 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 17 19:39:34.530000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.518608 systemd[1]: Stopped ignition-fetch-offline.service. Mar 17 19:39:34.519105 systemd[1]: Stopped target paths.target. Mar 17 19:39:34.519505 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 17 19:39:34.521043 systemd[1]: Stopped systemd-ask-password-console.path. Mar 17 19:39:34.536000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.521758 systemd[1]: Stopped target slices.target. Mar 17 19:39:34.537000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.522211 systemd[1]: Stopped target sockets.target. Mar 17 19:39:34.538000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.523232 systemd[1]: iscsid.socket: Deactivated successfully. Mar 17 19:39:34.523264 systemd[1]: Closed iscsid.socket. Mar 17 19:39:34.524064 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 17 19:39:34.524096 systemd[1]: Closed iscsiuio.socket. Mar 17 19:39:34.524988 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 17 19:39:34.546000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.525026 systemd[1]: Stopped ignition-setup.service. Mar 17 19:39:34.525923 systemd[1]: Stopping systemd-networkd.service... Mar 17 19:39:34.547000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.527012 systemd[1]: Stopping systemd-resolved.service... Mar 17 19:39:34.528105 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 17 19:39:34.549000 audit: BPF prog-id=6 op=UNLOAD Mar 17 19:39:34.528175 systemd[1]: Stopped sysroot-boot.service. Mar 17 19:39:34.528893 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 17 19:39:34.553000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.528929 systemd[1]: Stopped initrd-setup-root.service. Mar 17 19:39:34.553000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:39:34.530048 systemd-networkd[640]: eth0: DHCPv6 lease lost Mar 17 19:39:34.553000 audit: BPF prog-id=9 op=UNLOAD Mar 17 19:39:34.554000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.530866 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 19:39:34.530940 systemd[1]: Stopped systemd-networkd.service. Mar 17 19:39:34.532043 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 17 19:39:34.532072 systemd[1]: Closed systemd-networkd.socket. Mar 17 19:39:34.535429 systemd[1]: Stopping network-cleanup.service... Mar 17 19:39:34.535916 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 17 19:39:34.536005 systemd[1]: Stopped parse-ip-for-networkd.service. Mar 17 19:39:34.537797 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 17 19:39:34.561000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.537836 systemd[1]: Stopped systemd-sysctl.service. Mar 17 19:39:34.562000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.538536 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 17 19:39:34.563000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.538572 systemd[1]: Stopped systemd-modules-load.service. Mar 17 19:39:34.542517 systemd[1]: Stopping systemd-udevd.service... Mar 17 19:39:34.544422 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 17 19:39:34.565000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.544901 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 17 19:39:34.566000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.566000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:34.545027 systemd[1]: Stopped systemd-resolved.service. Mar 17 19:39:34.547932 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 17 19:39:34.548075 systemd[1]: Stopped systemd-udevd.service. Mar 17 19:39:34.549632 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 17 19:39:34.549669 systemd[1]: Closed systemd-udevd-control.socket. Mar 17 19:39:34.550321 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 17 19:39:34.550347 systemd[1]: Closed systemd-udevd-kernel.socket. Mar 17 19:39:34.552246 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 17 19:39:34.552285 systemd[1]: Stopped dracut-pre-udev.service. 
Mar 17 19:39:34.553171 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 17 19:39:34.553206 systemd[1]: Stopped dracut-cmdline.service. Mar 17 19:39:34.554132 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 19:39:34.554168 systemd[1]: Stopped dracut-cmdline-ask.service. Mar 17 19:39:34.555579 systemd[1]: Starting initrd-udevadm-cleanup-db.service... Mar 17 19:39:34.561476 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 17 19:39:34.561530 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service. Mar 17 19:39:34.562674 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 17 19:39:34.562711 systemd[1]: Stopped kmod-static-nodes.service. Mar 17 19:39:34.563365 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 19:39:34.563400 systemd[1]: Stopped systemd-vconsole-setup.service. Mar 17 19:39:34.565143 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 17 19:39:34.581000 audit: BPF prog-id=5 op=UNLOAD Mar 17 19:39:34.581000 audit: BPF prog-id=4 op=UNLOAD Mar 17 19:39:34.581000 audit: BPF prog-id=3 op=UNLOAD Mar 17 19:39:34.565597 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 17 19:39:34.565671 systemd[1]: Stopped network-cleanup.service. Mar 17 19:39:34.582000 audit: BPF prog-id=8 op=UNLOAD Mar 17 19:39:34.582000 audit: BPF prog-id=7 op=UNLOAD Mar 17 19:39:34.566644 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 17 19:39:34.566715 systemd[1]: Finished initrd-udevadm-cleanup-db.service. Mar 17 19:39:34.567827 systemd[1]: Reached target initrd-switch-root.target. Mar 17 19:39:34.569265 systemd[1]: Starting initrd-switch-root.service... Mar 17 19:39:34.580347 systemd[1]: Switching root. Mar 17 19:39:34.595906 systemd-journald[186]: Journal stopped Mar 17 19:39:39.289661 systemd-journald[186]: Received SIGTERM from PID 1 (systemd). Mar 17 19:39:39.289711 kernel: SELinux: Class mctp_socket not defined in policy. Mar 17 19:39:39.289731 kernel: SELinux: Class anon_inode not defined in policy. Mar 17 19:39:39.289745 kernel: SELinux: the above unknown classes and permissions will be allowed Mar 17 19:39:39.289758 kernel: SELinux: policy capability network_peer_controls=1 Mar 17 19:39:39.289773 kernel: SELinux: policy capability open_perms=1 Mar 17 19:39:39.289787 kernel: SELinux: policy capability extended_socket_class=1 Mar 17 19:39:39.289799 kernel: SELinux: policy capability always_check_network=0 Mar 17 19:39:39.289810 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 17 19:39:39.289821 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 17 19:39:39.289832 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 17 19:39:39.289846 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 17 19:39:39.289858 systemd[1]: Successfully loaded SELinux policy in 94.975ms. Mar 17 19:39:39.289876 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 17.538ms. Mar 17 19:39:39.289891 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Mar 17 19:39:39.289904 systemd[1]: Detected virtualization kvm. Mar 17 19:39:39.289916 systemd[1]: Detected architecture x86-64. 
Mar 17 19:39:39.289928 systemd[1]: Detected first boot. Mar 17 19:39:39.289941 systemd[1]: Hostname set to . Mar 17 19:39:39.289975 systemd[1]: Initializing machine ID from VM UUID. Mar 17 19:39:39.289989 kernel: SELinux: Context system_u:object_r:container_file_t:s0:c1022,c1023 is not valid (left unmapped). Mar 17 19:39:39.290004 systemd[1]: Populated /etc with preset unit settings. Mar 17 19:39:39.290016 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 19:39:39.290029 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 19:39:39.290043 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 19:39:39.290056 systemd[1]: Queued start job for default target multi-user.target. Mar 17 19:39:39.290068 systemd[1]: Unnecessary job was removed for dev-vda6.device. Mar 17 19:39:39.290084 systemd[1]: Created slice system-addon\x2dconfig.slice. Mar 17 19:39:39.290096 systemd[1]: Created slice system-addon\x2drun.slice. Mar 17 19:39:39.290108 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice. Mar 17 19:39:39.290121 systemd[1]: Created slice system-getty.slice. Mar 17 19:39:39.290133 systemd[1]: Created slice system-modprobe.slice. Mar 17 19:39:39.290145 systemd[1]: Created slice system-serial\x2dgetty.slice. Mar 17 19:39:39.290157 systemd[1]: Created slice system-system\x2dcloudinit.slice. Mar 17 19:39:39.290169 systemd[1]: Created slice system-systemd\x2dfsck.slice. Mar 17 19:39:39.290181 systemd[1]: Created slice user.slice. Mar 17 19:39:39.290195 systemd[1]: Started systemd-ask-password-console.path. Mar 17 19:39:39.290208 systemd[1]: Started systemd-ask-password-wall.path. Mar 17 19:39:39.290220 systemd[1]: Set up automount boot.automount. Mar 17 19:39:39.290232 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount. Mar 17 19:39:39.290244 systemd[1]: Reached target integritysetup.target. Mar 17 19:39:39.290256 systemd[1]: Reached target remote-cryptsetup.target. Mar 17 19:39:39.290268 systemd[1]: Reached target remote-fs.target. Mar 17 19:39:39.290282 systemd[1]: Reached target slices.target. Mar 17 19:39:39.290294 systemd[1]: Reached target swap.target. Mar 17 19:39:39.290309 systemd[1]: Reached target torcx.target. Mar 17 19:39:39.290321 systemd[1]: Reached target veritysetup.target. Mar 17 19:39:39.290333 systemd[1]: Listening on systemd-coredump.socket. Mar 17 19:39:39.290347 systemd[1]: Listening on systemd-initctl.socket. Mar 17 19:39:39.290359 systemd[1]: Listening on systemd-journald-audit.socket. Mar 17 19:39:39.290371 systemd[1]: Listening on systemd-journald-dev-log.socket. Mar 17 19:39:39.290383 systemd[1]: Listening on systemd-journald.socket. Mar 17 19:39:39.290396 systemd[1]: Listening on systemd-networkd.socket. Mar 17 19:39:39.290408 systemd[1]: Listening on systemd-udevd-control.socket. Mar 17 19:39:39.290420 systemd[1]: Listening on systemd-udevd-kernel.socket. Mar 17 19:39:39.290432 systemd[1]: Listening on systemd-userdbd.socket. Mar 17 19:39:39.290459 systemd[1]: Mounting dev-hugepages.mount... Mar 17 19:39:39.290476 systemd[1]: Mounting dev-mqueue.mount... Mar 17 19:39:39.290488 systemd[1]: Mounting media.mount... 
Mar 17 19:39:39.290500 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 19:39:39.290512 systemd[1]: Mounting sys-kernel-debug.mount... Mar 17 19:39:39.290524 systemd[1]: Mounting sys-kernel-tracing.mount... Mar 17 19:39:39.290538 systemd[1]: Mounting tmp.mount... Mar 17 19:39:39.290550 systemd[1]: Starting flatcar-tmpfiles.service... Mar 17 19:39:39.290563 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 19:39:39.290575 systemd[1]: Starting kmod-static-nodes.service... Mar 17 19:39:39.290587 systemd[1]: Starting modprobe@configfs.service... Mar 17 19:39:39.290599 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 19:39:39.290611 systemd[1]: Starting modprobe@drm.service... Mar 17 19:39:39.290624 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 19:39:39.290636 systemd[1]: Starting modprobe@fuse.service... Mar 17 19:39:39.290649 systemd[1]: Starting modprobe@loop.service... Mar 17 19:39:39.290662 systemd[1]: setup-nsswitch.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 17 19:39:39.290674 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling. Mar 17 19:39:39.290686 systemd[1]: (This warning is only shown for the first unit using IP firewalling.) Mar 17 19:39:39.290698 systemd[1]: Starting systemd-journald.service... Mar 17 19:39:39.290710 systemd[1]: Starting systemd-modules-load.service... Mar 17 19:39:39.290723 systemd[1]: Starting systemd-network-generator.service... Mar 17 19:39:39.290734 kernel: fuse: init (API version 7.34) Mar 17 19:39:39.290746 systemd[1]: Starting systemd-remount-fs.service... Mar 17 19:39:39.290760 systemd[1]: Starting systemd-udev-trigger.service... Mar 17 19:39:39.290771 kernel: loop: module loaded Mar 17 19:39:39.290795 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 19:39:39.290810 systemd-journald[969]: Journal started Mar 17 19:39:39.290856 systemd-journald[969]: Runtime Journal (/run/log/journal/da037cdc4bea43f7b2105935d8de093c) is 8.0M, max 78.4M, 70.4M free. Mar 17 19:39:39.140000 audit[1]: AVC avc: denied { audit_read } for pid=1 comm="systemd" capability=37 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Mar 17 19:39:39.140000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Mar 17 19:39:39.287000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Mar 17 19:39:39.287000 audit[969]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffd83738260 a2=4000 a3=7ffd837382fc items=0 ppid=1 pid=969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:39:39.287000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Mar 17 19:39:39.300797 systemd[1]: Started systemd-journald.service. 
Mar 17 19:39:39.299000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.301392 systemd[1]: Mounted dev-hugepages.mount. Mar 17 19:39:39.302065 systemd[1]: Mounted dev-mqueue.mount. Mar 17 19:39:39.302570 systemd[1]: Mounted media.mount. Mar 17 19:39:39.303085 systemd[1]: Mounted sys-kernel-debug.mount. Mar 17 19:39:39.303605 systemd[1]: Mounted sys-kernel-tracing.mount. Mar 17 19:39:39.305887 systemd[1]: Mounted tmp.mount. Mar 17 19:39:39.308000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.309000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.309000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.309239 systemd[1]: Finished kmod-static-nodes.service. Mar 17 19:39:39.310012 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 17 19:39:39.310165 systemd[1]: Finished modprobe@configfs.service. Mar 17 19:39:39.310881 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 19:39:39.311000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.311000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.312414 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 19:39:39.313357 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 19:39:39.313000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.313000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.313708 systemd[1]: Finished modprobe@drm.service. Mar 17 19:39:39.314378 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 19:39:39.314000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.314000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:39:39.315000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.315000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.315060 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 19:39:39.315772 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 17 19:39:39.315906 systemd[1]: Finished modprobe@fuse.service. Mar 17 19:39:39.316652 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 19:39:39.316802 systemd[1]: Finished modprobe@loop.service. Mar 17 19:39:39.317000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.317000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.318000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.319000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.319189 systemd[1]: Finished systemd-modules-load.service. Mar 17 19:39:39.320000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.319938 systemd[1]: Finished systemd-network-generator.service. Mar 17 19:39:39.320697 systemd[1]: Finished systemd-remount-fs.service. Mar 17 19:39:39.321509 systemd[1]: Reached target network-pre.target. Mar 17 19:39:39.325348 systemd[1]: Mounting sys-fs-fuse-connections.mount... Mar 17 19:39:39.329284 systemd[1]: Mounting sys-kernel-config.mount... Mar 17 19:39:39.329835 systemd[1]: remount-root.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 17 19:39:39.335316 systemd[1]: Starting systemd-hwdb-update.service... Mar 17 19:39:39.337096 systemd[1]: Starting systemd-journal-flush.service... Mar 17 19:39:39.337740 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 19:39:39.338811 systemd[1]: Starting systemd-random-seed.service... Mar 17 19:39:39.341470 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 19:39:39.343496 systemd[1]: Starting systemd-sysctl.service... Mar 17 19:39:39.347345 systemd[1]: Mounted sys-fs-fuse-connections.mount. Mar 17 19:39:39.347992 systemd[1]: Mounted sys-kernel-config.mount. Mar 17 19:39:39.349537 systemd-journald[969]: Time spent on flushing to /var/log/journal/da037cdc4bea43f7b2105935d8de093c is 32.193ms for 1044 entries. 
Mar 17 19:39:39.349537 systemd-journald[969]: System Journal (/var/log/journal/da037cdc4bea43f7b2105935d8de093c) is 8.0M, max 584.8M, 576.8M free. Mar 17 19:39:39.418845 systemd-journald[969]: Received client request to flush runtime journal. Mar 17 19:39:39.418888 kernel: kauditd_printk_skb: 71 callbacks suppressed Mar 17 19:39:39.418908 kernel: audit: type=1130 audit(1742240379.365:111): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.418927 kernel: audit: type=1130 audit(1742240379.382:112): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.418966 kernel: audit: type=1130 audit(1742240379.401:113): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.365000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.382000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.401000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.365544 systemd[1]: Finished flatcar-tmpfiles.service. Mar 17 19:39:39.419000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.367625 systemd[1]: Starting systemd-sysusers.service... Mar 17 19:39:39.383264 systemd[1]: Finished systemd-random-seed.service. Mar 17 19:39:39.383867 systemd[1]: Reached target first-boot-complete.target. Mar 17 19:39:39.402175 systemd[1]: Finished systemd-sysctl.service. Mar 17 19:39:39.419708 systemd[1]: Finished systemd-journal-flush.service. Mar 17 19:39:39.431158 kernel: audit: type=1130 audit(1742240379.419:114): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.431616 systemd[1]: Finished systemd-udev-trigger.service. Mar 17 19:39:39.433252 systemd[1]: Starting systemd-udev-settle.service... Mar 17 19:39:39.440523 kernel: audit: type=1130 audit(1742240379.431:115): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.431000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.439883 systemd[1]: Finished systemd-sysusers.service. 
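Note: systemd-journal-flush.service above asks journald to move the volatile runtime journal under /run/log/journal into the persistent system journal under /var/log/journal; the "32.193ms for 1044 entries" line is journald reporting that flush. A minimal sketch of requesting the same flush by hand, assuming Python and root privileges (journalctl --flush is the documented way to trigger it):

    import subprocess

    # Ask journald to flush /run/log/journal into /var/log/journal, as the
    # systemd-journal-flush step in the log above does during boot.
    subprocess.run(["journalctl", "--flush"], check=True)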
Mar 17 19:39:39.444526 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Mar 17 19:39:39.446880 udevadm[1014]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation-early.service, lvm2-activation.service not to pull it in. Mar 17 19:39:39.439000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.457017 kernel: audit: type=1130 audit(1742240379.439:116): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.499000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:39.500201 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Mar 17 19:39:39.506987 kernel: audit: type=1130 audit(1742240379.499:117): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.146854 systemd[1]: Finished systemd-hwdb-update.service. Mar 17 19:39:40.147000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.160539 systemd[1]: Starting systemd-udevd.service... Mar 17 19:39:40.161235 kernel: audit: type=1130 audit(1742240380.147:118): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.198011 systemd-udevd[1021]: Using default interface naming scheme 'v252'. Mar 17 19:39:40.249167 systemd[1]: Started systemd-udevd.service. Mar 17 19:39:40.253000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.257910 systemd[1]: Starting systemd-networkd.service... Mar 17 19:39:40.261018 kernel: audit: type=1130 audit(1742240380.253:119): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.276881 systemd[1]: Starting systemd-userdbd.service... Mar 17 19:39:40.317930 systemd[1]: Found device dev-ttyS0.device. Mar 17 19:39:40.324000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.324573 systemd[1]: Started systemd-userdbd.service. Mar 17 19:39:40.331976 kernel: audit: type=1130 audit(1742240380.324:120): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.401855 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. 
Mar 17 19:39:40.410097 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 Mar 17 19:39:40.427965 kernel: ACPI: button: Power Button [PWRF] Mar 17 19:39:40.428102 systemd-networkd[1042]: lo: Link UP Mar 17 19:39:40.428000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.428111 systemd-networkd[1042]: lo: Gained carrier Mar 17 19:39:40.428502 systemd-networkd[1042]: Enumeration completed Mar 17 19:39:40.428720 systemd[1]: Started systemd-networkd.service. Mar 17 19:39:40.429304 systemd-networkd[1042]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 19:39:40.430925 systemd-networkd[1042]: eth0: Link UP Mar 17 19:39:40.430937 systemd-networkd[1042]: eth0: Gained carrier Mar 17 19:39:40.437000 audit[1023]: AVC avc: denied { confidentiality } for pid=1023 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Mar 17 19:39:40.437000 audit[1023]: SYSCALL arch=c000003e syscall=175 success=yes exit=0 a0=56227183b170 a1=338ac a2=7f663ac26bc5 a3=5 items=110 ppid=1021 pid=1023 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:39:40.437000 audit: CWD cwd="/" Mar 17 19:39:40.437000 audit: PATH item=0 name=(null) inode=45 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=1 name=(null) inode=13876 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=2 name=(null) inode=13876 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=3 name=(null) inode=13877 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=4 name=(null) inode=13876 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=5 name=(null) inode=13878 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=6 name=(null) inode=13876 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=7 name=(null) inode=13879 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=8 name=(null) inode=13879 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=9 name=(null) 
inode=13880 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=10 name=(null) inode=13879 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=11 name=(null) inode=13881 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=12 name=(null) inode=13879 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=13 name=(null) inode=13882 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=14 name=(null) inode=13879 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=15 name=(null) inode=13883 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=16 name=(null) inode=13879 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=17 name=(null) inode=13884 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=18 name=(null) inode=13876 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=19 name=(null) inode=13885 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=20 name=(null) inode=13885 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=21 name=(null) inode=13886 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=22 name=(null) inode=13885 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=23 name=(null) inode=13887 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=24 name=(null) inode=13885 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=25 name=(null) inode=13888 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=26 name=(null) inode=13885 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=27 name=(null) inode=13889 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=28 name=(null) inode=13885 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=29 name=(null) inode=13890 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=30 name=(null) inode=13876 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=31 name=(null) inode=13891 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=32 name=(null) inode=13891 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=33 name=(null) inode=13892 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=34 name=(null) inode=13891 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=35 name=(null) inode=13893 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=36 name=(null) inode=13891 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=37 name=(null) inode=13894 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=38 name=(null) inode=13891 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=39 name=(null) inode=13895 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=40 name=(null) inode=13891 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=41 name=(null) inode=13896 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 
cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=42 name=(null) inode=13876 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=43 name=(null) inode=13897 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=44 name=(null) inode=13897 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=45 name=(null) inode=13898 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=46 name=(null) inode=13897 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=47 name=(null) inode=13899 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=48 name=(null) inode=13897 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=49 name=(null) inode=13900 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=50 name=(null) inode=13897 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=51 name=(null) inode=13901 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=52 name=(null) inode=13897 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=53 name=(null) inode=13902 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=54 name=(null) inode=45 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=55 name=(null) inode=13903 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=56 name=(null) inode=13903 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=57 name=(null) inode=13904 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=58 
name=(null) inode=13903 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=59 name=(null) inode=13905 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=60 name=(null) inode=13903 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=61 name=(null) inode=13906 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=62 name=(null) inode=13906 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=63 name=(null) inode=13907 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=64 name=(null) inode=13906 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=65 name=(null) inode=13908 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=66 name=(null) inode=13906 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=67 name=(null) inode=13909 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=68 name=(null) inode=13906 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=69 name=(null) inode=13910 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=70 name=(null) inode=13906 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=71 name=(null) inode=13911 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=72 name=(null) inode=13903 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=73 name=(null) inode=13912 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=74 name=(null) inode=13912 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=75 name=(null) inode=13913 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=76 name=(null) inode=13912 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=77 name=(null) inode=13914 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=78 name=(null) inode=13912 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=79 name=(null) inode=13915 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=80 name=(null) inode=13912 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=81 name=(null) inode=13916 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=82 name=(null) inode=13912 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=83 name=(null) inode=13917 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=84 name=(null) inode=13903 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=85 name=(null) inode=13918 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=86 name=(null) inode=13918 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=87 name=(null) inode=13919 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=88 name=(null) inode=13918 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=89 name=(null) inode=13920 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=90 name=(null) inode=13918 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 
cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=91 name=(null) inode=13921 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=92 name=(null) inode=13918 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=93 name=(null) inode=13922 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=94 name=(null) inode=13918 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=95 name=(null) inode=13923 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=96 name=(null) inode=13903 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=97 name=(null) inode=13924 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=98 name=(null) inode=13924 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=99 name=(null) inode=13925 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=100 name=(null) inode=13924 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=101 name=(null) inode=13926 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=102 name=(null) inode=13924 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=103 name=(null) inode=13927 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=104 name=(null) inode=13924 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=105 name=(null) inode=13928 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=106 name=(null) inode=13924 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH 
item=107 name=(null) inode=13929 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=108 name=(null) inode=1 dev=00:07 mode=040700 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:debugfs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PATH item=109 name=(null) inode=13930 dev=00:07 mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:debugfs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:39:40.437000 audit: PROCTITLE proctitle="(udev-worker)" Mar 17 19:39:40.442048 systemd-networkd[1042]: eth0: DHCPv4 address 172.24.4.218/24, gateway 172.24.4.1 acquired from 172.24.4.1 Mar 17 19:39:40.448971 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0 Mar 17 19:39:40.463970 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Mar 17 19:39:40.478975 kernel: mousedev: PS/2 mouse device common for all mice Mar 17 19:39:40.523000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-settle comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.524386 systemd[1]: Finished systemd-udev-settle.service. Mar 17 19:39:40.525992 systemd[1]: Starting lvm2-activation-early.service... Mar 17 19:39:40.567509 lvm[1056]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 19:39:40.614934 systemd[1]: Finished lvm2-activation-early.service. Mar 17 19:39:40.615000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.616363 systemd[1]: Reached target cryptsetup.target. Mar 17 19:39:40.619917 systemd[1]: Starting lvm2-activation.service... Mar 17 19:39:40.629395 lvm[1058]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 19:39:40.668776 systemd[1]: Finished lvm2-activation.service. Mar 17 19:39:40.669000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.670200 systemd[1]: Reached target local-fs-pre.target. Mar 17 19:39:40.671448 systemd[1]: var-lib-machines.mount was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 17 19:39:40.671507 systemd[1]: Reached target local-fs.target. Mar 17 19:39:40.672644 systemd[1]: Reached target machines.target. Mar 17 19:39:40.676472 systemd[1]: Starting ldconfig.service... Mar 17 19:39:40.679138 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 19:39:40.679271 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 19:39:40.681729 systemd[1]: Starting systemd-boot-update.service... Mar 17 19:39:40.684900 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service... Mar 17 19:39:40.690109 systemd[1]: Starting systemd-machine-id-commit.service... Mar 17 19:39:40.693669 systemd[1]: Starting systemd-sysext.service... 
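The lvm warnings above ("Failed to connect to lvmetad. Falling back to device scanning.") mean the metadata caching daemon is not running, so activation scans block devices directly. A hedged sketch of the two usual remedies; the socket unit name and the lvm.conf key are standard lvm2, but whether either change is appropriate for this image is an assumption:

# Option 1: enable the metadata daemon so activation stops falling back to scanning.
systemctl enable --now lvm2-lvmetad.socket

# Option 2: silence the warning by turning lvmetad use off in /etc/lvm/lvm.conf:
#   global {
#       use_lvmetad = 0
#   }
lvmconfig global/use_lvmetad   # print the currently effective setting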
Mar 17 19:39:40.710904 systemd[1]: boot.automount: Got automount request for /boot, triggered by 1061 (bootctl) Mar 17 19:39:40.714197 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service... Mar 17 19:39:40.730977 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service. Mar 17 19:39:40.731000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:40.736536 systemd[1]: Unmounting usr-share-oem.mount... Mar 17 19:39:40.740185 systemd[1]: usr-share-oem.mount: Deactivated successfully. Mar 17 19:39:40.740384 systemd[1]: Unmounted usr-share-oem.mount. Mar 17 19:39:40.753975 kernel: loop0: detected capacity change from 0 to 210664 Mar 17 19:39:41.064622 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 17 19:39:41.066123 systemd[1]: Finished systemd-machine-id-commit.service. Mar 17 19:39:41.067000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.116003 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 17 19:39:41.151012 kernel: loop1: detected capacity change from 0 to 210664 Mar 17 19:39:41.202415 (sd-sysext)[1079]: Using extensions 'kubernetes'. Mar 17 19:39:41.206076 (sd-sysext)[1079]: Merged extensions into '/usr'. Mar 17 19:39:41.232476 systemd-fsck[1075]: fsck.fat 4.2 (2021-01-31) Mar 17 19:39:41.232476 systemd-fsck[1075]: /dev/vda1: 789 files, 119299/258078 clusters Mar 17 19:39:41.238944 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service. Mar 17 19:39:41.240000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.243646 systemd[1]: Mounting boot.mount... Mar 17 19:39:41.270079 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 19:39:41.271437 systemd[1]: Mounting usr-share-oem.mount... Mar 17 19:39:41.272137 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.273197 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 19:39:41.275763 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 19:39:41.278209 systemd[1]: Starting modprobe@loop.service... Mar 17 19:39:41.278735 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.278866 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 19:39:41.279041 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 19:39:41.280362 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 19:39:41.280509 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 19:39:41.280000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Mar 17 19:39:41.283000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.285074 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 19:39:41.285217 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 19:39:41.284000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.284000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.291334 systemd[1]: Mounted usr-share-oem.mount. Mar 17 19:39:41.293214 systemd[1]: Finished systemd-sysext.service. Mar 17 19:39:41.292000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.293931 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 19:39:41.295043 systemd[1]: Finished modprobe@loop.service. Mar 17 19:39:41.294000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.294000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.297661 systemd[1]: Mounted boot.mount. Mar 17 19:39:41.300884 systemd[1]: Starting ensure-sysext.service... Mar 17 19:39:41.301493 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 19:39:41.301554 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.305691 systemd[1]: Starting systemd-tmpfiles-setup.service... Mar 17 19:39:41.311467 systemd[1]: Reloading. Mar 17 19:39:41.326745 systemd-tmpfiles[1097]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Mar 17 19:39:41.329310 systemd-tmpfiles[1097]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 17 19:39:41.333882 systemd-tmpfiles[1097]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 17 19:39:41.380624 /usr/lib/systemd/system-generators/torcx-generator[1117]: time="2025-03-17T19:39:41Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 19:39:41.380992 /usr/lib/systemd/system-generators/torcx-generator[1117]: time="2025-03-17T19:39:41Z" level=info msg="torcx already run" Mar 17 19:39:41.522476 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. 
Mar 17 19:39:41.522493 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 19:39:41.550808 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 19:39:41.621449 systemd[1]: Finished systemd-boot-update.service. Mar 17 19:39:41.621000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-boot-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.623000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.623813 systemd[1]: Finished systemd-tmpfiles-setup.service. Mar 17 19:39:41.626699 systemd[1]: Starting audit-rules.service... Mar 17 19:39:41.628399 systemd[1]: Starting clean-ca-certificates.service... Mar 17 19:39:41.630178 systemd[1]: Starting systemd-journal-catalog-update.service... Mar 17 19:39:41.635912 systemd[1]: Starting systemd-resolved.service... Mar 17 19:39:41.637852 systemd[1]: Starting systemd-timesyncd.service... Mar 17 19:39:41.640051 systemd[1]: Starting systemd-update-utmp.service... Mar 17 19:39:41.644000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.642226 systemd[1]: Finished clean-ca-certificates.service. Mar 17 19:39:41.650541 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 19:39:41.655000 audit[1179]: SYSTEM_BOOT pid=1179 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.658142 systemd[1]: Finished systemd-update-utmp.service. Mar 17 19:39:41.657000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.660931 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.662184 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 19:39:41.663836 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 19:39:41.666045 systemd[1]: Starting modprobe@loop.service... Mar 17 19:39:41.668081 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.669682 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 19:39:41.669829 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 19:39:41.670910 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. 
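The CPUShares= and MemoryLimit= warnings for locksmithd.service above are cgroup v1 era directives that systemd asks to be replaced with CPUWeight= and MemoryMax=. A minimal drop-in sketch; the numeric values are illustrative placeholders, not taken from the shipped unit:

mkdir -p /etc/systemd/system/locksmithd.service.d
cat > /etc/systemd/system/locksmithd.service.d/10-cgroup.conf <<'EOF'
[Service]
# Empty assignments clear the deprecated settings; the replacements below are examples only.
CPUShares=
CPUWeight=100
MemoryLimit=
MemoryMax=128M
EOF
systemctl daemon-reload

The docker.socket notice in the same stretch is handled the same way: a drop-in pointing ListenStream= at /run/docker.sock instead of the legacy /var/run path.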
Mar 17 19:39:41.673063 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 19:39:41.674000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.674000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.679131 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.680331 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 19:39:41.680891 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.681031 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 19:39:41.681157 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 19:39:41.686495 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.687783 systemd[1]: Starting modprobe@drm.service... Mar 17 19:39:41.688863 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.689009 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 19:39:41.691346 systemd[1]: Starting systemd-networkd-wait-online.service... Mar 17 19:39:41.694288 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 19:39:41.696942 systemd[1]: Finished systemd-journal-catalog-update.service. Mar 17 19:39:41.697000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.698513 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 19:39:41.698655 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 19:39:41.698000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.699000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.700000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.700000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:39:41.700399 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 19:39:41.700533 systemd[1]: Finished modprobe@loop.service. Mar 17 19:39:41.701381 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 19:39:41.701506 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 19:39:41.702000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.702000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.703598 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 19:39:41.704000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.704000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.705156 systemd[1]: Finished modprobe@drm.service. Mar 17 19:39:41.706153 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 19:39:41.706261 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.708000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:39:41.709307 systemd[1]: Finished ensure-sysext.service. Mar 17 19:39:41.740469 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 19:39:41.740490 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 19:39:41.767000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Mar 17 19:39:41.767000 audit[1208]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffbba714b0 a2=420 a3=0 items=0 ppid=1172 pid=1208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:39:41.767000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Mar 17 19:39:41.768267 augenrules[1208]: No rules Mar 17 19:39:41.769065 systemd[1]: Finished audit-rules.service. Mar 17 19:39:41.784180 ldconfig[1060]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 17 19:39:41.792288 systemd-resolved[1175]: Positive Trust Anchors: Mar 17 19:39:41.792540 systemd-resolved[1175]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 19:39:41.792628 systemd-resolved[1175]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Mar 17 19:39:41.802242 systemd[1]: Started systemd-timesyncd.service. Mar 17 19:39:41.802882 systemd[1]: Reached target time-set.target. Mar 17 19:39:41.811408 systemd[1]: Finished ldconfig.service. Mar 17 19:39:41.813039 systemd[1]: Starting systemd-update-done.service... Mar 17 19:39:41.815079 systemd-resolved[1175]: Using system hostname 'ci-3510-3-7-8-c8b8528301.novalocal'. Mar 17 19:39:41.817737 systemd[1]: Started systemd-resolved.service. Mar 17 19:39:41.818299 systemd[1]: Reached target network.target. Mar 17 19:39:41.818752 systemd[1]: Reached target nss-lookup.target. Mar 17 19:39:41.822059 systemd[1]: Finished systemd-update-done.service. Mar 17 19:39:41.822595 systemd[1]: Reached target sysinit.target. Mar 17 19:39:41.823130 systemd[1]: Started motdgen.path. Mar 17 19:39:41.823563 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path. Mar 17 19:39:41.824184 systemd[1]: Started logrotate.timer. Mar 17 19:39:41.824695 systemd[1]: Started mdadm.timer. Mar 17 19:39:41.825145 systemd[1]: Started systemd-tmpfiles-clean.timer. Mar 17 19:39:41.825600 systemd[1]: update-engine-stub.timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 17 19:39:41.825629 systemd[1]: Reached target paths.target. Mar 17 19:39:41.826099 systemd[1]: Reached target timers.target. Mar 17 19:39:41.826793 systemd[1]: Listening on dbus.socket. Mar 17 19:39:41.828269 systemd[1]: Starting docker.socket... Mar 17 19:39:41.830552 systemd[1]: Listening on sshd.socket. Mar 17 19:39:41.831207 systemd[1]: systemd-pcrphase-sysinit.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 19:39:41.831695 systemd[1]: Listening on docker.socket. Mar 17 19:39:41.832276 systemd[1]: Reached target sockets.target. Mar 17 19:39:41.832775 systemd[1]: Reached target basic.target. Mar 17 19:39:41.833413 systemd[1]: System is tainted: cgroupsv1 Mar 17 19:39:41.833453 systemd[1]: addon-config@usr-share-oem.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.833475 systemd[1]: addon-run@usr-share-oem.service was skipped because no trigger condition checks were met. Mar 17 19:39:41.834421 systemd[1]: Starting containerd.service... Mar 17 19:39:41.836635 systemd[1]: Starting coreos-metadata-sshkeys@core.service... Mar 17 19:39:41.838120 systemd[1]: Starting dbus.service... Mar 17 19:39:41.839597 systemd[1]: Starting enable-oem-cloudinit.service... Mar 17 19:39:41.841182 systemd[1]: Starting extend-filesystems.service... Mar 17 19:39:41.841690 systemd[1]: flatcar-setup-environment.service was skipped because of an unmet condition check (ConditionPathExists=/usr/share/oem/bin/flatcar-setup-environment). Mar 17 19:39:41.842789 systemd[1]: Starting motdgen.service... Mar 17 19:39:41.847844 systemd[1]: Starting prepare-helm.service... 
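The trust-anchor dump above is systemd-resolved loading the root DNSSEC key (key tag 20326) plus the usual negative anchors for private and reverse zones. For checking what resolved actually ended up using at runtime, a small sketch; the queried name is just an example:

# Show per-link DNS servers, DNSSEC mode and search domains as resolved sees them.
resolvectl status

# Resolve a name through the local stub and report whether the answer was authenticated.
resolvectl query example.com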
Mar 17 19:39:41.849356 systemd[1]: Starting ssh-key-proc-cmdline.service... Mar 17 19:39:41.852470 systemd[1]: Starting sshd-keygen.service... Mar 17 19:39:41.859658 jq[1224]: false Mar 17 19:39:41.861816 systemd[1]: Starting systemd-logind.service... Mar 17 19:39:41.862408 systemd[1]: systemd-pcrphase.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 19:39:41.862477 systemd[1]: tcsd.service was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 17 19:39:41.880616 jq[1238]: true Mar 17 19:39:41.863663 systemd[1]: Starting update-engine.service... Mar 17 19:39:41.869345 systemd[1]: Starting update-ssh-keys-after-ignition.service... Mar 17 19:39:41.876887 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 17 19:39:41.878248 systemd[1]: Condition check resulted in enable-oem-cloudinit.service being skipped. Mar 17 19:39:41.880179 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 17 19:39:41.881487 systemd[1]: Finished ssh-key-proc-cmdline.service. Mar 17 19:39:41.895033 extend-filesystems[1225]: Found loop1 Mar 17 19:39:41.898238 systemd[1]: Created slice system-sshd.slice. Mar 17 19:39:41.902970 tar[1244]: linux-amd64/helm Mar 17 19:39:41.910546 extend-filesystems[1225]: Found vda Mar 17 19:39:41.910546 extend-filesystems[1225]: Found vda1 Mar 17 19:39:41.910546 extend-filesystems[1225]: Found vda2 Mar 17 19:39:41.910546 extend-filesystems[1225]: Found vda3 Mar 17 19:39:41.910546 extend-filesystems[1225]: Found usr Mar 17 19:39:41.910546 extend-filesystems[1225]: Found vda4 Mar 17 19:39:41.910546 extend-filesystems[1225]: Found vda6 Mar 17 19:39:41.910546 extend-filesystems[1225]: Found vda7 Mar 17 19:39:41.910546 extend-filesystems[1225]: Found vda9 Mar 17 19:39:41.910546 extend-filesystems[1225]: Checking size of /dev/vda9 Mar 17 19:39:41.936402 jq[1249]: true Mar 17 19:39:41.930674 systemd[1]: motdgen.service: Deactivated successfully. Mar 17 19:39:41.930983 systemd-timesyncd[1176]: Contacted time server 23.111.186.186:123 (0.flatcar.pool.ntp.org). Mar 17 19:39:41.931036 systemd-timesyncd[1176]: Initial clock synchronization to Mon 2025-03-17 19:39:41.897058 UTC. Mar 17 19:39:41.932276 systemd[1]: Finished motdgen.service. Mar 17 19:39:41.948866 extend-filesystems[1225]: Resized partition /dev/vda9 Mar 17 19:39:41.950160 dbus-daemon[1223]: [system] SELinux support is enabled Mar 17 19:39:41.950347 systemd[1]: Started dbus.service. Mar 17 19:39:41.956418 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 17 19:39:41.956444 systemd[1]: Reached target system-config.target. Mar 17 19:39:41.956943 systemd[1]: user-cloudinit-proc-cmdline.service was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 17 19:39:41.956979 systemd[1]: Reached target user-config.target. 
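extend-filesystems has just grown partition /dev/vda9, and the following lines show resize2fs enlarging the ext4 filesystem online to match. A hedged sketch of the equivalent manual steps; growpart comes from cloud-utils and may not be present on every image:

# Grow partition 9 of /dev/vda into the free space behind it.
growpart /dev/vda 9

# Grow the mounted ext4 filesystem to the new partition size (online resize).
resize2fs /dev/vda9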
Mar 17 19:39:41.964452 extend-filesystems[1279]: resize2fs 1.46.5 (30-Dec-2021) Mar 17 19:39:42.015984 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 2014203 blocks Mar 17 19:39:42.021345 env[1250]: time="2025-03-17T19:39:42.021297833Z" level=info msg="starting containerd" revision=92b3a9d6f1b3bcc6dc74875cfdea653fe39f09c2 version=1.6.16 Mar 17 19:39:42.027969 kernel: EXT4-fs (vda9): resized filesystem to 2014203 Mar 17 19:39:42.097816 env[1250]: time="2025-03-17T19:39:42.059353764Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 17 19:39:42.097881 extend-filesystems[1279]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 17 19:39:42.097881 extend-filesystems[1279]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 17 19:39:42.097881 extend-filesystems[1279]: The filesystem on /dev/vda9 is now 2014203 (4k) blocks long. Mar 17 19:39:42.093580 systemd-logind[1236]: Watching system buttons on /dev/input/event1 (Power Button) Mar 17 19:39:42.103063 env[1250]: time="2025-03-17T19:39:42.099200499Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 17 19:39:42.103097 extend-filesystems[1225]: Resized filesystem in /dev/vda9 Mar 17 19:39:42.093615 systemd-logind[1236]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 17 19:39:42.106748 env[1250]: time="2025-03-17T19:39:42.103619076Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.15.179-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 17 19:39:42.106748 env[1250]: time="2025-03-17T19:39:42.103699337Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 17 19:39:42.093917 systemd-logind[1236]: New seat seat0. Mar 17 19:39:42.102467 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 17 19:39:42.102835 systemd[1]: Finished extend-filesystems.service. Mar 17 19:39:42.106184 systemd[1]: Started systemd-logind.service. Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.108231068Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.108265710Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.108301362Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.108316278Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.108435390Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.108742147Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." 
type=io.containerd.snapshotter.v1 Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.108966513Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.108986959Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." type=io.containerd.metadata.v1 Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.109058652Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Mar 17 19:39:42.109193 env[1250]: time="2025-03-17T19:39:42.109073258Z" level=info msg="metadata content store policy set" policy=shared Mar 17 19:39:42.135006 bash[1284]: Updated "/home/core/.ssh/authorized_keys" Mar 17 19:39:42.135770 systemd[1]: Finished update-ssh-keys-after-ignition.service. Mar 17 19:39:42.137894 update_engine[1237]: I0317 19:39:42.134636 1237 main.cc:92] Flatcar Update Engine starting Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.150834987Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.150903421Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.150920997Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.150980462Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151004437Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151019953Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151086097Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151102053Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151118549Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151133906Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151170137Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151184773Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151336018Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." 
type=io.containerd.runtime.v2 Mar 17 19:39:42.151969 env[1250]: time="2025-03-17T19:39:42.151447381Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 17 19:39:42.152318 env[1250]: time="2025-03-17T19:39:42.151901543Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 17 19:39:42.152318 env[1250]: time="2025-03-17T19:39:42.151933265Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152655223Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152734075Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152752750Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152766297Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152828403Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152845268Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152862184Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152877750Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152890648Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.152922290Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.153088330Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.153106566Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.153140038Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.153900 env[1250]: time="2025-03-17T19:39:42.153157074Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 17 19:39:42.152679 systemd[1]: Started update-engine.service. Mar 17 19:39:42.154315 env[1250]: time="2025-03-17T19:39:42.153173090Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1 Mar 17 19:39:42.154315 env[1250]: time="2025-03-17T19:39:42.153190356Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." 
type=io.containerd.internal.v1 Mar 17 19:39:42.154315 env[1250]: time="2025-03-17T19:39:42.153232526Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin" Mar 17 19:39:42.154315 env[1250]: time="2025-03-17T19:39:42.153275156Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." type=io.containerd.grpc.v1 Mar 17 19:39:42.154412 env[1250]: time="2025-03-17T19:39:42.153529126Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.6 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 19:39:42.159773 env[1250]: time="2025-03-17T19:39:42.153884001Z" level=info msg="Connect containerd service" Mar 17 19:39:42.159773 env[1250]: time="2025-03-17T19:39:42.154995137Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 19:39:42.159773 env[1250]: time="2025-03-17T19:39:42.156293228Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 19:39:42.159773 env[1250]: time="2025-03-17T19:39:42.156873230Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 19:39:42.159773 env[1250]: time="2025-03-17T19:39:42.156923997Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Mar 17 19:39:42.159773 env[1250]: time="2025-03-17T19:39:42.157021244Z" level=info msg="containerd successfully booted in 0.150347s" Mar 17 19:39:42.160017 update_engine[1237]: I0317 19:39:42.155961 1237 update_check_scheduler.cc:74] Next update check in 11m27s Mar 17 19:39:42.154850 systemd[1]: Started locksmithd.service. Mar 17 19:39:42.157078 systemd[1]: Started containerd.service. Mar 17 19:39:42.164288 env[1250]: time="2025-03-17T19:39:42.163524956Z" level=info msg="Start subscribing containerd event" Mar 17 19:39:42.165271 env[1250]: time="2025-03-17T19:39:42.165249416Z" level=info msg="Start recovering state" Mar 17 19:39:42.165427 env[1250]: time="2025-03-17T19:39:42.165412837Z" level=info msg="Start event monitor" Mar 17 19:39:42.165506 env[1250]: time="2025-03-17T19:39:42.165491298Z" level=info msg="Start snapshots syncer" Mar 17 19:39:42.165599 env[1250]: time="2025-03-17T19:39:42.165584515Z" level=info msg="Start cni network conf syncer for default" Mar 17 19:39:42.165673 env[1250]: time="2025-03-17T19:39:42.165658768Z" level=info msg="Start streaming server" Mar 17 19:39:42.195075 systemd-networkd[1042]: eth0: Gained IPv6LL Mar 17 19:39:42.197579 systemd[1]: Finished systemd-networkd-wait-online.service. Mar 17 19:39:42.198379 systemd[1]: Reached target network-online.target. Mar 17 19:39:42.204845 systemd[1]: Starting kubelet.service... Mar 17 19:39:42.868287 tar[1244]: linux-amd64/LICENSE Mar 17 19:39:42.869499 tar[1244]: linux-amd64/README.md Mar 17 19:39:42.874096 systemd[1]: Finished prepare-helm.service. Mar 17 19:39:42.892163 locksmithd[1292]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 17 19:39:43.829480 systemd[1]: Started kubelet.service. Mar 17 19:39:44.267820 sshd_keygen[1262]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 19:39:44.294150 systemd[1]: Finished sshd-keygen.service. Mar 17 19:39:44.296234 systemd[1]: Starting issuegen.service... Mar 17 19:39:44.297643 systemd[1]: Started sshd@0-172.24.4.218:22-172.24.4.1:42338.service. Mar 17 19:39:44.306508 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 19:39:44.306718 systemd[1]: Finished issuegen.service. Mar 17 19:39:44.308568 systemd[1]: Starting systemd-user-sessions.service... Mar 17 19:39:44.320334 systemd[1]: Finished systemd-user-sessions.service. Mar 17 19:39:44.322215 systemd[1]: Started getty@tty1.service. Mar 17 19:39:44.323866 systemd[1]: Started serial-getty@ttyS0.service. Mar 17 19:39:44.325265 systemd[1]: Reached target getty.target. Mar 17 19:39:45.337781 sshd[1327]: Accepted publickey for core from 172.24.4.1 port 42338 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:39:45.342148 sshd[1327]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:39:45.375842 systemd-logind[1236]: New session 1 of user core. Mar 17 19:39:45.378395 systemd[1]: Created slice user-500.slice. Mar 17 19:39:45.382560 systemd[1]: Starting user-runtime-dir@500.service... Mar 17 19:39:45.412985 systemd[1]: Finished user-runtime-dir@500.service. Mar 17 19:39:45.420382 systemd[1]: Starting user@500.service... Mar 17 19:39:45.429331 (systemd)[1340]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:39:45.523229 systemd[1340]: Queued start job for default target default.target. Mar 17 19:39:45.524149 systemd[1340]: Reached target paths.target. Mar 17 19:39:45.524252 systemd[1340]: Reached target sockets.target. 
Mar 17 19:39:45.524345 systemd[1340]: Reached target timers.target. Mar 17 19:39:45.524434 systemd[1340]: Reached target basic.target. Mar 17 19:39:45.524611 systemd[1]: Started user@500.service. Mar 17 19:39:45.526018 systemd[1]: Started session-1.scope. Mar 17 19:39:45.526714 systemd[1340]: Reached target default.target. Mar 17 19:39:45.526884 systemd[1340]: Startup finished in 91ms. Mar 17 19:39:45.594175 kubelet[1311]: E0317 19:39:45.594095 1311 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 19:39:45.597623 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 19:39:45.597993 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 19:39:46.044939 systemd[1]: Started sshd@1-172.24.4.218:22-172.24.4.1:52514.service. Mar 17 19:39:48.117931 sshd[1350]: Accepted publickey for core from 172.24.4.1 port 52514 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:39:48.120555 sshd[1350]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:39:48.131265 systemd-logind[1236]: New session 2 of user core. Mar 17 19:39:48.132029 systemd[1]: Started session-2.scope. Mar 17 19:39:48.918205 sshd[1350]: pam_unix(sshd:session): session closed for user core Mar 17 19:39:48.918593 systemd[1]: Started sshd@2-172.24.4.218:22-172.24.4.1:52522.service. Mar 17 19:39:48.927875 systemd-logind[1236]: Session 2 logged out. Waiting for processes to exit. Mar 17 19:39:48.929555 systemd[1]: sshd@1-172.24.4.218:22-172.24.4.1:52514.service: Deactivated successfully. Mar 17 19:39:48.931332 systemd[1]: session-2.scope: Deactivated successfully. Mar 17 19:39:48.936312 systemd-logind[1236]: Removed session 2. Mar 17 19:39:48.952439 coreos-metadata[1222]: Mar 17 19:39:48.952 WARN failed to locate config-drive, using the metadata service API instead Mar 17 19:39:49.051331 coreos-metadata[1222]: Mar 17 19:39:49.051 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Mar 17 19:39:49.301251 coreos-metadata[1222]: Mar 17 19:39:49.301 INFO Fetch successful Mar 17 19:39:49.301251 coreos-metadata[1222]: Mar 17 19:39:49.301 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Mar 17 19:39:49.317131 coreos-metadata[1222]: Mar 17 19:39:49.317 INFO Fetch successful Mar 17 19:39:49.319916 unknown[1222]: wrote ssh authorized keys file for user: core Mar 17 19:39:49.378023 update-ssh-keys[1361]: Updated "/home/core/.ssh/authorized_keys" Mar 17 19:39:49.378750 systemd[1]: Finished coreos-metadata-sshkeys@core.service. Mar 17 19:39:49.379558 systemd[1]: Reached target multi-user.target. Mar 17 19:39:49.382698 systemd[1]: Starting systemd-update-utmp-runlevel.service... Mar 17 19:39:49.404252 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Mar 17 19:39:49.404732 systemd[1]: Finished systemd-update-utmp-runlevel.service. Mar 17 19:39:49.405707 systemd[1]: Startup finished in 8.236s (kernel) + 14.500s (userspace) = 22.736s. 
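
The kubelet failures recorded just above (and repeated at every restart later in this log) all trace back to the missing /var/lib/kubelet/config.yaml. On a kubeadm-managed node that path normally holds a KubeletConfiguration document written during `kubeadm init`/`kubeadm join`; it is not something this boot log shows being provisioned. The sketch below only illustrates the expected shape of that file, and every value in it is an assumption for demonstration, not recovered from this host.

```python
# Illustrative sketch: create the KubeletConfiguration file the kubelet errors
# above complain about. All field values are assumptions; on a kubeadm-managed
# node this file is normally generated by `kubeadm init` / `kubeadm join`.
from pathlib import Path

KUBELET_CONFIG = """\
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
cgroupDriver: systemd        # assumed value; must match the container runtime
staticPodPath: /etc/kubernetes/manifests
authentication:
  anonymous:
    enabled: false
"""

def write_kubelet_config(path: str = "/var/lib/kubelet/config.yaml") -> None:
    """Write a minimal config (and its parent directory) if none exists."""
    target = Path(path)
    target.parent.mkdir(parents=True, exist_ok=True)
    if not target.exists():
        target.write_text(KUBELET_CONFIG)

if __name__ == "__main__":
    write_kubelet_config()
```

With such a file in place, the `kubelet.service: Scheduled restart job` loop seen further down would have a chance to start cleanly instead of exiting with status 1.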
Mar 17 19:39:50.425874 sshd[1355]: Accepted publickey for core from 172.24.4.1 port 52522 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:39:50.428447 sshd[1355]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:39:50.438728 systemd-logind[1236]: New session 3 of user core. Mar 17 19:39:50.439721 systemd[1]: Started session-3.scope. Mar 17 19:39:51.039724 sshd[1355]: pam_unix(sshd:session): session closed for user core Mar 17 19:39:51.045845 systemd[1]: sshd@2-172.24.4.218:22-172.24.4.1:52522.service: Deactivated successfully. Mar 17 19:39:51.048050 systemd-logind[1236]: Session 3 logged out. Waiting for processes to exit. Mar 17 19:39:51.048214 systemd[1]: session-3.scope: Deactivated successfully. Mar 17 19:39:51.050842 systemd-logind[1236]: Removed session 3. Mar 17 19:39:55.849817 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 17 19:39:55.850293 systemd[1]: Stopped kubelet.service. Mar 17 19:39:55.853379 systemd[1]: Starting kubelet.service... Mar 17 19:39:56.117817 systemd[1]: Started kubelet.service. Mar 17 19:39:56.220479 kubelet[1380]: E0317 19:39:56.220447 1380 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 19:39:56.225537 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 19:39:56.225679 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 19:40:01.038101 systemd[1]: Started sshd@3-172.24.4.218:22-172.24.4.1:42610.service. Mar 17 19:40:02.212891 sshd[1387]: Accepted publickey for core from 172.24.4.1 port 42610 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:40:02.216209 sshd[1387]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:40:02.226292 systemd-logind[1236]: New session 4 of user core. Mar 17 19:40:02.227048 systemd[1]: Started session-4.scope. Mar 17 19:40:02.857177 sshd[1387]: pam_unix(sshd:session): session closed for user core Mar 17 19:40:02.858108 systemd[1]: Started sshd@4-172.24.4.218:22-172.24.4.1:42614.service. Mar 17 19:40:02.863632 systemd[1]: sshd@3-172.24.4.218:22-172.24.4.1:42610.service: Deactivated successfully. Mar 17 19:40:02.868804 systemd[1]: session-4.scope: Deactivated successfully. Mar 17 19:40:02.869737 systemd-logind[1236]: Session 4 logged out. Waiting for processes to exit. Mar 17 19:40:02.874798 systemd-logind[1236]: Removed session 4. Mar 17 19:40:04.186507 sshd[1392]: Accepted publickey for core from 172.24.4.1 port 42614 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:40:04.189781 sshd[1392]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:40:04.200115 systemd-logind[1236]: New session 5 of user core. Mar 17 19:40:04.200851 systemd[1]: Started session-5.scope. Mar 17 19:40:04.781704 sshd[1392]: pam_unix(sshd:session): session closed for user core Mar 17 19:40:04.785736 systemd[1]: Started sshd@5-172.24.4.218:22-172.24.4.1:45722.service. Mar 17 19:40:04.791934 systemd[1]: sshd@4-172.24.4.218:22-172.24.4.1:42614.service: Deactivated successfully. Mar 17 19:40:04.796458 systemd-logind[1236]: Session 5 logged out. Waiting for processes to exit. 
Mar 17 19:40:04.796599 systemd[1]: session-5.scope: Deactivated successfully. Mar 17 19:40:04.802214 systemd-logind[1236]: Removed session 5. Mar 17 19:40:06.072242 sshd[1399]: Accepted publickey for core from 172.24.4.1 port 45722 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:40:06.075495 sshd[1399]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:40:06.084067 systemd-logind[1236]: New session 6 of user core. Mar 17 19:40:06.086064 systemd[1]: Started session-6.scope. Mar 17 19:40:06.477058 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 17 19:40:06.477499 systemd[1]: Stopped kubelet.service. Mar 17 19:40:06.481352 systemd[1]: Starting kubelet.service... Mar 17 19:40:06.765926 systemd[1]: Started kubelet.service. Mar 17 19:40:06.893308 sshd[1399]: pam_unix(sshd:session): session closed for user core Mar 17 19:40:06.896986 systemd[1]: Started sshd@6-172.24.4.218:22-172.24.4.1:45732.service. Mar 17 19:40:06.902248 systemd[1]: sshd@5-172.24.4.218:22-172.24.4.1:45722.service: Deactivated successfully. Mar 17 19:40:06.903807 systemd[1]: session-6.scope: Deactivated successfully. Mar 17 19:40:06.907753 systemd-logind[1236]: Session 6 logged out. Waiting for processes to exit. Mar 17 19:40:06.916363 systemd-logind[1236]: Removed session 6. Mar 17 19:40:06.922024 kubelet[1414]: E0317 19:40:06.921851 1414 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 19:40:06.924810 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 19:40:06.925004 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 19:40:08.235701 sshd[1420]: Accepted publickey for core from 172.24.4.1 port 45732 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:40:08.238171 sshd[1420]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:40:08.248917 systemd[1]: Started session-7.scope. Mar 17 19:40:08.249823 systemd-logind[1236]: New session 7 of user core. Mar 17 19:40:08.818390 sudo[1427]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 17 19:40:08.819683 sudo[1427]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 19:40:08.831269 dbus-daemon[1223]: \xd0]\x9f\x84?V: received setenforce notice (enforcing=-461167344) Mar 17 19:40:08.835160 sudo[1427]: pam_unix(sudo:session): session closed for user root Mar 17 19:40:09.028473 sshd[1420]: pam_unix(sshd:session): session closed for user core Mar 17 19:40:09.033190 systemd[1]: Started sshd@7-172.24.4.218:22-172.24.4.1:45738.service. Mar 17 19:40:09.040634 systemd[1]: sshd@6-172.24.4.218:22-172.24.4.1:45732.service: Deactivated successfully. Mar 17 19:40:09.044356 systemd-logind[1236]: Session 7 logged out. Waiting for processes to exit. Mar 17 19:40:09.044521 systemd[1]: session-7.scope: Deactivated successfully. Mar 17 19:40:09.050521 systemd-logind[1236]: Removed session 7. 
Mar 17 19:40:10.348841 sshd[1429]: Accepted publickey for core from 172.24.4.1 port 45738 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:40:10.351615 sshd[1429]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:40:10.361699 systemd-logind[1236]: New session 8 of user core. Mar 17 19:40:10.362501 systemd[1]: Started session-8.scope. Mar 17 19:40:10.904794 sudo[1436]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 17 19:40:10.905368 sudo[1436]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 19:40:10.911584 sudo[1436]: pam_unix(sudo:session): session closed for user root Mar 17 19:40:10.921866 sudo[1435]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Mar 17 19:40:10.922414 sudo[1435]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 19:40:10.943922 systemd[1]: Stopping audit-rules.service... Mar 17 19:40:10.945000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Mar 17 19:40:10.947162 auditctl[1439]: No rules Mar 17 19:40:10.949801 kernel: kauditd_printk_skb: 148 callbacks suppressed Mar 17 19:40:10.949892 kernel: audit: type=1305 audit(1742240410.945:154): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Mar 17 19:40:10.950513 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 19:40:10.951023 systemd[1]: Stopped audit-rules.service. Mar 17 19:40:10.958450 systemd[1]: Starting audit-rules.service... Mar 17 19:40:10.945000 audit[1439]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffec15c980 a2=420 a3=0 items=0 ppid=1 pid=1439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:10.945000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D44 Mar 17 19:40:10.984859 kernel: audit: type=1300 audit(1742240410.945:154): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffec15c980 a2=420 a3=0 items=0 ppid=1 pid=1439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:10.985046 kernel: audit: type=1327 audit(1742240410.945:154): proctitle=2F7362696E2F617564697463746C002D44 Mar 17 19:40:10.950000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:10.997031 kernel: audit: type=1131 audit(1742240410.950:155): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:11.017876 augenrules[1457]: No rules Mar 17 19:40:11.019994 systemd[1]: Finished audit-rules.service. Mar 17 19:40:11.032211 kernel: audit: type=1130 audit(1742240411.019:156): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:40:11.019000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:11.031251 sudo[1435]: pam_unix(sudo:session): session closed for user root Mar 17 19:40:11.030000 audit[1435]: USER_END pid=1435 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:11.045982 kernel: audit: type=1106 audit(1742240411.030:157): pid=1435 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:11.030000 audit[1435]: CRED_DISP pid=1435 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:11.058017 kernel: audit: type=1104 audit(1742240411.030:158): pid=1435 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:11.178612 sshd[1429]: pam_unix(sshd:session): session closed for user core Mar 17 19:40:11.184394 systemd[1]: Started sshd@8-172.24.4.218:22-172.24.4.1:45748.service. Mar 17 19:40:11.182000 audit[1429]: USER_END pid=1429 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:11.191636 systemd[1]: sshd@7-172.24.4.218:22-172.24.4.1:45738.service: Deactivated successfully. Mar 17 19:40:11.204022 kernel: audit: type=1106 audit(1742240411.182:159): pid=1429 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:11.205017 systemd[1]: session-8.scope: Deactivated successfully. Mar 17 19:40:11.205098 systemd-logind[1236]: Session 8 logged out. Waiting for processes to exit. Mar 17 19:40:11.210901 systemd-logind[1236]: Removed session 8. Mar 17 19:40:11.183000 audit[1429]: CRED_DISP pid=1429 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:11.227015 kernel: audit: type=1104 audit(1742240411.183:160): pid=1429 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:11.227114 kernel: audit: type=1130 audit(1742240411.183:161): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-172.24.4.218:22-172.24.4.1:45748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:40:11.183000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-172.24.4.218:22-172.24.4.1:45748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:11.191000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-172.24.4.218:22-172.24.4.1:45738 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:12.509000 audit[1462]: USER_ACCT pid=1462 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:12.510467 sshd[1462]: Accepted publickey for core from 172.24.4.1 port 45748 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:40:12.511000 audit[1462]: CRED_ACQ pid=1462 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:12.511000 audit[1462]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0cea5fd0 a2=3 a3=0 items=0 ppid=1 pid=1462 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:12.511000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:40:12.513548 sshd[1462]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:40:12.523812 systemd-logind[1236]: New session 9 of user core. Mar 17 19:40:12.524533 systemd[1]: Started session-9.scope. Mar 17 19:40:12.536000 audit[1462]: USER_START pid=1462 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:12.538000 audit[1467]: CRED_ACQ pid=1467 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:13.064000 audit[1468]: USER_ACCT pid=1468 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:13.065680 sudo[1468]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 17 19:40:13.065000 audit[1468]: CRED_REFR pid=1468 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:13.067062 sudo[1468]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 19:40:13.070000 audit[1468]: USER_START pid=1468 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:13.118704 systemd[1]: Starting docker.service... 
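
The audit records above (and the NETFILTER_CFG records during docker startup below) carry the executed command line as a hex-encoded, NUL-separated `proctitle=` field. A small decoder makes those fields readable; this is a generic sketch, not a tool used on this host.

```python
# Sketch: decode an audit PROCTITLE value into the original command line.
# Arguments in the field are separated by NUL bytes.
def decode_proctitle(hex_value: str) -> str:
    raw = bytes.fromhex(hex_value)
    return " ".join(part.decode("utf-8", "replace")
                    for part in raw.split(b"\x00") if part)

# Value copied verbatim from the auditctl record above:
print(decode_proctitle("2F7362696E2F617564697463746C002D44"))
# -> /sbin/auditctl -D
```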
Mar 17 19:40:13.180525 env[1478]: time="2025-03-17T19:40:13.180459023Z" level=info msg="Starting up" Mar 17 19:40:13.182481 env[1478]: time="2025-03-17T19:40:13.182434023Z" level=info msg="parsed scheme: \"unix\"" module=grpc Mar 17 19:40:13.182481 env[1478]: time="2025-03-17T19:40:13.182473787Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Mar 17 19:40:13.182675 env[1478]: time="2025-03-17T19:40:13.182498977Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Mar 17 19:40:13.182675 env[1478]: time="2025-03-17T19:40:13.182512889Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Mar 17 19:40:13.184476 env[1478]: time="2025-03-17T19:40:13.184444830Z" level=info msg="parsed scheme: \"unix\"" module=grpc Mar 17 19:40:13.184476 env[1478]: time="2025-03-17T19:40:13.184463921Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Mar 17 19:40:13.184653 env[1478]: time="2025-03-17T19:40:13.184482301Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Mar 17 19:40:13.184653 env[1478]: time="2025-03-17T19:40:13.184493038Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Mar 17 19:40:13.428474 env[1478]: time="2025-03-17T19:40:13.428290740Z" level=warning msg="Your kernel does not support cgroup blkio weight" Mar 17 19:40:13.428474 env[1478]: time="2025-03-17T19:40:13.428344185Z" level=warning msg="Your kernel does not support cgroup blkio weight_device" Mar 17 19:40:13.429325 env[1478]: time="2025-03-17T19:40:13.429275917Z" level=info msg="Loading containers: start." 
Mar 17 19:40:13.583000 audit[1509]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1509 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.583000 audit[1509]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffcf8664dc0 a2=0 a3=7ffcf8664dac items=0 ppid=1478 pid=1509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.583000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Mar 17 19:40:13.587000 audit[1511]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1511 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.587000 audit[1511]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7ffc4c7223b0 a2=0 a3=7ffc4c72239c items=0 ppid=1478 pid=1511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.587000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Mar 17 19:40:13.590000 audit[1513]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1513 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.590000 audit[1513]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffe203eadf0 a2=0 a3=7ffe203eaddc items=0 ppid=1478 pid=1513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.590000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Mar 17 19:40:13.594000 audit[1515]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1515 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.594000 audit[1515]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffc07e8da40 a2=0 a3=7ffc07e8da2c items=0 ppid=1478 pid=1515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.594000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Mar 17 19:40:13.598000 audit[1517]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_rule pid=1517 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.598000 audit[1517]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd68fc0680 a2=0 a3=7ffd68fc066c items=0 ppid=1478 pid=1517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.598000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6A0052455455524E Mar 17 19:40:13.616000 audit[1522]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_rule pid=1522 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Mar 17 19:40:13.616000 audit[1522]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffc0f7b0a60 a2=0 a3=7ffc0f7b0a4c items=0 ppid=1478 pid=1522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.616000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D32002D6A0052455455524E Mar 17 19:40:13.632000 audit[1524]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1524 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.632000 audit[1524]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffe7069a510 a2=0 a3=7ffe7069a4fc items=0 ppid=1478 pid=1524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.632000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Mar 17 19:40:13.636000 audit[1526]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_rule pid=1526 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.636000 audit[1526]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffebe670c10 a2=0 a3=7ffebe670bfc items=0 ppid=1478 pid=1526 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.636000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Mar 17 19:40:13.639000 audit[1528]: NETFILTER_CFG table=filter:10 family=2 entries=2 op=nft_register_chain pid=1528 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.639000 audit[1528]: SYSCALL arch=c000003e syscall=46 success=yes exit=308 a0=3 a1=7ffde0779520 a2=0 a3=7ffde077950c items=0 ppid=1478 pid=1528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.639000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 19:40:13.654000 audit[1532]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_unregister_rule pid=1532 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.654000 audit[1532]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffd5b2c4180 a2=0 a3=7ffd5b2c416c items=0 ppid=1478 pid=1532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.654000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Mar 17 19:40:13.659000 audit[1533]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1533 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.659000 audit[1533]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffdcbfac270 a2=0 a3=7ffdcbfac25c items=0 ppid=1478 
pid=1533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.659000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 19:40:13.687070 kernel: Initializing XFRM netlink socket Mar 17 19:40:13.780395 env[1478]: time="2025-03-17T19:40:13.780359555Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address" Mar 17 19:40:13.811000 audit[1541]: NETFILTER_CFG table=nat:13 family=2 entries=2 op=nft_register_chain pid=1541 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.811000 audit[1541]: SYSCALL arch=c000003e syscall=46 success=yes exit=492 a0=3 a1=7ffcd4b453d0 a2=0 a3=7ffcd4b453bc items=0 ppid=1478 pid=1541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.811000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Mar 17 19:40:13.833000 audit[1544]: NETFILTER_CFG table=nat:14 family=2 entries=1 op=nft_register_rule pid=1544 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.833000 audit[1544]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffd4a6f7010 a2=0 a3=7ffd4a6f6ffc items=0 ppid=1478 pid=1544 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.833000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Mar 17 19:40:13.836000 audit[1547]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=1547 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.836000 audit[1547]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffd57596ea0 a2=0 a3=7ffd57596e8c items=0 ppid=1478 pid=1547 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.836000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B657230002D6F00646F636B657230002D6A00414343455054 Mar 17 19:40:13.838000 audit[1549]: NETFILTER_CFG table=filter:16 family=2 entries=1 op=nft_register_rule pid=1549 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.838000 audit[1549]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffc955baa40 a2=0 a3=7ffc955baa2c items=0 ppid=1478 pid=1549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.838000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B6572300000002D6F00646F636B657230002D6A00414343455054 Mar 17 19:40:13.839000 audit[1551]: NETFILTER_CFG 
table=nat:17 family=2 entries=2 op=nft_register_chain pid=1551 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.839000 audit[1551]: SYSCALL arch=c000003e syscall=46 success=yes exit=356 a0=3 a1=7fff5295b620 a2=0 a3=7fff5295b60c items=0 ppid=1478 pid=1551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.839000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Mar 17 19:40:13.841000 audit[1553]: NETFILTER_CFG table=nat:18 family=2 entries=2 op=nft_register_chain pid=1553 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.841000 audit[1553]: SYSCALL arch=c000003e syscall=46 success=yes exit=444 a0=3 a1=7ffd82236190 a2=0 a3=7ffd8223617c items=0 ppid=1478 pid=1553 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.841000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Mar 17 19:40:13.843000 audit[1555]: NETFILTER_CFG table=filter:19 family=2 entries=1 op=nft_register_rule pid=1555 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.843000 audit[1555]: SYSCALL arch=c000003e syscall=46 success=yes exit=304 a0=3 a1=7fffe6c5b120 a2=0 a3=7fffe6c5b10c items=0 ppid=1478 pid=1555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.843000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6A00444F434B4552 Mar 17 19:40:13.852000 audit[1558]: NETFILTER_CFG table=filter:20 family=2 entries=1 op=nft_register_rule pid=1558 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.852000 audit[1558]: SYSCALL arch=c000003e syscall=46 success=yes exit=508 a0=3 a1=7ffc51c590e0 a2=0 a3=7ffc51c590cc items=0 ppid=1478 pid=1558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.852000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Mar 17 19:40:13.854000 audit[1560]: NETFILTER_CFG table=filter:21 family=2 entries=1 op=nft_register_rule pid=1560 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.854000 audit[1560]: SYSCALL arch=c000003e syscall=46 success=yes exit=240 a0=3 a1=7ffe5f7b9320 a2=0 a3=7ffe5f7b930c items=0 ppid=1478 pid=1560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.854000 audit: PROCTITLE 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Mar 17 19:40:13.855000 audit[1562]: NETFILTER_CFG table=filter:22 family=2 entries=1 op=nft_register_rule pid=1562 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.855000 audit[1562]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7fff56ecb230 a2=0 a3=7fff56ecb21c items=0 ppid=1478 pid=1562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.855000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Mar 17 19:40:13.857000 audit[1564]: NETFILTER_CFG table=filter:23 family=2 entries=1 op=nft_register_rule pid=1564 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.857000 audit[1564]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffc541d4000 a2=0 a3=7ffc541d3fec items=0 ppid=1478 pid=1564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.857000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Mar 17 19:40:13.859148 systemd-networkd[1042]: docker0: Link UP Mar 17 19:40:13.870000 audit[1568]: NETFILTER_CFG table=filter:24 family=2 entries=1 op=nft_unregister_rule pid=1568 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.870000 audit[1568]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fff78fa3980 a2=0 a3=7fff78fa396c items=0 ppid=1478 pid=1568 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.870000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Mar 17 19:40:13.875000 audit[1569]: NETFILTER_CFG table=filter:25 family=2 entries=1 op=nft_register_rule pid=1569 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:13.875000 audit[1569]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffd75292890 a2=0 a3=7ffd7529287c items=0 ppid=1478 pid=1569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:13.875000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 19:40:13.877077 env[1478]: time="2025-03-17T19:40:13.877042798Z" level=info msg="Loading containers: done." 
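
A few entries earlier the daemon noted that docker0 defaults to 172.17.0.0/16 and that `--bip` can set a preferred address. As an illustrative aside (nothing in this log shows it being done here), the same preference is usually made persistent through `/etc/docker/daemon.json`; the subnet below is an arbitrary example.

```python
# Illustrative sketch: persist a preferred docker0 address via daemon.json.
# 172.30.0.1/24 is an arbitrary example value, not taken from this host.
import json
from pathlib import Path

def set_docker_bip(bip: str = "172.30.0.1/24",
                   path: str = "/etc/docker/daemon.json") -> None:
    target = Path(path)
    config = json.loads(target.read_text()) if target.exists() else {}
    config["bip"] = bip                      # same setting as the --bip flag
    target.parent.mkdir(parents=True, exist_ok=True)
    target.write_text(json.dumps(config, indent=2) + "\n")

if __name__ == "__main__":
    set_docker_bip()
```

The docker daemon has to be restarted for such a change to take effect.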
Mar 17 19:40:13.900887 env[1478]: time="2025-03-17T19:40:13.900852720Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 17 19:40:13.901052 env[1478]: time="2025-03-17T19:40:13.901031264Z" level=info msg="Docker daemon" commit=112bdf3343 graphdriver(s)=overlay2 version=20.10.23 Mar 17 19:40:13.901129 env[1478]: time="2025-03-17T19:40:13.901111782Z" level=info msg="Daemon has completed initialization" Mar 17 19:40:13.921000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:13.921976 systemd[1]: Started docker.service. Mar 17 19:40:13.927539 env[1478]: time="2025-03-17T19:40:13.927500646Z" level=info msg="API listen on /run/docker.sock" Mar 17 19:40:15.844473 env[1250]: time="2025-03-17T19:40:15.844362676Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\"" Mar 17 19:40:16.578499 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4128025953.mount: Deactivated successfully. Mar 17 19:40:16.956473 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 17 19:40:16.956828 systemd[1]: Stopped kubelet.service. Mar 17 19:40:16.955000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:16.961080 kernel: kauditd_printk_skb: 84 callbacks suppressed Mar 17 19:40:16.961205 kernel: audit: type=1130 audit(1742240416.955:196): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:16.971213 systemd[1]: Starting kubelet.service... Mar 17 19:40:16.955000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:16.980990 kernel: audit: type=1131 audit(1742240416.955:197): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:17.077881 systemd[1]: Started kubelet.service. Mar 17 19:40:17.077000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:17.084197 kernel: audit: type=1130 audit(1742240417.077:198): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:40:17.374891 kubelet[1616]: E0317 19:40:17.374526 1616 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 19:40:17.379803 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 19:40:17.380162 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 19:40:17.379000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 19:40:17.386972 kernel: audit: type=1131 audit(1742240417.379:199): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 19:40:19.496426 env[1250]: time="2025-03-17T19:40:19.496331506Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:19.498853 env[1250]: time="2025-03-17T19:40:19.498801220Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:19.502158 env[1250]: time="2025-03-17T19:40:19.502126392Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-apiserver:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:19.505792 env[1250]: time="2025-03-17T19:40:19.505770520Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:19.507839 env[1250]: time="2025-03-17T19:40:19.507767289Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\"" Mar 17 19:40:19.523578 env[1250]: time="2025-03-17T19:40:19.523548259Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\"" Mar 17 19:40:22.251106 env[1250]: time="2025-03-17T19:40:22.251026632Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:22.254157 env[1250]: time="2025-03-17T19:40:22.254106156Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:22.257612 env[1250]: time="2025-03-17T19:40:22.257546650Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-controller-manager:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:22.261048 env[1250]: time="2025-03-17T19:40:22.260981714Z" level=info msg="ImageCreate event 
&ImageCreate{Name:registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:22.262096 env[1250]: time="2025-03-17T19:40:22.262040410Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\"" Mar 17 19:40:22.274361 env[1250]: time="2025-03-17T19:40:22.274307841Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\"" Mar 17 19:40:24.168056 env[1250]: time="2025-03-17T19:40:24.167910960Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:24.173474 env[1250]: time="2025-03-17T19:40:24.173419959Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:24.176127 env[1250]: time="2025-03-17T19:40:24.176071641Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-scheduler:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:24.179849 env[1250]: time="2025-03-17T19:40:24.179802274Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:24.182175 env[1250]: time="2025-03-17T19:40:24.182116991Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\"" Mar 17 19:40:24.194747 env[1250]: time="2025-03-17T19:40:24.194665787Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\"" Mar 17 19:40:25.929765 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount896447187.mount: Deactivated successfully. 
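
The `var-lib-containerd-tmpmounts-containerd\x2dmount…` units above use systemd's unit-name escaping: "/" in the mount path is written as "-", and reserved characters such as a literal "-" become `\xNN`. A small decoder (a sketch, roughly what `systemd-escape --unescape` does) recovers the mount point:

```python
# Sketch: recover a mount path from an escaped systemd mount-unit name.
import re

def unit_to_path(unit: str) -> str:
    name = unit.removesuffix(".mount")
    name = name.replace("-", "/")                       # "-" encodes "/"
    return "/" + re.sub(r"\\x([0-9a-fA-F]{2})",
                        lambda m: chr(int(m.group(1), 16)), name)

# Unit name copied from the log line above:
print(unit_to_path(r"var-lib-containerd-tmpmounts-containerd\x2dmount896447187.mount"))
# -> /var/lib/containerd/tmpmounts/containerd-mount896447187
```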
Mar 17 19:40:26.867564 env[1250]: time="2025-03-17T19:40:26.867451014Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:26.871037 env[1250]: time="2025-03-17T19:40:26.870939878Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:26.873763 env[1250]: time="2025-03-17T19:40:26.873719718Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-proxy:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:26.876575 env[1250]: time="2025-03-17T19:40:26.876501433Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:26.878862 env[1250]: time="2025-03-17T19:40:26.878783582Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\"" Mar 17 19:40:26.889846 env[1250]: time="2025-03-17T19:40:26.889764543Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 17 19:40:27.456610 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 17 19:40:27.457073 systemd[1]: Stopped kubelet.service. Mar 17 19:40:27.456000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:27.459726 systemd[1]: Starting kubelet.service... Mar 17 19:40:27.481989 kernel: audit: type=1130 audit(1742240427.456:200): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:27.482151 kernel: audit: type=1131 audit(1742240427.456:201): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:27.456000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:27.514078 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2666893869.mount: Deactivated successfully. Mar 17 19:40:27.594000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:27.595370 systemd[1]: Started kubelet.service. Mar 17 19:40:27.601081 kernel: audit: type=1130 audit(1742240427.594:202): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:27.673383 update_engine[1237]: I0317 19:40:27.673238 1237 update_attempter.cc:509] Updating boot flags... 
Mar 17 19:40:27.994147 kubelet[1655]: E0317 19:40:27.994092 1655 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 19:40:27.995000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 19:40:27.995896 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 19:40:27.996069 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 19:40:28.002223 kernel: audit: type=1131 audit(1742240427.995:203): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 19:40:30.397389 env[1250]: time="2025-03-17T19:40:30.397123055Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns:v1.11.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:30.409324 env[1250]: time="2025-03-17T19:40:30.409232573Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:30.418718 env[1250]: time="2025-03-17T19:40:30.418654197Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/coredns/coredns:v1.11.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:30.421654 env[1250]: time="2025-03-17T19:40:30.421596244Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Mar 17 19:40:30.425838 env[1250]: time="2025-03-17T19:40:30.425783258Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:30.444362 env[1250]: time="2025-03-17T19:40:30.444232803Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 17 19:40:31.071013 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2439972127.mount: Deactivated successfully. 
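
The kubelet exits above (restart counters 4 and 5) all come from one startup check: the unit passes --config=/var/lib/kubelet/config.yaml, that file has not been written yet by cluster bootstrap, so the process fails fast with status=1 and systemd schedules the next restart. A minimal standard-library sketch of that fail-fast config load follows; only the path is taken from the log, everything else is illustrative.

    package main

    import (
        "fmt"
        "os"
    )

    const kubeletConfigPath = "/var/lib/kubelet/config.yaml" // path from the log above

    func loadConfig(path string) ([]byte, error) {
        data, err := os.ReadFile(path)
        if err != nil {
            // Wrap the error the way the log shows it: the underlying cause is
            // the *os.PathError "no such file or directory".
            return nil, fmt.Errorf("failed to read kubelet config file %q, error: %w", path, err)
        }
        return data, nil
    }

    func main() {
        if _, err := loadConfig(kubeletConfigPath); err != nil {
            fmt.Fprintln(os.Stderr, "command failed:", err)
            os.Exit(1) // systemd sees status=1/FAILURE and applies its restart policy
        }
    }
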
Mar 17 19:40:31.082600 env[1250]: time="2025-03-17T19:40:31.082491798Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:31.086705 env[1250]: time="2025-03-17T19:40:31.086631051Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:31.090349 env[1250]: time="2025-03-17T19:40:31.090281442Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:31.093664 env[1250]: time="2025-03-17T19:40:31.093595210Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:31.095280 env[1250]: time="2025-03-17T19:40:31.095215656Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Mar 17 19:40:31.117259 env[1250]: time="2025-03-17T19:40:31.117155855Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 17 19:40:31.770724 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3812398494.mount: Deactivated successfully. Mar 17 19:40:35.528223 env[1250]: time="2025-03-17T19:40:35.528173055Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd:3.5.12-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:35.532574 env[1250]: time="2025-03-17T19:40:35.532552246Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:35.536272 env[1250]: time="2025-03-17T19:40:35.536251550Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/etcd:3.5.12-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:35.540304 env[1250]: time="2025-03-17T19:40:35.540239904Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:35.541408 env[1250]: time="2025-03-17T19:40:35.541382689Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Mar 17 19:40:38.207000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:38.207027 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Mar 17 19:40:38.208197 systemd[1]: Stopped kubelet.service. Mar 17 19:40:38.212200 systemd[1]: Starting kubelet.service... Mar 17 19:40:38.207000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:40:38.220000 kernel: audit: type=1130 audit(1742240438.207:204): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:38.220121 kernel: audit: type=1131 audit(1742240438.207:205): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:38.496000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:38.496871 systemd[1]: Started kubelet.service. Mar 17 19:40:38.503011 kernel: audit: type=1130 audit(1742240438.496:206): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:38.557395 kubelet[1756]: E0317 19:40:38.557354 1756 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 19:40:38.557000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 19:40:38.558975 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 19:40:38.559115 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 19:40:38.564971 kernel: audit: type=1131 audit(1742240438.557:207): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 19:40:39.365235 systemd[1]: Stopped kubelet.service. Mar 17 19:40:39.363000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:39.369841 systemd[1]: Starting kubelet.service... Mar 17 19:40:39.370960 kernel: audit: type=1130 audit(1742240439.363:208): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:39.363000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:39.376974 kernel: audit: type=1131 audit(1742240439.363:209): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:39.419808 systemd[1]: Reloading. 
Mar 17 19:40:39.492661 /usr/lib/systemd/system-generators/torcx-generator[1791]: time="2025-03-17T19:40:39Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 19:40:39.492692 /usr/lib/systemd/system-generators/torcx-generator[1791]: time="2025-03-17T19:40:39Z" level=info msg="torcx already run" Mar 17 19:40:39.789594 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 19:40:39.789614 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 19:40:39.815393 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 19:40:39.922000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:39.928938 systemd[1]: Stopping kubelet.service... Mar 17 19:40:39.931669 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 19:40:39.931883 systemd[1]: Stopped kubelet.service. Mar 17 19:40:39.928000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:39.935815 systemd[1]: Starting kubelet.service... Mar 17 19:40:39.947148 kernel: audit: type=1130 audit(1742240439.922:210): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:39.947269 kernel: audit: type=1131 audit(1742240439.928:211): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:40.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:40.023626 systemd[1]: Started kubelet.service. Mar 17 19:40:40.037417 kernel: audit: type=1130 audit(1742240440.022:212): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:40.097056 kubelet[1862]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 19:40:40.097056 kubelet[1862]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Mar 17 19:40:40.097056 kubelet[1862]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 19:40:40.097056 kubelet[1862]: I0317 19:40:40.096528 1862 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 19:40:40.617749 kubelet[1862]: I0317 19:40:40.617700 1862 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 19:40:40.617749 kubelet[1862]: I0317 19:40:40.617729 1862 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 19:40:40.618192 kubelet[1862]: I0317 19:40:40.618073 1862 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 19:40:40.640875 kubelet[1862]: I0317 19:40:40.639268 1862 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 19:40:40.642177 kubelet[1862]: E0317 19:40:40.642137 1862 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://172.24.4.218:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:40.652098 kubelet[1862]: I0317 19:40:40.652059 1862 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 17 19:40:40.653433 kubelet[1862]: I0317 19:40:40.653375 1862 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 19:40:40.653614 kubelet[1862]: I0317 19:40:40.653411 1862 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-3510-3-7-8-c8b8528301.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 19:40:40.653614 kubelet[1862]: I0317 19:40:40.653616 1862 topology_manager.go:138] 
"Creating topology manager with none policy" Mar 17 19:40:40.654019 kubelet[1862]: I0317 19:40:40.653631 1862 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 19:40:40.654019 kubelet[1862]: I0317 19:40:40.653749 1862 state_mem.go:36] "Initialized new in-memory state store" Mar 17 19:40:40.655033 kubelet[1862]: I0317 19:40:40.655006 1862 kubelet.go:400] "Attempting to sync node with API server" Mar 17 19:40:40.655033 kubelet[1862]: I0317 19:40:40.655025 1862 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 19:40:40.655224 kubelet[1862]: I0317 19:40:40.655046 1862 kubelet.go:312] "Adding apiserver pod source" Mar 17 19:40:40.655224 kubelet[1862]: I0317 19:40:40.655066 1862 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 19:40:40.664743 kubelet[1862]: W0317 19:40:40.664508 1862 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.218:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510-3-7-8-c8b8528301.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:40.664743 kubelet[1862]: E0317 19:40:40.664592 1862 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.218:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510-3-7-8-c8b8528301.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:40.664743 kubelet[1862]: W0317 19:40:40.664680 1862 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://172.24.4.218:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:40.664743 kubelet[1862]: E0317 19:40:40.664738 1862 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://172.24.4.218:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:40.665167 kubelet[1862]: I0317 19:40:40.664834 1862 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Mar 17 19:40:40.669511 kubelet[1862]: I0317 19:40:40.669474 1862 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 19:40:40.669800 kubelet[1862]: W0317 19:40:40.669774 1862 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Mar 17 19:40:40.671796 kubelet[1862]: I0317 19:40:40.671765 1862 server.go:1264] "Started kubelet" Mar 17 19:40:40.682942 kubelet[1862]: I0317 19:40:40.682286 1862 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 19:40:40.683242 kubelet[1862]: I0317 19:40:40.683207 1862 server.go:455] "Adding debug handlers to kubelet server" Mar 17 19:40:40.684000 audit[1862]: AVC avc: denied { mac_admin } for pid=1862 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:40:40.686243 kubelet[1862]: I0317 19:40:40.686198 1862 kubelet.go:1419] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Mar 17 19:40:40.686475 kubelet[1862]: I0317 19:40:40.686440 1862 kubelet.go:1423] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Mar 17 19:40:40.686790 kubelet[1862]: I0317 19:40:40.686760 1862 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 19:40:40.684000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 19:40:40.691112 kernel: audit: type=1400 audit(1742240440.684:213): avc: denied { mac_admin } for pid=1862 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:40:40.684000 audit[1862]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000c156e0 a1=c000bf3470 a2=c000c156b0 a3=25 items=0 ppid=1 pid=1862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.684000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 19:40:40.684000 audit[1862]: AVC avc: denied { mac_admin } for pid=1862 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:40:40.684000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 19:40:40.684000 audit[1862]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000cbc420 a1=c000bf3488 a2=c000c15770 a3=25 items=0 ppid=1 pid=1862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.684000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 19:40:40.692000 audit[1873]: NETFILTER_CFG table=mangle:26 family=2 entries=2 op=nft_register_chain pid=1873 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:40.692000 audit[1873]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffffc818410 a2=0 a3=7ffffc8183fc 
items=0 ppid=1862 pid=1873 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.692000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Mar 17 19:40:40.694000 audit[1874]: NETFILTER_CFG table=filter:27 family=2 entries=1 op=nft_register_chain pid=1874 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:40.694000 audit[1874]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffda406f2f0 a2=0 a3=7ffda406f2dc items=0 ppid=1862 pid=1874 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.694000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Mar 17 19:40:40.695764 kubelet[1862]: I0317 19:40:40.695670 1862 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 19:40:40.696168 kubelet[1862]: I0317 19:40:40.696135 1862 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 19:40:40.698660 kubelet[1862]: I0317 19:40:40.698647 1862 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 19:40:40.699774 kubelet[1862]: E0317 19:40:40.699531 1862 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://172.24.4.218:6443/api/v1/namespaces/default/events\": dial tcp 172.24.4.218:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-3510-3-7-8-c8b8528301.novalocal.182dae73338eb376 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-3510-3-7-8-c8b8528301.novalocal,UID:ci-3510-3-7-8-c8b8528301.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-3510-3-7-8-c8b8528301.novalocal,},FirstTimestamp:2025-03-17 19:40:40.671712118 +0000 UTC m=+0.642274091,LastTimestamp:2025-03-17 19:40:40.671712118 +0000 UTC m=+0.642274091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-3510-3-7-8-c8b8528301.novalocal,}" Mar 17 19:40:40.699972 kubelet[1862]: E0317 19:40:40.699909 1862 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.218:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510-3-7-8-c8b8528301.novalocal?timeout=10s\": dial tcp 172.24.4.218:6443: connect: connection refused" interval="200ms" Mar 17 19:40:40.699000 audit[1876]: NETFILTER_CFG table=filter:28 family=2 entries=2 op=nft_register_chain pid=1876 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:40.699000 audit[1876]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7fff05e67320 a2=0 a3=7fff05e6730c items=0 ppid=1862 pid=1876 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.699000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 19:40:40.700897 kubelet[1862]: I0317 19:40:40.700861 1862 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 19:40:40.701438 kubelet[1862]: I0317 19:40:40.701399 1862 reconciler.go:26] "Reconciler: start to sync state" Mar 17 19:40:40.702020 kubelet[1862]: I0317 19:40:40.702001 1862 factory.go:221] Registration of the systemd container factory successfully Mar 17 19:40:40.702351 kubelet[1862]: I0317 19:40:40.702332 1862 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 19:40:40.704917 kubelet[1862]: W0317 19:40:40.704836 1862 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.218:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:40.705040 kubelet[1862]: E0317 19:40:40.705011 1862 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.218:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:40.707052 kubelet[1862]: I0317 19:40:40.707039 1862 factory.go:221] Registration of the containerd container factory successfully Mar 17 19:40:40.710000 audit[1878]: NETFILTER_CFG table=filter:29 family=2 entries=2 op=nft_register_chain pid=1878 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:40.710000 audit[1878]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffcb05cab90 a2=0 a3=7ffcb05cab7c items=0 ppid=1862 pid=1878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.710000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 19:40:40.724209 kubelet[1862]: E0317 19:40:40.724185 1862 kubelet.go:1467] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 19:40:40.736000 audit[1884]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=1884 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:40.736000 audit[1884]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffcb2d5cf20 a2=0 a3=7ffcb2d5cf0c items=0 ppid=1862 pid=1884 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.736000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Mar 17 19:40:40.738306 kubelet[1862]: I0317 19:40:40.738240 1862 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 17 19:40:40.738000 audit[1886]: NETFILTER_CFG table=mangle:31 family=10 entries=2 op=nft_register_chain pid=1886 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:40:40.738000 audit[1886]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fffc6ed7d10 a2=0 a3=7fffc6ed7cfc items=0 ppid=1862 pid=1886 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.738000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Mar 17 19:40:40.740015 kubelet[1862]: I0317 19:40:40.739933 1862 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 19:40:40.740105 kubelet[1862]: I0317 19:40:40.740084 1862 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 19:40:40.740175 kubelet[1862]: I0317 19:40:40.740156 1862 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 19:40:40.740320 kubelet[1862]: E0317 19:40:40.740282 1862 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 19:40:40.740000 audit[1887]: NETFILTER_CFG table=mangle:32 family=10 entries=1 op=nft_register_chain pid=1887 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:40:40.740000 audit[1887]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc272727f0 a2=0 a3=7ffc272727dc items=0 ppid=1862 pid=1887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.740000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Mar 17 19:40:40.741000 audit[1885]: NETFILTER_CFG table=mangle:33 family=2 entries=1 op=nft_register_chain pid=1885 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:40.741000 audit[1885]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe569ba0b0 a2=0 a3=7ffe569ba09c items=0 ppid=1862 pid=1885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.741000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Mar 17 19:40:40.742000 audit[1888]: NETFILTER_CFG table=nat:34 family=10 entries=2 op=nft_register_chain pid=1888 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:40:40.742000 audit[1888]: SYSCALL arch=c000003e syscall=46 success=yes exit=128 a0=3 a1=7ffe111398f0 a2=0 a3=7ffe111398dc items=0 ppid=1862 pid=1888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.742000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Mar 17 19:40:40.743000 audit[1890]: NETFILTER_CFG table=filter:35 family=10 entries=2 op=nft_register_chain pid=1890 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 
19:40:40.743000 audit[1890]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffeeca4dfc0 a2=0 a3=7ffeeca4dfac items=0 ppid=1862 pid=1890 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.743000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Mar 17 19:40:40.747721 kubelet[1862]: W0317 19:40:40.747663 1862 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.218:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:40.747774 kubelet[1862]: E0317 19:40:40.747745 1862 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.218:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:40.748865 kubelet[1862]: I0317 19:40:40.748834 1862 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 19:40:40.748912 kubelet[1862]: I0317 19:40:40.748869 1862 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 19:40:40.748977 kubelet[1862]: I0317 19:40:40.748912 1862 state_mem.go:36] "Initialized new in-memory state store" Mar 17 19:40:40.748000 audit[1889]: NETFILTER_CFG table=nat:36 family=2 entries=1 op=nft_register_chain pid=1889 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:40.748000 audit[1889]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe677cefc0 a2=0 a3=7ffe677cefac items=0 ppid=1862 pid=1889 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.748000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Mar 17 19:40:40.750000 audit[1891]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_chain pid=1891 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:40:40.750000 audit[1891]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe1566dfa0 a2=0 a3=7ffe1566df8c items=0 ppid=1862 pid=1891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.750000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Mar 17 19:40:40.757003 kubelet[1862]: I0317 19:40:40.756980 1862 policy_none.go:49] "None policy: Start" Mar 17 19:40:40.757556 kubelet[1862]: I0317 19:40:40.757541 1862 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 19:40:40.757601 kubelet[1862]: I0317 19:40:40.757562 1862 state_mem.go:35] "Initializing new in-memory state store" Mar 17 19:40:40.764329 kubelet[1862]: I0317 19:40:40.764301 1862 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 19:40:40.763000 audit[1862]: AVC avc: denied { mac_admin } for pid=1862 comm="kubelet" capability=33 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:40:40.763000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 19:40:40.763000 audit[1862]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000fcedb0 a1=c000fd03a8 a2=c000fced80 a3=25 items=0 ppid=1 pid=1862 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:40.763000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 19:40:40.764586 kubelet[1862]: I0317 19:40:40.764361 1862 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Mar 17 19:40:40.764586 kubelet[1862]: I0317 19:40:40.764463 1862 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 19:40:40.764586 kubelet[1862]: I0317 19:40:40.764578 1862 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 19:40:40.766551 kubelet[1862]: E0317 19:40:40.766534 1862 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-3510-3-7-8-c8b8528301.novalocal\" not found" Mar 17 19:40:40.801578 kubelet[1862]: I0317 19:40:40.801522 1862 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.802463 kubelet[1862]: E0317 19:40:40.802415 1862 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.218:6443/api/v1/nodes\": dial tcp 172.24.4.218:6443: connect: connection refused" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.841005 kubelet[1862]: I0317 19:40:40.840884 1862 topology_manager.go:215] "Topology Admit Handler" podUID="0b3a5bfca3ea1aea22d83e1fab9252d1" podNamespace="kube-system" podName="kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.843709 kubelet[1862]: I0317 19:40:40.843664 1862 topology_manager.go:215] "Topology Admit Handler" podUID="7eac63e1ce939b7c343e49f9ed09c29a" podNamespace="kube-system" podName="kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.851054 kubelet[1862]: I0317 19:40:40.848063 1862 topology_manager.go:215] "Topology Admit Handler" podUID="9ec61937c6596e0913ad319122feff05" podNamespace="kube-system" podName="kube-scheduler-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.901053 kubelet[1862]: E0317 19:40:40.900841 1862 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.218:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510-3-7-8-c8b8528301.novalocal?timeout=10s\": dial tcp 172.24.4.218:6443: connect: connection refused" interval="400ms" Mar 17 19:40:40.902000 kubelet[1862]: I0317 19:40:40.901922 1862 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0b3a5bfca3ea1aea22d83e1fab9252d1-ca-certs\") pod \"kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: 
\"0b3a5bfca3ea1aea22d83e1fab9252d1\") " pod="kube-system/kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.902242 kubelet[1862]: I0317 19:40:40.902205 1862 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-ca-certs\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.902438 kubelet[1862]: I0317 19:40:40.902404 1862 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-k8s-certs\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.902622 kubelet[1862]: I0317 19:40:40.902590 1862 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-kubeconfig\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.902818 kubelet[1862]: I0317 19:40:40.902780 1862 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.903084 kubelet[1862]: I0317 19:40:40.903048 1862 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0b3a5bfca3ea1aea22d83e1fab9252d1-k8s-certs\") pod \"kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"0b3a5bfca3ea1aea22d83e1fab9252d1\") " pod="kube-system/kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.903298 kubelet[1862]: I0317 19:40:40.903260 1862 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0b3a5bfca3ea1aea22d83e1fab9252d1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"0b3a5bfca3ea1aea22d83e1fab9252d1\") " pod="kube-system/kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.903482 kubelet[1862]: I0317 19:40:40.903450 1862 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-flexvolume-dir\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:40.903679 kubelet[1862]: I0317 19:40:40.903644 1862 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9ec61937c6596e0913ad319122feff05-kubeconfig\") pod 
\"kube-scheduler-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"9ec61937c6596e0913ad319122feff05\") " pod="kube-system/kube-scheduler-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:41.007377 kubelet[1862]: I0317 19:40:41.007319 1862 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:41.008019 kubelet[1862]: E0317 19:40:41.007923 1862 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.218:6443/api/v1/nodes\": dial tcp 172.24.4.218:6443: connect: connection refused" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:41.163517 env[1250]: time="2025-03-17T19:40:41.163137820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal,Uid:0b3a5bfca3ea1aea22d83e1fab9252d1,Namespace:kube-system,Attempt:0,}" Mar 17 19:40:41.166645 env[1250]: time="2025-03-17T19:40:41.166576980Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal,Uid:7eac63e1ce939b7c343e49f9ed09c29a,Namespace:kube-system,Attempt:0,}" Mar 17 19:40:41.169237 env[1250]: time="2025-03-17T19:40:41.169101226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510-3-7-8-c8b8528301.novalocal,Uid:9ec61937c6596e0913ad319122feff05,Namespace:kube-system,Attempt:0,}" Mar 17 19:40:41.303016 kubelet[1862]: E0317 19:40:41.302867 1862 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.218:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510-3-7-8-c8b8528301.novalocal?timeout=10s\": dial tcp 172.24.4.218:6443: connect: connection refused" interval="800ms" Mar 17 19:40:41.411999 kubelet[1862]: I0317 19:40:41.411390 1862 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:41.411999 kubelet[1862]: E0317 19:40:41.411897 1862 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.218:6443/api/v1/nodes\": dial tcp 172.24.4.218:6443: connect: connection refused" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:41.470331 kubelet[1862]: W0317 19:40:41.470146 1862 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://172.24.4.218:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510-3-7-8-c8b8528301.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:41.470331 kubelet[1862]: E0317 19:40:41.470298 1862 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://172.24.4.218:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510-3-7-8-c8b8528301.novalocal&limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:41.527660 kubelet[1862]: W0317 19:40:41.527579 1862 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://172.24.4.218:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:41.527854 kubelet[1862]: E0317 19:40:41.527690 1862 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://172.24.4.218:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused 
Mar 17 19:40:41.749723 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4235793634.mount: Deactivated successfully. Mar 17 19:40:41.769628 env[1250]: time="2025-03-17T19:40:41.769553822Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.776866 env[1250]: time="2025-03-17T19:40:41.776797699Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.785678 env[1250]: time="2025-03-17T19:40:41.785627716Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.790360 env[1250]: time="2025-03-17T19:40:41.790274865Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.793803 kubelet[1862]: W0317 19:40:41.793655 1862 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://172.24.4.218:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:41.793803 kubelet[1862]: E0317 19:40:41.793757 1862 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://172.24.4.218:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 172.24.4.218:6443: connect: connection refused Mar 17 19:40:41.797680 env[1250]: time="2025-03-17T19:40:41.797609678Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.800133 env[1250]: time="2025-03-17T19:40:41.800071669Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.808476 env[1250]: time="2025-03-17T19:40:41.808410569Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.811498 env[1250]: time="2025-03-17T19:40:41.811430492Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.814204 env[1250]: time="2025-03-17T19:40:41.814137272Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.816040 env[1250]: time="2025-03-17T19:40:41.815938806Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.817862 env[1250]: time="2025-03-17T19:40:41.817791213Z" level=info msg="ImageUpdate event 
&ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.819707 env[1250]: time="2025-03-17T19:40:41.819633653Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:40:41.887550 env[1250]: time="2025-03-17T19:40:41.887175405Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:40:41.887550 env[1250]: time="2025-03-17T19:40:41.887332983Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:40:41.887550 env[1250]: time="2025-03-17T19:40:41.887428728Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:40:41.887888 env[1250]: time="2025-03-17T19:40:41.887830032Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/ca1b5572bb025e30107500abc0ae73e51da56fe55b109a96b0076e4a0921d7f3 pid=1904 runtime=io.containerd.runc.v2 Mar 17 19:40:41.888182 env[1250]: time="2025-03-17T19:40:41.888105826Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:40:41.888287 env[1250]: time="2025-03-17T19:40:41.888165104Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:40:41.888328 env[1250]: time="2025-03-17T19:40:41.888260800Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:40:41.889037 env[1250]: time="2025-03-17T19:40:41.888879271Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/20eb47bc623237c8a20ef595a3401e7b238f4d8217b99c581758e4527ddfa5d7 pid=1921 runtime=io.containerd.runc.v2 Mar 17 19:40:41.893505 env[1250]: time="2025-03-17T19:40:41.893419984Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:40:41.893505 env[1250]: time="2025-03-17T19:40:41.893455539Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:40:41.893505 env[1250]: time="2025-03-17T19:40:41.893468222Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:40:41.893828 env[1250]: time="2025-03-17T19:40:41.893779452Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/5da512c308ec5a9f461ee64869c9be892eaa3ed721ceb690d9322749acf65727 pid=1928 runtime=io.containerd.runc.v2 Mar 17 19:40:41.951644 env[1250]: time="2025-03-17T19:40:41.951604506Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal,Uid:0b3a5bfca3ea1aea22d83e1fab9252d1,Namespace:kube-system,Attempt:0,} returns sandbox id \"ca1b5572bb025e30107500abc0ae73e51da56fe55b109a96b0076e4a0921d7f3\"" Mar 17 19:40:41.955489 env[1250]: time="2025-03-17T19:40:41.955452281Z" level=info msg="CreateContainer within sandbox \"ca1b5572bb025e30107500abc0ae73e51da56fe55b109a96b0076e4a0921d7f3\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 17 19:40:41.974131 env[1250]: time="2025-03-17T19:40:41.974088792Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal,Uid:7eac63e1ce939b7c343e49f9ed09c29a,Namespace:kube-system,Attempt:0,} returns sandbox id \"20eb47bc623237c8a20ef595a3401e7b238f4d8217b99c581758e4527ddfa5d7\"" Mar 17 19:40:41.977985 env[1250]: time="2025-03-17T19:40:41.976967336Z" level=info msg="CreateContainer within sandbox \"20eb47bc623237c8a20ef595a3401e7b238f4d8217b99c581758e4527ddfa5d7\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 17 19:40:42.006567 env[1250]: time="2025-03-17T19:40:42.006479550Z" level=info msg="CreateContainer within sandbox \"ca1b5572bb025e30107500abc0ae73e51da56fe55b109a96b0076e4a0921d7f3\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"b18e6c0dd07f65ea27ab704dd53bd2d183e6453f77667d005b73f0fad949d080\"" Mar 17 19:40:42.008029 env[1250]: time="2025-03-17T19:40:42.008006648Z" level=info msg="StartContainer for \"b18e6c0dd07f65ea27ab704dd53bd2d183e6453f77667d005b73f0fad949d080\"" Mar 17 19:40:42.011605 env[1250]: time="2025-03-17T19:40:42.011563602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510-3-7-8-c8b8528301.novalocal,Uid:9ec61937c6596e0913ad319122feff05,Namespace:kube-system,Attempt:0,} returns sandbox id \"5da512c308ec5a9f461ee64869c9be892eaa3ed721ceb690d9322749acf65727\"" Mar 17 19:40:42.014528 env[1250]: time="2025-03-17T19:40:42.014503056Z" level=info msg="CreateContainer within sandbox \"5da512c308ec5a9f461ee64869c9be892eaa3ed721ceb690d9322749acf65727\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 17 19:40:42.035346 env[1250]: time="2025-03-17T19:40:42.034480677Z" level=info msg="CreateContainer within sandbox \"20eb47bc623237c8a20ef595a3401e7b238f4d8217b99c581758e4527ddfa5d7\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"af275ad3a557e5f3221363590a7b13c62f0c18fed5b2927a4f79a6b1a754e181\"" Mar 17 19:40:42.037229 env[1250]: time="2025-03-17T19:40:42.037199738Z" level=info msg="StartContainer for \"af275ad3a557e5f3221363590a7b13c62f0c18fed5b2927a4f79a6b1a754e181\"" Mar 17 19:40:42.056911 env[1250]: time="2025-03-17T19:40:42.056865658Z" level=info msg="CreateContainer within sandbox \"5da512c308ec5a9f461ee64869c9be892eaa3ed721ceb690d9322749acf65727\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"6a4ac1fe12df9da548adef0f7bc3189181c79f480b87796f6fa27fde5338ac83\"" Mar 17 19:40:42.057464 env[1250]: 
time="2025-03-17T19:40:42.057425733Z" level=info msg="StartContainer for \"6a4ac1fe12df9da548adef0f7bc3189181c79f480b87796f6fa27fde5338ac83\"" Mar 17 19:40:42.103654 kubelet[1862]: E0317 19:40:42.103596 1862 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://172.24.4.218:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510-3-7-8-c8b8528301.novalocal?timeout=10s\": dial tcp 172.24.4.218:6443: connect: connection refused" interval="1.6s" Mar 17 19:40:42.137749 env[1250]: time="2025-03-17T19:40:42.137707538Z" level=info msg="StartContainer for \"b18e6c0dd07f65ea27ab704dd53bd2d183e6453f77667d005b73f0fad949d080\" returns successfully" Mar 17 19:40:42.178983 env[1250]: time="2025-03-17T19:40:42.178910977Z" level=info msg="StartContainer for \"af275ad3a557e5f3221363590a7b13c62f0c18fed5b2927a4f79a6b1a754e181\" returns successfully" Mar 17 19:40:42.207385 env[1250]: time="2025-03-17T19:40:42.207335399Z" level=info msg="StartContainer for \"6a4ac1fe12df9da548adef0f7bc3189181c79f480b87796f6fa27fde5338ac83\" returns successfully" Mar 17 19:40:42.213502 kubelet[1862]: I0317 19:40:42.213472 1862 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:42.213928 kubelet[1862]: E0317 19:40:42.213893 1862 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://172.24.4.218:6443/api/v1/nodes\": dial tcp 172.24.4.218:6443: connect: connection refused" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:43.815913 kubelet[1862]: I0317 19:40:43.815894 1862 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:44.627833 kubelet[1862]: I0317 19:40:44.627808 1862 kubelet_node_status.go:76] "Successfully registered node" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:44.666448 kubelet[1862]: I0317 19:40:44.666433 1862 apiserver.go:52] "Watching apiserver" Mar 17 19:40:44.701293 kubelet[1862]: I0317 19:40:44.701259 1862 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 19:40:44.701821 kubelet[1862]: E0317 19:40:44.701797 1862 controller.go:145] "Failed to ensure lease exists, will retry" err="namespaces \"kube-node-lease\" not found" interval="3.2s" Mar 17 19:40:45.012830 kubelet[1862]: E0317 19:40:45.012803 1862 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:47.267624 systemd[1]: Reloading. Mar 17 19:40:47.449254 /usr/lib/systemd/system-generators/torcx-generator[2157]: time="2025-03-17T19:40:47Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 19:40:47.449600 /usr/lib/systemd/system-generators/torcx-generator[2157]: time="2025-03-17T19:40:47Z" level=info msg="torcx already run" Mar 17 19:40:47.540634 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 19:40:47.540809 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. 
Support for MemoryLimit= will be removed soon. Mar 17 19:40:47.566389 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 19:40:47.675024 kubelet[1862]: I0317 19:40:47.673261 1862 dynamic_cafile_content.go:171] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 19:40:47.675024 kubelet[1862]: E0317 19:40:47.673223 1862 event.go:319] "Unable to write event (broadcaster is shut down)" event="&Event{ObjectMeta:{ci-3510-3-7-8-c8b8528301.novalocal.182dae73338eb376 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-3510-3-7-8-c8b8528301.novalocal,UID:ci-3510-3-7-8-c8b8528301.novalocal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-3510-3-7-8-c8b8528301.novalocal,},FirstTimestamp:2025-03-17 19:40:40.671712118 +0000 UTC m=+0.642274091,LastTimestamp:2025-03-17 19:40:40.671712118 +0000 UTC m=+0.642274091,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-3510-3-7-8-c8b8528301.novalocal,}" Mar 17 19:40:47.675677 systemd[1]: Stopping kubelet.service... Mar 17 19:40:47.684485 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 19:40:47.684716 systemd[1]: Stopped kubelet.service. Mar 17 19:40:47.683000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:47.686909 systemd[1]: Starting kubelet.service... Mar 17 19:40:47.688101 kernel: kauditd_printk_skb: 47 callbacks suppressed Mar 17 19:40:47.688146 kernel: audit: type=1131 audit(1742240447.683:228): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:47.883000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:47.884189 systemd[1]: Started kubelet.service. Mar 17 19:40:47.897033 kernel: audit: type=1130 audit(1742240447.883:229): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:47.968035 kubelet[2215]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 19:40:47.968035 kubelet[2215]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 19:40:47.968035 kubelet[2215]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 17 19:40:47.968441 kubelet[2215]: I0317 19:40:47.968078 2215 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 19:40:47.972513 kubelet[2215]: I0317 19:40:47.972490 2215 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 19:40:47.972598 kubelet[2215]: I0317 19:40:47.972588 2215 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 19:40:47.972899 kubelet[2215]: I0317 19:40:47.972882 2215 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 19:40:47.977262 kubelet[2215]: I0317 19:40:47.976218 2215 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 17 19:40:47.979108 kubelet[2215]: I0317 19:40:47.979089 2215 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 19:40:47.991755 kubelet[2215]: I0317 19:40:47.991734 2215 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 17 19:40:47.993187 kubelet[2215]: I0317 19:40:47.993155 2215 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 19:40:47.993587 kubelet[2215]: I0317 19:40:47.993398 2215 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-3510-3-7-8-c8b8528301.novalocal","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 19:40:47.993756 kubelet[2215]: I0317 19:40:47.993743 2215 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 19:40:47.993826 kubelet[2215]: I0317 19:40:47.993817 2215 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 19:40:47.993942 kubelet[2215]: I0317 19:40:47.993930 2215 state_mem.go:36] "Initialized new in-memory state store" Mar 17 19:40:47.994166 kubelet[2215]: I0317 19:40:47.994156 2215 kubelet.go:400] "Attempting to sync node with API server" Mar 17 19:40:47.997811 kubelet[2215]: I0317 19:40:47.996050 2215 kubelet.go:301] 
"Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 19:40:47.997811 kubelet[2215]: I0317 19:40:47.996116 2215 kubelet.go:312] "Adding apiserver pod source" Mar 17 19:40:47.997811 kubelet[2215]: I0317 19:40:47.996141 2215 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 19:40:48.004778 kubelet[2215]: I0317 19:40:48.002919 2215 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Mar 17 19:40:48.004778 kubelet[2215]: I0317 19:40:48.003128 2215 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 19:40:48.004778 kubelet[2215]: I0317 19:40:48.003589 2215 server.go:1264] "Started kubelet" Mar 17 19:40:48.009000 audit[2215]: AVC avc: denied { mac_admin } for pid=2215 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:40:48.010525 kubelet[2215]: I0317 19:40:48.010504 2215 kubelet.go:1419] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Mar 17 19:40:48.010627 kubelet[2215]: I0317 19:40:48.010613 2215 kubelet.go:1423] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Mar 17 19:40:48.010726 kubelet[2215]: I0317 19:40:48.010715 2215 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 19:40:48.009000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 19:40:48.017307 kubelet[2215]: I0317 19:40:48.017265 2215 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 19:40:48.018071 kernel: audit: type=1400 audit(1742240448.009:230): avc: denied { mac_admin } for pid=2215 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:40:48.018470 kernel: audit: type=1401 audit(1742240448.009:230): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 19:40:48.019508 kubelet[2215]: I0317 19:40:48.019496 2215 server.go:455] "Adding debug handlers to kubelet server" Mar 17 19:40:48.021508 kubelet[2215]: I0317 19:40:48.021462 2215 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 19:40:48.021805 kubelet[2215]: I0317 19:40:48.021792 2215 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 19:40:48.009000 audit[2215]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000b7c690 a1=c000899ce0 a2=c000b7c660 a3=25 items=0 ppid=1 pid=2215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:48.024076 kubelet[2215]: I0317 19:40:48.024064 2215 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 19:40:48.026466 kubelet[2215]: I0317 19:40:48.026452 2215 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 19:40:48.026781 kubelet[2215]: I0317 19:40:48.026771 2215 reconciler.go:26] "Reconciler: start to sync state" Mar 17 19:40:48.028660 kubelet[2215]: I0317 
19:40:48.028641 2215 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 19:40:48.029725 kubelet[2215]: I0317 19:40:48.029711 2215 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 19:40:48.029835 kubelet[2215]: I0317 19:40:48.029824 2215 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 19:40:48.029918 kubelet[2215]: I0317 19:40:48.029907 2215 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 19:40:48.029987 kernel: audit: type=1300 audit(1742240448.009:230): arch=c000003e syscall=188 success=no exit=-22 a0=c000b7c690 a1=c000899ce0 a2=c000b7c660 a3=25 items=0 ppid=1 pid=2215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:48.030104 kubelet[2215]: E0317 19:40:48.030085 2215 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 19:40:48.050724 kernel: audit: type=1327 audit(1742240448.009:230): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 19:40:48.009000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 19:40:48.051667 kubelet[2215]: I0317 19:40:48.048714 2215 factory.go:221] Registration of the systemd container factory successfully Mar 17 19:40:48.051667 kubelet[2215]: I0317 19:40:48.048827 2215 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 19:40:48.059236 kernel: audit: type=1400 audit(1742240448.009:231): avc: denied { mac_admin } for pid=2215 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:40:48.009000 audit[2215]: AVC avc: denied { mac_admin } for pid=2215 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:40:48.070247 kubelet[2215]: I0317 19:40:48.056775 2215 factory.go:221] Registration of the containerd container factory successfully Mar 17 19:40:48.009000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 19:40:48.079032 kernel: audit: type=1401 audit(1742240448.009:231): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 19:40:48.009000 audit[2215]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000b36b00 a1=c000899cf8 a2=c000b7c720 a3=25 items=0 ppid=1 pid=2215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:48.089873 kubelet[2215]: E0317 19:40:48.083538 2215 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 19:40:48.089966 kernel: audit: type=1300 audit(1742240448.009:231): arch=c000003e syscall=188 success=no exit=-22 a0=c000b36b00 a1=c000899cf8 a2=c000b7c720 a3=25 items=0 ppid=1 pid=2215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:48.009000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 19:40:48.105969 kernel: audit: type=1327 audit(1742240448.009:231): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 19:40:48.131043 kubelet[2215]: E0317 19:40:48.131017 2215 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 17 19:40:48.134271 kubelet[2215]: I0317 19:40:48.134251 2215 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.147673 kubelet[2215]: I0317 19:40:48.145436 2215 kubelet_node_status.go:112] "Node was previously registered" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.147859 kubelet[2215]: I0317 19:40:48.147846 2215 kubelet_node_status.go:76] "Successfully registered node" node="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.176513 kubelet[2215]: I0317 19:40:48.176492 2215 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 19:40:48.176719 kubelet[2215]: I0317 19:40:48.176706 2215 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 19:40:48.176816 kubelet[2215]: I0317 19:40:48.176806 2215 state_mem.go:36] "Initialized new in-memory state store" Mar 17 19:40:48.177137 kubelet[2215]: I0317 19:40:48.177122 2215 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 17 19:40:48.177228 kubelet[2215]: I0317 19:40:48.177201 2215 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 17 19:40:48.177292 kubelet[2215]: I0317 19:40:48.177284 2215 policy_none.go:49] "None policy: Start" Mar 17 19:40:48.178338 kubelet[2215]: I0317 19:40:48.178325 2215 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 19:40:48.178411 kubelet[2215]: I0317 19:40:48.178402 2215 state_mem.go:35] "Initializing new in-memory state store" Mar 17 19:40:48.178695 kubelet[2215]: I0317 19:40:48.178685 2215 state_mem.go:75] "Updated machine memory state" Mar 17 19:40:48.179888 kubelet[2215]: I0317 19:40:48.179874 2215 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 19:40:48.179000 audit[2215]: AVC avc: denied { mac_admin } for pid=2215 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:40:48.179000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 19:40:48.179000 audit[2215]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000cae7b0 a1=c00119efd8 a2=c000cae780 a3=25 items=0 ppid=1 pid=2215 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:40:48.179000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 19:40:48.180255 kubelet[2215]: I0317 19:40:48.180241 2215 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Mar 17 19:40:48.180463 kubelet[2215]: I0317 19:40:48.180434 2215 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 19:40:48.180718 kubelet[2215]: I0317 19:40:48.180705 2215 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 19:40:48.333178 kubelet[2215]: I0317 19:40:48.332492 2215 topology_manager.go:215] "Topology Admit Handler" podUID="0b3a5bfca3ea1aea22d83e1fab9252d1" podNamespace="kube-system" podName="kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.333178 kubelet[2215]: I0317 19:40:48.332594 2215 topology_manager.go:215] "Topology Admit Handler" podUID="7eac63e1ce939b7c343e49f9ed09c29a" podNamespace="kube-system" podName="kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.333178 kubelet[2215]: I0317 19:40:48.332660 2215 topology_manager.go:215] "Topology Admit Handler" podUID="9ec61937c6596e0913ad319122feff05" podNamespace="kube-system" podName="kube-scheduler-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.341865 kubelet[2215]: W0317 19:40:48.341814 2215 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 19:40:48.344774 kubelet[2215]: W0317 19:40:48.344731 2215 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 19:40:48.346997 kubelet[2215]: W0317 19:40:48.346942 2215 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 19:40:48.435173 kubelet[2215]: I0317 19:40:48.435014 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0b3a5bfca3ea1aea22d83e1fab9252d1-k8s-certs\") pod \"kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"0b3a5bfca3ea1aea22d83e1fab9252d1\") " pod="kube-system/kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.435173 kubelet[2215]: I0317 19:40:48.435064 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-flexvolume-dir\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.435173 kubelet[2215]: I0317 19:40:48.435092 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-k8s-certs\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.435173 kubelet[2215]: I0317 19:40:48.435113 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-kubeconfig\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.435508 kubelet[2215]: I0317 19:40:48.435134 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.435508 kubelet[2215]: I0317 19:40:48.435155 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/9ec61937c6596e0913ad319122feff05-kubeconfig\") pod \"kube-scheduler-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"9ec61937c6596e0913ad319122feff05\") " pod="kube-system/kube-scheduler-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.435508 kubelet[2215]: I0317 19:40:48.435172 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0b3a5bfca3ea1aea22d83e1fab9252d1-ca-certs\") pod \"kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"0b3a5bfca3ea1aea22d83e1fab9252d1\") " pod="kube-system/kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.435508 kubelet[2215]: I0317 19:40:48.435190 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0b3a5bfca3ea1aea22d83e1fab9252d1-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"0b3a5bfca3ea1aea22d83e1fab9252d1\") " pod="kube-system/kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.435796 kubelet[2215]: I0317 19:40:48.435209 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/7eac63e1ce939b7c343e49f9ed09c29a-ca-certs\") pod \"kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal\" (UID: \"7eac63e1ce939b7c343e49f9ed09c29a\") " pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:40:48.998788 kubelet[2215]: I0317 19:40:48.998723 2215 apiserver.go:52] "Watching apiserver" Mar 17 19:40:49.027261 kubelet[2215]: I0317 19:40:49.027203 2215 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 19:40:49.159876 kubelet[2215]: I0317 19:40:49.159797 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-3510-3-7-8-c8b8528301.novalocal" podStartSLOduration=1.159731341 podStartE2EDuration="1.159731341s" podCreationTimestamp="2025-03-17 19:40:48 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 19:40:49.15946458 +0000 UTC m=+1.269305781" watchObservedRunningTime="2025-03-17 19:40:49.159731341 +0000 UTC m=+1.269572542" Mar 17 19:40:49.160071 kubelet[2215]: I0317 19:40:49.159930 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-3510-3-7-8-c8b8528301.novalocal" podStartSLOduration=1.159923351 podStartE2EDuration="1.159923351s" podCreationTimestamp="2025-03-17 19:40:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 19:40:49.146170982 +0000 UTC m=+1.256012173" watchObservedRunningTime="2025-03-17 19:40:49.159923351 +0000 UTC m=+1.269764552" Mar 17 19:40:49.175342 kubelet[2215]: I0317 19:40:49.174223 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-3510-3-7-8-c8b8528301.novalocal" podStartSLOduration=1.174206728 podStartE2EDuration="1.174206728s" podCreationTimestamp="2025-03-17 19:40:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 19:40:49.174077936 +0000 UTC m=+1.283919127" watchObservedRunningTime="2025-03-17 19:40:49.174206728 +0000 UTC m=+1.284047929" Mar 17 19:40:53.768072 sudo[1468]: pam_unix(sudo:session): session closed for user root Mar 17 19:40:53.767000 audit[1468]: USER_END pid=1468 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:53.772187 kernel: kauditd_printk_skb: 4 callbacks suppressed Mar 17 19:40:53.772295 kernel: audit: type=1106 audit(1742240453.767:233): pid=1468 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:53.787025 kernel: audit: type=1104 audit(1742240453.768:234): pid=1468 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 19:40:53.768000 audit[1468]: CRED_DISP pid=1468 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Mar 17 19:40:53.923189 sshd[1462]: pam_unix(sshd:session): session closed for user core Mar 17 19:40:53.925000 audit[1462]: USER_END pid=1462 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:53.943133 kernel: audit: type=1106 audit(1742240453.925:235): pid=1462 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:53.925000 audit[1462]: CRED_DISP pid=1462 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:53.943693 systemd[1]: sshd@8-172.24.4.218:22-172.24.4.1:45748.service: Deactivated successfully. Mar 17 19:40:53.945453 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 19:40:53.957040 kernel: audit: type=1104 audit(1742240453.925:236): pid=1462 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:40:53.942000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-172.24.4.218:22-172.24.4.1:45748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:53.971560 systemd-logind[1236]: Session 9 logged out. Waiting for processes to exit. Mar 17 19:40:53.972142 kernel: audit: type=1131 audit(1742240453.942:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-172.24.4.218:22-172.24.4.1:45748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:40:53.974146 systemd-logind[1236]: Removed session 9. Mar 17 19:41:00.897800 kubelet[2215]: I0317 19:41:00.897734 2215 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 17 19:41:00.898508 env[1250]: time="2025-03-17T19:41:00.898479480Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
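The pod_startup_latency_tracker entries a few lines above can be sanity-checked by hand: in each of those three records, podStartE2EDuration equals watchObservedRunningTime minus podCreationTimestamp. A minimal Python check using the figures from the kube-apiserver record (nanoseconds truncated to microseconds, since datetime only carries microsecond precision; the variable names are illustrative, not taken from the kubelet source):

    from datetime import datetime, timezone

    # Figures copied from the kube-apiserver pod_startup_latency_tracker record above.
    pod_created    = datetime(2025, 3, 17, 19, 40, 48, 0, tzinfo=timezone.utc)       # podCreationTimestamp
    watch_observed = datetime(2025, 3, 17, 19, 40, 49, 159731, tzinfo=timezone.utc)  # watchObservedRunningTime, truncated

    print((watch_observed - pod_created).total_seconds())  # 1.159731 ~ podStartE2EDuration=1.159731341s

The same relation holds for the kube-scheduler (1.159923351s) and kube-controller-manager (1.174206728s) records in this log.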
Mar 17 19:41:00.898933 kubelet[2215]: I0317 19:41:00.898896 2215 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 17 19:41:01.271621 kubelet[2215]: I0317 19:41:01.271574 2215 topology_manager.go:215] "Topology Admit Handler" podUID="36dded5a-cb33-4d3f-b62c-98ccd2fdc7b1" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-hjjck" Mar 17 19:41:01.316220 kubelet[2215]: I0317 19:41:01.316177 2215 topology_manager.go:215] "Topology Admit Handler" podUID="1e62e53f-a5df-44bb-b6de-4a2de9df7ec0" podNamespace="kube-system" podName="kube-proxy-phf6g" Mar 17 19:41:01.318621 kubelet[2215]: I0317 19:41:01.318578 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/36dded5a-cb33-4d3f-b62c-98ccd2fdc7b1-var-lib-calico\") pod \"tigera-operator-7bc55997bb-hjjck\" (UID: \"36dded5a-cb33-4d3f-b62c-98ccd2fdc7b1\") " pod="tigera-operator/tigera-operator-7bc55997bb-hjjck" Mar 17 19:41:01.318730 kubelet[2215]: I0317 19:41:01.318633 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bt9w\" (UniqueName: \"kubernetes.io/projected/36dded5a-cb33-4d3f-b62c-98ccd2fdc7b1-kube-api-access-6bt9w\") pod \"tigera-operator-7bc55997bb-hjjck\" (UID: \"36dded5a-cb33-4d3f-b62c-98ccd2fdc7b1\") " pod="tigera-operator/tigera-operator-7bc55997bb-hjjck" Mar 17 19:41:01.419565 kubelet[2215]: I0317 19:41:01.419512 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/1e62e53f-a5df-44bb-b6de-4a2de9df7ec0-xtables-lock\") pod \"kube-proxy-phf6g\" (UID: \"1e62e53f-a5df-44bb-b6de-4a2de9df7ec0\") " pod="kube-system/kube-proxy-phf6g" Mar 17 19:41:01.419882 kubelet[2215]: I0317 19:41:01.419834 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5nvx\" (UniqueName: \"kubernetes.io/projected/1e62e53f-a5df-44bb-b6de-4a2de9df7ec0-kube-api-access-t5nvx\") pod \"kube-proxy-phf6g\" (UID: \"1e62e53f-a5df-44bb-b6de-4a2de9df7ec0\") " pod="kube-system/kube-proxy-phf6g" Mar 17 19:41:01.420116 kubelet[2215]: I0317 19:41:01.420067 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1e62e53f-a5df-44bb-b6de-4a2de9df7ec0-lib-modules\") pod \"kube-proxy-phf6g\" (UID: \"1e62e53f-a5df-44bb-b6de-4a2de9df7ec0\") " pod="kube-system/kube-proxy-phf6g" Mar 17 19:41:01.420362 kubelet[2215]: I0317 19:41:01.420336 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/1e62e53f-a5df-44bb-b6de-4a2de9df7ec0-kube-proxy\") pod \"kube-proxy-phf6g\" (UID: \"1e62e53f-a5df-44bb-b6de-4a2de9df7ec0\") " pod="kube-system/kube-proxy-phf6g" Mar 17 19:41:01.577196 env[1250]: time="2025-03-17T19:41:01.576382886Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-hjjck,Uid:36dded5a-cb33-4d3f-b62c-98ccd2fdc7b1,Namespace:tigera-operator,Attempt:0,}" Mar 17 19:41:01.614826 env[1250]: time="2025-03-17T19:41:01.614689945Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:41:01.615156 env[1250]: time="2025-03-17T19:41:01.614785524Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:41:01.615357 env[1250]: time="2025-03-17T19:41:01.615134349Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:41:01.616005 env[1250]: time="2025-03-17T19:41:01.615871813Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/109dfc6d5e6c6047dc4d12b4d812adf541aa19d618e91cbeace4dada42eacedf pid=2296 runtime=io.containerd.runc.v2 Mar 17 19:41:01.624338 env[1250]: time="2025-03-17T19:41:01.622786210Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-phf6g,Uid:1e62e53f-a5df-44bb-b6de-4a2de9df7ec0,Namespace:kube-system,Attempt:0,}" Mar 17 19:41:01.657093 env[1250]: time="2025-03-17T19:41:01.656055634Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:41:01.657093 env[1250]: time="2025-03-17T19:41:01.656161713Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:41:01.657093 env[1250]: time="2025-03-17T19:41:01.656195547Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:41:01.658031 env[1250]: time="2025-03-17T19:41:01.657877805Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/f9cc55377a55e82c4d69da5c7fbd6f20e9986f8c07f781d5b0df229d6e052abc pid=2322 runtime=io.containerd.runc.v2 Mar 17 19:41:01.709097 env[1250]: time="2025-03-17T19:41:01.709050020Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-hjjck,Uid:36dded5a-cb33-4d3f-b62c-98ccd2fdc7b1,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"109dfc6d5e6c6047dc4d12b4d812adf541aa19d618e91cbeace4dada42eacedf\"" Mar 17 19:41:01.712296 env[1250]: time="2025-03-17T19:41:01.712273420Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Mar 17 19:41:01.724549 env[1250]: time="2025-03-17T19:41:01.724504336Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-phf6g,Uid:1e62e53f-a5df-44bb-b6de-4a2de9df7ec0,Namespace:kube-system,Attempt:0,} returns sandbox id \"f9cc55377a55e82c4d69da5c7fbd6f20e9986f8c07f781d5b0df229d6e052abc\"" Mar 17 19:41:01.728119 env[1250]: time="2025-03-17T19:41:01.727968518Z" level=info msg="CreateContainer within sandbox \"f9cc55377a55e82c4d69da5c7fbd6f20e9986f8c07f781d5b0df229d6e052abc\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 19:41:01.750608 env[1250]: time="2025-03-17T19:41:01.750555242Z" level=info msg="CreateContainer within sandbox \"f9cc55377a55e82c4d69da5c7fbd6f20e9986f8c07f781d5b0df229d6e052abc\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9806448d3b28bf9630c63d67b9a15e420b442d9ca2c912549f429b637c321c5d\"" Mar 17 19:41:01.752230 env[1250]: time="2025-03-17T19:41:01.751150268Z" level=info msg="StartContainer for \"9806448d3b28bf9630c63d67b9a15e420b442d9ca2c912549f429b637c321c5d\"" Mar 17 19:41:01.815256 env[1250]: time="2025-03-17T19:41:01.814659659Z" level=info msg="StartContainer for 
\"9806448d3b28bf9630c63d67b9a15e420b442d9ca2c912549f429b637c321c5d\" returns successfully" Mar 17 19:41:01.882000 audit[2427]: NETFILTER_CFG table=mangle:38 family=2 entries=1 op=nft_register_chain pid=2427 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.892999 kernel: audit: type=1325 audit(1742240461.882:238): table=mangle:38 family=2 entries=1 op=nft_register_chain pid=2427 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.891000 audit[2428]: NETFILTER_CFG table=mangle:39 family=10 entries=1 op=nft_register_chain pid=2428 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:01.891000 audit[2428]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffd40c9340 a2=0 a3=7fffd40c932c items=0 ppid=2387 pid=2428 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.910250 kernel: audit: type=1325 audit(1742240461.891:239): table=mangle:39 family=10 entries=1 op=nft_register_chain pid=2428 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:01.910306 kernel: audit: type=1300 audit(1742240461.891:239): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffd40c9340 a2=0 a3=7fffd40c932c items=0 ppid=2387 pid=2428 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.891000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 19:41:01.914137 kernel: audit: type=1327 audit(1742240461.891:239): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 19:41:01.892000 audit[2429]: NETFILTER_CFG table=nat:40 family=10 entries=1 op=nft_register_chain pid=2429 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:01.918075 kernel: audit: type=1325 audit(1742240461.892:240): table=nat:40 family=10 entries=1 op=nft_register_chain pid=2429 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:01.892000 audit[2429]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff05b34820 a2=0 a3=7fff05b3480c items=0 ppid=2387 pid=2429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.925058 kernel: audit: type=1300 audit(1742240461.892:240): arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff05b34820 a2=0 a3=7fff05b3480c items=0 ppid=2387 pid=2429 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.892000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 19:41:01.928537 kernel: audit: type=1327 audit(1742240461.892:240): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 19:41:01.893000 audit[2430]: NETFILTER_CFG table=filter:41 family=10 entries=1 op=nft_register_chain pid=2430 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:01.932073 kernel: 
audit: type=1325 audit(1742240461.893:241): table=filter:41 family=10 entries=1 op=nft_register_chain pid=2430 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:01.932123 kernel: audit: type=1300 audit(1742240461.893:241): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdca269a00 a2=0 a3=7ffdca2699ec items=0 ppid=2387 pid=2430 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.893000 audit[2430]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffdca269a00 a2=0 a3=7ffdca2699ec items=0 ppid=2387 pid=2430 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.893000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 19:41:01.946485 kernel: audit: type=1327 audit(1742240461.893:241): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 19:41:01.882000 audit[2427]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffe0015b40 a2=0 a3=7fffe0015b2c items=0 ppid=2387 pid=2427 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.882000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 19:41:01.910000 audit[2431]: NETFILTER_CFG table=nat:42 family=2 entries=1 op=nft_register_chain pid=2431 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.910000 audit[2431]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff86fec780 a2=0 a3=7fff86fec76c items=0 ppid=2387 pid=2431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.910000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 19:41:01.914000 audit[2432]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2432 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.914000 audit[2432]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffed6e68a10 a2=0 a3=7ffed6e689fc items=0 ppid=2387 pid=2432 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.914000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 19:41:01.983000 audit[2433]: NETFILTER_CFG table=filter:44 family=2 entries=1 op=nft_register_chain pid=2433 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.983000 audit[2433]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffe96568b00 a2=0 a3=7ffe96568aec items=0 ppid=2387 pid=2433 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.983000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Mar 17 19:41:01.986000 audit[2435]: NETFILTER_CFG table=filter:45 family=2 entries=1 op=nft_register_rule pid=2435 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.986000 audit[2435]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffe42e7eba0 a2=0 a3=7ffe42e7eb8c items=0 ppid=2387 pid=2435 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.986000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Mar 17 19:41:01.989000 audit[2438]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2438 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.989000 audit[2438]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff4f637aa0 a2=0 a3=7fff4f637a8c items=0 ppid=2387 pid=2438 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.989000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Mar 17 19:41:01.990000 audit[2439]: NETFILTER_CFG table=filter:47 family=2 entries=1 op=nft_register_chain pid=2439 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.990000 audit[2439]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc5eb43050 a2=0 a3=7ffc5eb4303c items=0 ppid=2387 pid=2439 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.990000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Mar 17 19:41:01.992000 audit[2441]: NETFILTER_CFG table=filter:48 family=2 entries=1 op=nft_register_rule pid=2441 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.992000 audit[2441]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe48efff80 a2=0 a3=7ffe48efff6c items=0 ppid=2387 pid=2441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.992000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Mar 17 19:41:01.993000 audit[2442]: NETFILTER_CFG table=filter:49 family=2 entries=1 op=nft_register_chain pid=2442 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Mar 17 19:41:01.993000 audit[2442]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe3160ece0 a2=0 a3=7ffe3160eccc items=0 ppid=2387 pid=2442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.993000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Mar 17 19:41:01.996000 audit[2444]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_rule pid=2444 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.996000 audit[2444]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7fff103790b0 a2=0 a3=7fff1037909c items=0 ppid=2387 pid=2444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.996000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Mar 17 19:41:01.999000 audit[2447]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_rule pid=2447 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:01.999000 audit[2447]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffedd47de10 a2=0 a3=7ffedd47ddfc items=0 ppid=2387 pid=2447 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:01.999000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Mar 17 19:41:02.000000 audit[2448]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2448 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.000000 audit[2448]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffb5903400 a2=0 a3=7fffb59033ec items=0 ppid=2387 pid=2448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.000000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Mar 17 19:41:02.003000 audit[2450]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_rule pid=2450 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.003000 audit[2450]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffcc37e3aa0 a2=0 a3=7ffcc37e3a8c items=0 ppid=2387 pid=2450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.003000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Mar 17 19:41:02.004000 audit[2451]: NETFILTER_CFG table=filter:54 family=2 entries=1 op=nft_register_chain pid=2451 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.004000 audit[2451]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffcf98cb4e0 a2=0 a3=7ffcf98cb4cc items=0 ppid=2387 pid=2451 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.004000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Mar 17 19:41:02.006000 audit[2453]: NETFILTER_CFG table=filter:55 family=2 entries=1 op=nft_register_rule pid=2453 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.006000 audit[2453]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffda8d8b570 a2=0 a3=7ffda8d8b55c items=0 ppid=2387 pid=2453 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.006000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 19:41:02.010000 audit[2456]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_rule pid=2456 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.010000 audit[2456]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffe09fc6f90 a2=0 a3=7ffe09fc6f7c items=0 ppid=2387 pid=2456 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.010000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 19:41:02.013000 audit[2459]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_rule pid=2459 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.013000 audit[2459]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffcec670610 a2=0 a3=7ffcec6705fc items=0 ppid=2387 pid=2459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.013000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Mar 17 19:41:02.015000 audit[2460]: NETFILTER_CFG table=nat:58 family=2 entries=1 op=nft_register_chain pid=2460 subj=system_u:system_r:kernel_t:s0 comm="iptables" 
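The PROCTITLE fields in the audit records above are hex-encoded argv vectors with NUL bytes separating the arguments. A short Python sketch (decode_proctitle is an illustrative helper name, not part of any tool appearing in this log) recovers the underlying command line; the example decodes the PROCTITLE that follows the pid 2427 SYSCALL record:

    def decode_proctitle(hex_argv: str) -> str:
        """Decode an audit PROCTITLE field: hex-encoded argv, NUL-separated."""
        raw = bytes.fromhex(hex_argv)
        return " ".join(arg.decode() for arg in raw.split(b"\x00") if arg)

    print(decode_proctitle(
        "69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65"
    ))
    # -> iptables -w 5 -W 100000 -N KUBE-PROXY-CANARY -t mangle

Several of the longer PROCTITLE values in this log (for example the kubelet command lines ending in 636F6E6669, i.e. "--confi") are visibly cut off in the audit record itself, so they decode to a truncated final argument.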
Mar 17 19:41:02.015000 audit[2460]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff13e6b020 a2=0 a3=7fff13e6b00c items=0 ppid=2387 pid=2460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.015000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Mar 17 19:41:02.017000 audit[2462]: NETFILTER_CFG table=nat:59 family=2 entries=1 op=nft_register_rule pid=2462 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.017000 audit[2462]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc627e2b60 a2=0 a3=7ffc627e2b4c items=0 ppid=2387 pid=2462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.017000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 19:41:02.021000 audit[2465]: NETFILTER_CFG table=nat:60 family=2 entries=1 op=nft_register_rule pid=2465 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.021000 audit[2465]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd7861bf20 a2=0 a3=7ffd7861bf0c items=0 ppid=2387 pid=2465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.021000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 19:41:02.022000 audit[2466]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=2466 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.022000 audit[2466]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdcd1b6ef0 a2=0 a3=7ffdcd1b6edc items=0 ppid=2387 pid=2466 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.022000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Mar 17 19:41:02.025000 audit[2468]: NETFILTER_CFG table=nat:62 family=2 entries=1 op=nft_register_rule pid=2468 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 19:41:02.025000 audit[2468]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7fffea1d35f0 a2=0 a3=7fffea1d35dc items=0 ppid=2387 pid=2468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.025000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Mar 17 19:41:02.050000 
audit[2474]: NETFILTER_CFG table=filter:63 family=2 entries=8 op=nft_register_rule pid=2474 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:02.050000 audit[2474]: SYSCALL arch=c000003e syscall=46 success=yes exit=5164 a0=3 a1=7fff7e1c4720 a2=0 a3=7fff7e1c470c items=0 ppid=2387 pid=2474 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.050000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:02.061000 audit[2474]: NETFILTER_CFG table=nat:64 family=2 entries=14 op=nft_register_chain pid=2474 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:02.061000 audit[2474]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7fff7e1c4720 a2=0 a3=7fff7e1c470c items=0 ppid=2387 pid=2474 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.061000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:02.062000 audit[2479]: NETFILTER_CFG table=filter:65 family=10 entries=1 op=nft_register_chain pid=2479 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.062000 audit[2479]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffd022df3c0 a2=0 a3=7ffd022df3ac items=0 ppid=2387 pid=2479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.062000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Mar 17 19:41:02.068000 audit[2481]: NETFILTER_CFG table=filter:66 family=10 entries=2 op=nft_register_chain pid=2481 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.068000 audit[2481]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffe39ab3010 a2=0 a3=7ffe39ab2ffc items=0 ppid=2387 pid=2481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.068000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Mar 17 19:41:02.077000 audit[2484]: NETFILTER_CFG table=filter:67 family=10 entries=2 op=nft_register_chain pid=2484 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.077000 audit[2484]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffdb99d6950 a2=0 a3=7ffdb99d693c items=0 ppid=2387 pid=2484 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.077000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Mar 17 19:41:02.079000 audit[2485]: NETFILTER_CFG table=filter:68 family=10 entries=1 op=nft_register_chain pid=2485 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.079000 audit[2485]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd8d56d4b0 a2=0 a3=7ffd8d56d49c items=0 ppid=2387 pid=2485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.079000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Mar 17 19:41:02.083000 audit[2488]: NETFILTER_CFG table=filter:69 family=10 entries=1 op=nft_register_rule pid=2488 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.083000 audit[2488]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffdd8beb9f0 a2=0 a3=7ffdd8beb9dc items=0 ppid=2387 pid=2488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.083000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Mar 17 19:41:02.085000 audit[2489]: NETFILTER_CFG table=filter:70 family=10 entries=1 op=nft_register_chain pid=2489 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.085000 audit[2489]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcc549b6c0 a2=0 a3=7ffcc549b6ac items=0 ppid=2387 pid=2489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.085000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Mar 17 19:41:02.089000 audit[2491]: NETFILTER_CFG table=filter:71 family=10 entries=1 op=nft_register_rule pid=2491 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.089000 audit[2491]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffccc7456c0 a2=0 a3=7ffccc7456ac items=0 ppid=2387 pid=2491 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.089000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Mar 17 19:41:02.093000 audit[2494]: NETFILTER_CFG table=filter:72 family=10 entries=2 op=nft_register_chain pid=2494 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.093000 audit[2494]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7fffb1e3dd20 a2=0 
a3=7fffb1e3dd0c items=0 ppid=2387 pid=2494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.093000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Mar 17 19:41:02.094000 audit[2495]: NETFILTER_CFG table=filter:73 family=10 entries=1 op=nft_register_chain pid=2495 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.094000 audit[2495]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe3c691a30 a2=0 a3=7ffe3c691a1c items=0 ppid=2387 pid=2495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.094000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Mar 17 19:41:02.097000 audit[2497]: NETFILTER_CFG table=filter:74 family=10 entries=1 op=nft_register_rule pid=2497 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.097000 audit[2497]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffeb23896f0 a2=0 a3=7ffeb23896dc items=0 ppid=2387 pid=2497 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.097000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Mar 17 19:41:02.098000 audit[2498]: NETFILTER_CFG table=filter:75 family=10 entries=1 op=nft_register_chain pid=2498 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.098000 audit[2498]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffdbe78220 a2=0 a3=7fffdbe7820c items=0 ppid=2387 pid=2498 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.098000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Mar 17 19:41:02.102000 audit[2500]: NETFILTER_CFG table=filter:76 family=10 entries=1 op=nft_register_rule pid=2500 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.102000 audit[2500]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffca7d1d060 a2=0 a3=7ffca7d1d04c items=0 ppid=2387 pid=2500 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.102000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 19:41:02.107000 
audit[2503]: NETFILTER_CFG table=filter:77 family=10 entries=1 op=nft_register_rule pid=2503 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.107000 audit[2503]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffca1449910 a2=0 a3=7ffca14498fc items=0 ppid=2387 pid=2503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.107000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Mar 17 19:41:02.111000 audit[2506]: NETFILTER_CFG table=filter:78 family=10 entries=1 op=nft_register_rule pid=2506 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.111000 audit[2506]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffecb09ea10 a2=0 a3=7ffecb09e9fc items=0 ppid=2387 pid=2506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.111000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Mar 17 19:41:02.113000 audit[2507]: NETFILTER_CFG table=nat:79 family=10 entries=1 op=nft_register_chain pid=2507 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.113000 audit[2507]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd4048bc10 a2=0 a3=7ffd4048bbfc items=0 ppid=2387 pid=2507 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.113000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Mar 17 19:41:02.116000 audit[2509]: NETFILTER_CFG table=nat:80 family=10 entries=2 op=nft_register_chain pid=2509 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.116000 audit[2509]: SYSCALL arch=c000003e syscall=46 success=yes exit=600 a0=3 a1=7ffe94104500 a2=0 a3=7ffe941044ec items=0 ppid=2387 pid=2509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.116000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 19:41:02.120000 audit[2512]: NETFILTER_CFG table=nat:81 family=10 entries=2 op=nft_register_chain pid=2512 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.120000 audit[2512]: SYSCALL arch=c000003e syscall=46 success=yes exit=608 a0=3 a1=7ffe0bee2960 a2=0 a3=7ffe0bee294c items=0 ppid=2387 pid=2512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.120000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 19:41:02.121000 audit[2513]: NETFILTER_CFG table=nat:82 family=10 entries=1 op=nft_register_chain pid=2513 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.121000 audit[2513]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff591d7400 a2=0 a3=7fff591d73ec items=0 ppid=2387 pid=2513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.121000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Mar 17 19:41:02.125000 audit[2515]: NETFILTER_CFG table=nat:83 family=10 entries=2 op=nft_register_chain pid=2515 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.125000 audit[2515]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7fff0e102730 a2=0 a3=7fff0e10271c items=0 ppid=2387 pid=2515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.125000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Mar 17 19:41:02.128000 audit[2516]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=2516 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.128000 audit[2516]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff0d58cdc0 a2=0 a3=7fff0d58cdac items=0 ppid=2387 pid=2516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.128000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Mar 17 19:41:02.131000 audit[2518]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=2518 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.131000 audit[2518]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffdaf455db0 a2=0 a3=7ffdaf455d9c items=0 ppid=2387 pid=2518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.131000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 19:41:02.137000 audit[2521]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_rule pid=2521 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 19:41:02.137000 audit[2521]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffcb5090a10 a2=0 a3=7ffcb50909fc items=0 ppid=2387 pid=2521 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.137000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 19:41:02.143000 audit[2523]: NETFILTER_CFG table=filter:87 family=10 entries=3 op=nft_register_rule pid=2523 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Mar 17 19:41:02.143000 audit[2523]: SYSCALL arch=c000003e syscall=46 success=yes exit=2004 a0=3 a1=7ffc4cd54ec0 a2=0 a3=7ffc4cd54eac items=0 ppid=2387 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.143000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:02.146000 audit[2523]: NETFILTER_CFG table=nat:88 family=10 entries=7 op=nft_register_chain pid=2523 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Mar 17 19:41:02.146000 audit[2523]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffc4cd54ec0 a2=0 a3=7ffc4cd54eac items=0 ppid=2387 pid=2523 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:02.146000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:02.172225 kubelet[2215]: I0317 19:41:02.172135 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-phf6g" podStartSLOduration=1.172071212 podStartE2EDuration="1.172071212s" podCreationTimestamp="2025-03-17 19:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 19:41:02.171660841 +0000 UTC m=+14.281502082" watchObservedRunningTime="2025-03-17 19:41:02.172071212 +0000 UTC m=+14.281912453" Mar 17 19:41:03.621475 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount470131311.mount: Deactivated successfully. 
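The audit PROCTITLE fields in the entries above are the hex-encoded command lines of the kube-proxy iptables/ip6tables/iptables-restore invocations, with NUL bytes separating the arguments. As a minimal sketch for reading them (the decode_proctitle helper below is illustrative only; it is not part of the log or of any tool referenced in it):

# Decode an audit PROCTITLE value: the process argv, hex-encoded,
# with NUL bytes between arguments.
def decode_proctitle(hex_value: str) -> list[str]:
    raw = bytes.fromhex(hex_value)
    return [arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00") if arg]

# The first PROCTITLE of this run (the 19:41:01.993 entry) decodes to the
# chain-creation call:
# ['iptables', '-w', '5', '-W', '100000', '-N', 'KUBE-SERVICES', '-t', 'filter']
print(decode_proctitle(
    "69707461626C6573002D770035002D5700313030303030"
    "002D4E004B5542452D5345525649434553002D740066696C746572"))

The same decoding applies to every PROCTITLE value in this section, including the ip6tables and iptables-restore entries.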
Mar 17 19:41:04.622209 env[1250]: time="2025-03-17T19:41:04.622169139Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator:v1.36.2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:04.626366 env[1250]: time="2025-03-17T19:41:04.626341769Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:04.629734 env[1250]: time="2025-03-17T19:41:04.629713228Z" level=info msg="ImageUpdate event &ImageUpdate{Name:quay.io/tigera/operator:v1.36.2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:04.633159 env[1250]: time="2025-03-17T19:41:04.633137223Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:04.635039 env[1250]: time="2025-03-17T19:41:04.634930830Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Mar 17 19:41:04.647672 env[1250]: time="2025-03-17T19:41:04.647600799Z" level=info msg="CreateContainer within sandbox \"109dfc6d5e6c6047dc4d12b4d812adf541aa19d618e91cbeace4dada42eacedf\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 17 19:41:04.664985 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2744193249.mount: Deactivated successfully. Mar 17 19:41:04.671580 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2945311169.mount: Deactivated successfully. 
Mar 17 19:41:04.682005 env[1250]: time="2025-03-17T19:41:04.681942291Z" level=info msg="CreateContainer within sandbox \"109dfc6d5e6c6047dc4d12b4d812adf541aa19d618e91cbeace4dada42eacedf\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"f88f7c1685d4d0911112d5850558dd835eb229b47007f01b983e887aad082077\"" Mar 17 19:41:04.684106 env[1250]: time="2025-03-17T19:41:04.684082157Z" level=info msg="StartContainer for \"f88f7c1685d4d0911112d5850558dd835eb229b47007f01b983e887aad082077\"" Mar 17 19:41:04.752019 env[1250]: time="2025-03-17T19:41:04.751968767Z" level=info msg="StartContainer for \"f88f7c1685d4d0911112d5850558dd835eb229b47007f01b983e887aad082077\" returns successfully" Mar 17 19:41:08.051651 kubelet[2215]: I0317 19:41:08.051550 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-hjjck" podStartSLOduration=4.1243400040000004 podStartE2EDuration="7.05149719s" podCreationTimestamp="2025-03-17 19:41:01 +0000 UTC" firstStartedPulling="2025-03-17 19:41:01.710777793 +0000 UTC m=+13.820618984" lastFinishedPulling="2025-03-17 19:41:04.637934978 +0000 UTC m=+16.747776170" observedRunningTime="2025-03-17 19:41:05.192062107 +0000 UTC m=+17.301903349" watchObservedRunningTime="2025-03-17 19:41:08.05149719 +0000 UTC m=+20.161338431" Mar 17 19:41:08.054000 audit[2562]: NETFILTER_CFG table=filter:89 family=2 entries=15 op=nft_register_rule pid=2562 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:08.058854 kernel: kauditd_printk_skb: 143 callbacks suppressed Mar 17 19:41:08.058925 kernel: audit: type=1325 audit(1742240468.054:289): table=filter:89 family=2 entries=15 op=nft_register_rule pid=2562 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:08.054000 audit[2562]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffcff401080 a2=0 a3=7ffcff40106c items=0 ppid=2387 pid=2562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:08.071614 kernel: audit: type=1300 audit(1742240468.054:289): arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffcff401080 a2=0 a3=7ffcff40106c items=0 ppid=2387 pid=2562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:08.054000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:08.067000 audit[2562]: NETFILTER_CFG table=nat:90 family=2 entries=12 op=nft_register_rule pid=2562 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:08.082121 kernel: audit: type=1327 audit(1742240468.054:289): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:08.082221 kernel: audit: type=1325 audit(1742240468.067:290): table=nat:90 family=2 entries=12 op=nft_register_rule pid=2562 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:08.067000 audit[2562]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcff401080 a2=0 a3=0 items=0 ppid=2387 pid=2562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:08.091977 kernel: audit: type=1300 audit(1742240468.067:290): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffcff401080 a2=0 a3=0 items=0 ppid=2387 pid=2562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:08.067000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:08.096966 kernel: audit: type=1327 audit(1742240468.067:290): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:08.084000 audit[2564]: NETFILTER_CFG table=filter:91 family=2 entries=16 op=nft_register_rule pid=2564 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:08.104982 kernel: audit: type=1325 audit(1742240468.084:291): table=filter:91 family=2 entries=16 op=nft_register_rule pid=2564 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:08.084000 audit[2564]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffd080ff0b0 a2=0 a3=7ffd080ff09c items=0 ppid=2387 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:08.115969 kernel: audit: type=1300 audit(1742240468.084:291): arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffd080ff0b0 a2=0 a3=7ffd080ff09c items=0 ppid=2387 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:08.084000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:08.098000 audit[2564]: NETFILTER_CFG table=nat:92 family=2 entries=12 op=nft_register_rule pid=2564 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:08.132526 kernel: audit: type=1327 audit(1742240468.084:291): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:08.132619 kernel: audit: type=1325 audit(1742240468.098:292): table=nat:92 family=2 entries=12 op=nft_register_rule pid=2564 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:08.098000 audit[2564]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd080ff0b0 a2=0 a3=0 items=0 ppid=2387 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:08.098000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:08.296877 kubelet[2215]: I0317 19:41:08.296841 2215 topology_manager.go:215] "Topology Admit Handler" podUID="50943077-b08f-48d5-9baf-b982b799d438" podNamespace="calico-system" podName="calico-typha-5d5f5b6d95-nkx7z" Mar 17 19:41:08.361132 kubelet[2215]: I0317 19:41:08.361022 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/50943077-b08f-48d5-9baf-b982b799d438-tigera-ca-bundle\") pod \"calico-typha-5d5f5b6d95-nkx7z\" (UID: \"50943077-b08f-48d5-9baf-b982b799d438\") " pod="calico-system/calico-typha-5d5f5b6d95-nkx7z" Mar 17 19:41:08.361132 kubelet[2215]: I0317 19:41:08.361074 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkvmd\" (UniqueName: \"kubernetes.io/projected/50943077-b08f-48d5-9baf-b982b799d438-kube-api-access-lkvmd\") pod \"calico-typha-5d5f5b6d95-nkx7z\" (UID: \"50943077-b08f-48d5-9baf-b982b799d438\") " pod="calico-system/calico-typha-5d5f5b6d95-nkx7z" Mar 17 19:41:08.361132 kubelet[2215]: I0317 19:41:08.361098 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/50943077-b08f-48d5-9baf-b982b799d438-typha-certs\") pod \"calico-typha-5d5f5b6d95-nkx7z\" (UID: \"50943077-b08f-48d5-9baf-b982b799d438\") " pod="calico-system/calico-typha-5d5f5b6d95-nkx7z" Mar 17 19:41:08.404138 kubelet[2215]: I0317 19:41:08.404101 2215 topology_manager.go:215] "Topology Admit Handler" podUID="de0606a7-b6ba-40e9-84c3-11a88cadaf75" podNamespace="calico-system" podName="calico-node-2jq7c" Mar 17 19:41:08.462147 kubelet[2215]: I0317 19:41:08.462077 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/de0606a7-b6ba-40e9-84c3-11a88cadaf75-node-certs\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462147 kubelet[2215]: I0317 19:41:08.462140 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/de0606a7-b6ba-40e9-84c3-11a88cadaf75-cni-net-dir\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462359 kubelet[2215]: I0317 19:41:08.462166 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/de0606a7-b6ba-40e9-84c3-11a88cadaf75-var-run-calico\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462359 kubelet[2215]: I0317 19:41:08.462205 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/de0606a7-b6ba-40e9-84c3-11a88cadaf75-var-lib-calico\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462359 kubelet[2215]: I0317 19:41:08.462226 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/de0606a7-b6ba-40e9-84c3-11a88cadaf75-flexvol-driver-host\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462359 kubelet[2215]: I0317 19:41:08.462248 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/de0606a7-b6ba-40e9-84c3-11a88cadaf75-lib-modules\") pod \"calico-node-2jq7c\" (UID: 
\"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462359 kubelet[2215]: I0317 19:41:08.462284 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de0606a7-b6ba-40e9-84c3-11a88cadaf75-tigera-ca-bundle\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462505 kubelet[2215]: I0317 19:41:08.462365 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/de0606a7-b6ba-40e9-84c3-11a88cadaf75-xtables-lock\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462505 kubelet[2215]: I0317 19:41:08.462387 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/de0606a7-b6ba-40e9-84c3-11a88cadaf75-cni-bin-dir\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462505 kubelet[2215]: I0317 19:41:08.462405 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/de0606a7-b6ba-40e9-84c3-11a88cadaf75-cni-log-dir\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462505 kubelet[2215]: I0317 19:41:08.462426 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b84s\" (UniqueName: \"kubernetes.io/projected/de0606a7-b6ba-40e9-84c3-11a88cadaf75-kube-api-access-6b84s\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.462505 kubelet[2215]: I0317 19:41:08.462465 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/de0606a7-b6ba-40e9-84c3-11a88cadaf75-policysync\") pod \"calico-node-2jq7c\" (UID: \"de0606a7-b6ba-40e9-84c3-11a88cadaf75\") " pod="calico-system/calico-node-2jq7c" Mar 17 19:41:08.529759 kubelet[2215]: I0317 19:41:08.529726 2215 topology_manager.go:215] "Topology Admit Handler" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" podNamespace="calico-system" podName="csi-node-driver-ptzbm" Mar 17 19:41:08.530193 kubelet[2215]: E0317 19:41:08.530170 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" Mar 17 19:41:08.564301 kubelet[2215]: I0317 19:41:08.564271 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/b69f8cef-aedd-412b-a964-af34b5f74525-registration-dir\") pod \"csi-node-driver-ptzbm\" (UID: \"b69f8cef-aedd-412b-a964-af34b5f74525\") " pod="calico-system/csi-node-driver-ptzbm" Mar 17 19:41:08.566391 kubelet[2215]: E0317 19:41:08.566361 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: 
unexpected end of JSON input Mar 17 19:41:08.566505 kubelet[2215]: W0317 19:41:08.566490 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.566604 kubelet[2215]: E0317 19:41:08.566588 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.566870 kubelet[2215]: E0317 19:41:08.566857 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.566968 kubelet[2215]: W0317 19:41:08.566941 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.567031 kubelet[2215]: E0317 19:41:08.567020 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.568974 kubelet[2215]: E0317 19:41:08.568941 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.569094 kubelet[2215]: W0317 19:41:08.569080 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.569195 kubelet[2215]: E0317 19:41:08.569182 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.569550 kubelet[2215]: E0317 19:41:08.569515 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.569550 kubelet[2215]: W0317 19:41:08.569541 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.569674 kubelet[2215]: E0317 19:41:08.569567 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.569864 kubelet[2215]: E0317 19:41:08.569843 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.569864 kubelet[2215]: W0317 19:41:08.569864 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.569992 kubelet[2215]: E0317 19:41:08.569973 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:08.570141 kubelet[2215]: E0317 19:41:08.570123 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.570141 kubelet[2215]: W0317 19:41:08.570137 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.570261 kubelet[2215]: E0317 19:41:08.570245 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.570445 kubelet[2215]: E0317 19:41:08.570429 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.570445 kubelet[2215]: W0317 19:41:08.570442 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.570545 kubelet[2215]: E0317 19:41:08.570526 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.570701 kubelet[2215]: E0317 19:41:08.570672 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.570701 kubelet[2215]: W0317 19:41:08.570687 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.570701 kubelet[2215]: E0317 19:41:08.570700 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.570982 kubelet[2215]: E0317 19:41:08.570914 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.570982 kubelet[2215]: W0317 19:41:08.570926 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.570982 kubelet[2215]: E0317 19:41:08.570968 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.572179 kubelet[2215]: E0317 19:41:08.572159 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.572179 kubelet[2215]: W0317 19:41:08.572174 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.572314 kubelet[2215]: E0317 19:41:08.572185 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:08.572725 kubelet[2215]: E0317 19:41:08.572706 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.572725 kubelet[2215]: W0317 19:41:08.572720 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.572884 kubelet[2215]: E0317 19:41:08.572868 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.573573 kubelet[2215]: E0317 19:41:08.573545 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.573573 kubelet[2215]: W0317 19:41:08.573559 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.573687 kubelet[2215]: E0317 19:41:08.573652 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.573904 kubelet[2215]: E0317 19:41:08.573885 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.573904 kubelet[2215]: W0317 19:41:08.573899 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.574058 kubelet[2215]: E0317 19:41:08.574039 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.575624 kubelet[2215]: E0317 19:41:08.575592 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.575624 kubelet[2215]: W0317 19:41:08.575608 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.575767 kubelet[2215]: E0317 19:41:08.575745 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.575976 kubelet[2215]: E0317 19:41:08.575936 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.576029 kubelet[2215]: W0317 19:41:08.575973 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.576083 kubelet[2215]: E0317 19:41:08.576062 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:08.576705 kubelet[2215]: E0317 19:41:08.576682 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.576705 kubelet[2215]: W0317 19:41:08.576699 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.576842 kubelet[2215]: E0317 19:41:08.576817 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.577569 kubelet[2215]: E0317 19:41:08.577551 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.577569 kubelet[2215]: W0317 19:41:08.577564 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.577664 kubelet[2215]: E0317 19:41:08.577645 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.577821 kubelet[2215]: E0317 19:41:08.577805 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.577821 kubelet[2215]: W0317 19:41:08.577816 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.577920 kubelet[2215]: E0317 19:41:08.577900 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.578774 kubelet[2215]: E0317 19:41:08.578757 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.578774 kubelet[2215]: W0317 19:41:08.578770 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.578870 kubelet[2215]: E0317 19:41:08.578849 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:08.579543 kubelet[2215]: E0317 19:41:08.579524 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:08.579543 kubelet[2215]: W0317 19:41:08.579538 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:08.579639 kubelet[2215]: E0317 19:41:08.579614 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input"
Mar 17 19:41:08.580300 kubelet[2215]: E0317 19:41:08.580281 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 19:41:08.580300 kubelet[2215]: W0317 19:41:08.580294 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 19:41:08.580387 kubelet[2215]: E0317 19:41:08.580371 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Mar 17 19:41:08.584806 kubelet[2215]: I0317 19:41:08.584776 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qsl8k\" (UniqueName: \"kubernetes.io/projected/b69f8cef-aedd-412b-a964-af34b5f74525-kube-api-access-qsl8k\") pod \"csi-node-driver-ptzbm\" (UID: \"b69f8cef-aedd-412b-a964-af34b5f74525\") " pod="calico-system/csi-node-driver-ptzbm"
Mar 17 19:41:08.589365 kubelet[2215]: I0317 19:41:08.589314 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/b69f8cef-aedd-412b-a964-af34b5f74525-varrun\") pod \"csi-node-driver-ptzbm\" (UID: \"b69f8cef-aedd-412b-a964-af34b5f74525\") " pod="calico-system/csi-node-driver-ptzbm"
Mar 17 19:41:08.591471 kubelet[2215]: I0317 19:41:08.591325 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/b69f8cef-aedd-412b-a964-af34b5f74525-socket-dir\") pod \"csi-node-driver-ptzbm\" (UID: \"b69f8cef-aedd-412b-a964-af34b5f74525\") " pod="calico-system/csi-node-driver-ptzbm"
Mar 17 19:41:08.591830 kubelet[2215]: I0317 19:41:08.591803 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b69f8cef-aedd-412b-a964-af34b5f74525-kubelet-dir\") pod \"csi-node-driver-ptzbm\" (UID: \"b69f8cef-aedd-412b-a964-af34b5f74525\") " pod="calico-system/csi-node-driver-ptzbm"
Mar 17 19:41:08.600462 env[1250]: time="2025-03-17T19:41:08.600072566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d5f5b6d95-nkx7z,Uid:50943077-b08f-48d5-9baf-b982b799d438,Namespace:calico-system,Attempt:0,}"
Mar 17 19:41:08.660405 env[1250]: time="2025-03-17T19:41:08.660295445Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 19:41:08.660567 env[1250]: time="2025-03-17T19:41:08.660419267Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 19:41:08.660567 env[1250]: time="2025-03-17T19:41:08.660451818Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 19:41:08.660829 env[1250]: time="2025-03-17T19:41:08.660731243Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/6d47d5cfddb51d892abcd261b8a4b6d0d9f77792e080b3c5effcf3cd186eb4d5 pid=2635 runtime=io.containerd.runc.v2
Mar 17 19:41:08.710422 env[1250]: time="2025-03-17T19:41:08.710383984Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-2jq7c,Uid:de0606a7-b6ba-40e9-84c3-11a88cadaf75,Namespace:calico-system,Attempt:0,}"
Mar 17 19:41:08.734576 env[1250]: time="2025-03-17T19:41:08.734511835Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1
Mar 17 19:41:08.734755 env[1250]: time="2025-03-17T19:41:08.734730485Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1
Mar 17 19:41:08.734863 env[1250]: time="2025-03-17T19:41:08.734840481Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1
Mar 17 19:41:08.735768 env[1250]: time="2025-03-17T19:41:08.735705985Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/b0fa39a337560e0ba3a063bce0d5bfd9ef3286da2f35b1bb2ee8bec8c90071c1 pid=2696 runtime=io.containerd.runc.v2
Mar 17 19:41:08.778982 env[1250]: time="2025-03-17T19:41:08.778920545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-5d5f5b6d95-nkx7z,Uid:50943077-b08f-48d5-9baf-b982b799d438,Namespace:calico-system,Attempt:0,} returns sandbox id \"6d47d5cfddb51d892abcd261b8a4b6d0d9f77792e080b3c5effcf3cd186eb4d5\""
Mar 17 19:41:08.782111 env[1250]: time="2025-03-17T19:41:08.782077540Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\""
Mar 17 19:41:08.818689 env[1250]: time="2025-03-17T19:41:08.818626040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-2jq7c,Uid:de0606a7-b6ba-40e9-84c3-11a88cadaf75,Namespace:calico-system,Attempt:0,} returns sandbox id \"b0fa39a337560e0ba3a063bce0d5bfd9ef3286da2f35b1bb2ee8bec8c90071c1\""
Mar 17 19:41:09.112000 audit[2737]: NETFILTER_CFG table=filter:93 family=2 entries=17 op=nft_register_rule pid=2737 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Mar 17 19:41:09.112000 audit[2737]: SYSCALL arch=c000003e syscall=46 success=yes exit=6652 a0=3 a1=7ffe4dafeb20 a2=0 a3=7ffe4dafeb0c items=0 ppid=2387 pid=2737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Mar 17 19:41:09.112000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Mar 17 19:41:09.119000 audit[2737]: NETFILTER_CFG table=nat:94 family=2 entries=12 op=nft_register_rule pid=2737 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Mar 17 19:41:09.119000 audit[2737]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe4dafeb20 a2=0 a3=0 items=0 ppid=2387 pid=2737 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Mar 17 19:41:09.119000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Mar 17 19:41:10.033545 kubelet[2215]: E0317 19:41:10.033467 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525"
Mar 17 19:41:11.225774 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3992889006.mount: Deactivated successfully.
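The repeated driver-call.go and plugins.go messages above come from the kubelet probing its FlexVolume plugin directory: for each vendor~driver subdirectory it executes the driver binary with the argument init and parses stdout as JSON. Because /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds does not exist on this node, the call yields empty output and the unmarshal fails with "unexpected end of JSON input"; when no workload actually uses a FlexVolume, this is typically benign noise from a leftover plugin directory. The sketch below only illustrates that init handshake under the standard FlexVolume calling convention; it is not the real uds driver, and everything beyond init is a placeholder.

#!/usr/bin/env python3
# Hypothetical stand-in for a FlexVolume driver binary such as
# /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds.
# It shows only the init handshake that driver-call.go expects: a JSON
# object printed to stdout. Empty stdout is exactly what produces the
# "unexpected end of JSON input" errors in the log above.
import json
import sys

def main() -> int:
    op = sys.argv[1] if len(sys.argv) > 1 else ""
    if op == "init":
        # Report success and advertise that no attach/detach step is needed.
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
        return 0
    # Any other verb (attach, mount, unmount, ...) is declared unsupported in this sketch.
    print(json.dumps({"status": "Not supported", "message": "operation %r not implemented" % op}))
    return 1

if __name__ == "__main__":
    sys.exit(main())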
Mar 17 19:41:12.031183 kubelet[2215]: E0317 19:41:12.030424 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525"
Mar 17 19:41:12.939961 env[1250]: time="2025-03-17T19:41:12.939900823Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Mar 17 19:41:12.943091 env[1250]: time="2025-03-17T19:41:12.943055443Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Mar 17 19:41:12.945670 env[1250]: time="2025-03-17T19:41:12.945635394Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/typha:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Mar 17 19:41:12.948247 env[1250]: time="2025-03-17T19:41:12.948209986Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}"
Mar 17 19:41:12.948900 env[1250]: time="2025-03-17T19:41:12.948851540Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\""
Mar 17 19:41:12.950875 env[1250]: time="2025-03-17T19:41:12.950847596Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\""
Mar 17 19:41:12.965757 env[1250]: time="2025-03-17T19:41:12.965724773Z" level=info msg="CreateContainer within sandbox \"6d47d5cfddb51d892abcd261b8a4b6d0d9f77792e080b3c5effcf3cd186eb4d5\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}"
Mar 17 19:41:12.984114 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3627087804.mount: Deactivated successfully.
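The "cni plugin not initialized" sync errors for csi-node-driver-ptzbm persist until calico-node (whose sandbox was created above) installs a CNI network configuration; the runtime keeps reporting NetworkReady=false as long as its CNI configuration directory, conventionally /etc/cni/net.d, holds no usable network config. The snippet below is only a rough, hypothetical approximation of that readiness condition, assuming the default directory and file naming; it is not the kubelet's or containerd's actual check.

import glob
import os

# Hypothetical approximation of the NetworkReady condition: the runtime
# looks for a CNI network config in its conf dir and stays "not ready"
# until one appears, e.g. the 10-calico.conflist that calico-node's
# install step typically writes.
CNI_CONF_DIR = "/etc/cni/net.d"

def cni_initialized(conf_dir: str = CNI_CONF_DIR) -> bool:
    patterns = ("*.conflist", "*.conf", "*.json")
    return any(glob.glob(os.path.join(conf_dir, p)) for p in patterns)

if __name__ == "__main__":
    print("NetworkReady:", cni_initialized())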
Mar 17 19:41:12.997063 env[1250]: time="2025-03-17T19:41:12.997023217Z" level=info msg="CreateContainer within sandbox \"6d47d5cfddb51d892abcd261b8a4b6d0d9f77792e080b3c5effcf3cd186eb4d5\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"325844d893762ffcca5222ba91cb8a3686982a7829fdf0893e9e5f6103977e89\""
Mar 17 19:41:12.999009 env[1250]: time="2025-03-17T19:41:12.998973757Z" level=info msg="StartContainer for \"325844d893762ffcca5222ba91cb8a3686982a7829fdf0893e9e5f6103977e89\""
Mar 17 19:41:13.075484 env[1250]: time="2025-03-17T19:41:13.075443625Z" level=info msg="StartContainer for \"325844d893762ffcca5222ba91cb8a3686982a7829fdf0893e9e5f6103977e89\" returns successfully"
Mar 17 19:41:13.256178 kubelet[2215]: I0317 19:41:13.256130 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-5d5f5b6d95-nkx7z" podStartSLOduration=1.087320739 podStartE2EDuration="5.256100223s" podCreationTimestamp="2025-03-17 19:41:08 +0000 UTC" firstStartedPulling="2025-03-17 19:41:08.781454861 +0000 UTC m=+20.891296052" lastFinishedPulling="2025-03-17 19:41:12.950234345 +0000 UTC m=+25.060075536" observedRunningTime="2025-03-17 19:41:13.255283691 +0000 UTC m=+25.365124882" watchObservedRunningTime="2025-03-17 19:41:13.256100223 +0000 UTC m=+25.365941424"
Mar 17 19:41:13.287543 kubelet[2215]: E0317 19:41:13.287486 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 19:41:13.287543 kubelet[2215]: W0317 19:41:13.287511 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 19:41:13.287543 kubelet[2215]: E0317 19:41:13.287527 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
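The "Observed pod startup duration" entry above is consistent with podStartSLOduration being the end-to-end startup time with the image-pull window subtracted, which is what the logged numbers work out to (floating-point rounding aside). A small check, using only the values printed in that entry, with all timestamps expressed as seconds past 19:41:00:

# Reconstructing the figures from the pod_startup_latency_tracker entry above.
created       = 8.0            # podCreationTimestamp 19:41:08
first_pulling = 8.781454861    # firstStartedPulling
last_pulled   = 12.950234345   # lastFinishedPulling
observed_run  = 13.256100223   # watchObservedRunningTime

e2e = observed_run - created               # 5.256100223s  -> podStartE2EDuration
pull_window = last_pulled - first_pulling  # 4.168779484s pulling the typha image
slo = e2e - pull_window                    # 1.087320739   -> podStartSLOduration
print(e2e, pull_window, slo)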
Mar 17 19:41:13.337671 kubelet[2215]: E0317 19:41:13.337645 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Mar 17 19:41:13.337730 kubelet[2215]: W0317 19:41:13.337672 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Mar 17 19:41:13.337730 kubelet[2215]: E0317 19:41:13.337703 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Mar 17 19:41:13.338215 kubelet[2215]: E0317 19:41:13.338181 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.338287 kubelet[2215]: W0317 19:41:13.338213 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.338287 kubelet[2215]: E0317 19:41:13.338245 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:13.338557 kubelet[2215]: E0317 19:41:13.338508 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.338628 kubelet[2215]: W0317 19:41:13.338616 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.338716 kubelet[2215]: E0317 19:41:13.338703 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:13.339086 kubelet[2215]: E0317 19:41:13.339048 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.339086 kubelet[2215]: W0317 19:41:13.339082 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.339168 kubelet[2215]: E0317 19:41:13.339113 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:13.339517 kubelet[2215]: E0317 19:41:13.339491 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.339563 kubelet[2215]: W0317 19:41:13.339522 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.339642 kubelet[2215]: E0317 19:41:13.339617 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:13.340014 kubelet[2215]: E0317 19:41:13.339989 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.340075 kubelet[2215]: W0317 19:41:13.340019 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.340075 kubelet[2215]: E0317 19:41:13.340053 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:13.340356 kubelet[2215]: E0317 19:41:13.340332 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.340442 kubelet[2215]: W0317 19:41:13.340362 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.340442 kubelet[2215]: E0317 19:41:13.340393 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:13.340760 kubelet[2215]: E0317 19:41:13.340734 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.340812 kubelet[2215]: W0317 19:41:13.340765 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.340812 kubelet[2215]: E0317 19:41:13.340798 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:13.341110 kubelet[2215]: E0317 19:41:13.341099 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.341205 kubelet[2215]: W0317 19:41:13.341193 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.341277 kubelet[2215]: E0317 19:41:13.341265 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:13.341698 kubelet[2215]: E0317 19:41:13.341672 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.341745 kubelet[2215]: W0317 19:41:13.341702 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.341745 kubelet[2215]: E0317 19:41:13.341732 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:13.342077 kubelet[2215]: E0317 19:41:13.342053 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:13.342125 kubelet[2215]: W0317 19:41:13.342082 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:13.342125 kubelet[2215]: E0317 19:41:13.342103 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:14.031315 kubelet[2215]: E0317 19:41:14.031279 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" Mar 17 19:41:14.245788 kubelet[2215]: I0317 19:41:14.245763 2215 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 19:41:14.298420 kubelet[2215]: E0317 19:41:14.298252 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.298420 kubelet[2215]: W0317 19:41:14.298295 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.298420 kubelet[2215]: E0317 19:41:14.298327 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.299698 kubelet[2215]: E0317 19:41:14.299040 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.299698 kubelet[2215]: W0317 19:41:14.299125 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.299698 kubelet[2215]: E0317 19:41:14.299153 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.299698 kubelet[2215]: E0317 19:41:14.299457 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.299698 kubelet[2215]: W0317 19:41:14.299478 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.299698 kubelet[2215]: E0317 19:41:14.299503 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.300249 kubelet[2215]: E0317 19:41:14.299787 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.300249 kubelet[2215]: W0317 19:41:14.299807 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.300249 kubelet[2215]: E0317 19:41:14.299829 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:14.300249 kubelet[2215]: E0317 19:41:14.300197 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.300249 kubelet[2215]: W0317 19:41:14.300218 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.300249 kubelet[2215]: E0317 19:41:14.300240 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.300597 kubelet[2215]: E0317 19:41:14.300511 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.300597 kubelet[2215]: W0317 19:41:14.300530 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.300597 kubelet[2215]: E0317 19:41:14.300550 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.300863 kubelet[2215]: E0317 19:41:14.300834 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.300863 kubelet[2215]: W0317 19:41:14.300861 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.301257 kubelet[2215]: E0317 19:41:14.300912 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.301422 kubelet[2215]: E0317 19:41:14.301389 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.301422 kubelet[2215]: W0317 19:41:14.301419 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.301595 kubelet[2215]: E0317 19:41:14.301442 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.301797 kubelet[2215]: E0317 19:41:14.301769 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.301797 kubelet[2215]: W0317 19:41:14.301795 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.302012 kubelet[2215]: E0317 19:41:14.301815 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:14.302205 kubelet[2215]: E0317 19:41:14.302177 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.302205 kubelet[2215]: W0317 19:41:14.302203 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.302348 kubelet[2215]: E0317 19:41:14.302223 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.302613 kubelet[2215]: E0317 19:41:14.302581 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.302613 kubelet[2215]: W0317 19:41:14.302610 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.302803 kubelet[2215]: E0317 19:41:14.302632 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.302987 kubelet[2215]: E0317 19:41:14.302934 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.302987 kubelet[2215]: W0317 19:41:14.303004 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.303269 kubelet[2215]: E0317 19:41:14.303026 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.303469 kubelet[2215]: E0317 19:41:14.303437 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.303469 kubelet[2215]: W0317 19:41:14.303466 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.303645 kubelet[2215]: E0317 19:41:14.303489 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.303821 kubelet[2215]: E0317 19:41:14.303793 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.303821 kubelet[2215]: W0317 19:41:14.303820 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.304044 kubelet[2215]: E0317 19:41:14.303840 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:14.304240 kubelet[2215]: E0317 19:41:14.304211 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.304240 kubelet[2215]: W0317 19:41:14.304238 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.304400 kubelet[2215]: E0317 19:41:14.304261 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.342092 kubelet[2215]: E0317 19:41:14.342058 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.342274 kubelet[2215]: W0317 19:41:14.342246 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.342455 kubelet[2215]: E0317 19:41:14.342422 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.343110 kubelet[2215]: E0317 19:41:14.343085 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.343278 kubelet[2215]: W0317 19:41:14.343249 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.343474 kubelet[2215]: E0317 19:41:14.343443 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.343839 kubelet[2215]: E0317 19:41:14.343799 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.343839 kubelet[2215]: W0317 19:41:14.343837 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.344082 kubelet[2215]: E0317 19:41:14.343878 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.344277 kubelet[2215]: E0317 19:41:14.344241 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.344277 kubelet[2215]: W0317 19:41:14.344273 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.344449 kubelet[2215]: E0317 19:41:14.344295 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:14.344615 kubelet[2215]: E0317 19:41:14.344586 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.344615 kubelet[2215]: W0317 19:41:14.344613 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.344789 kubelet[2215]: E0317 19:41:14.344647 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.345146 kubelet[2215]: E0317 19:41:14.345113 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.345146 kubelet[2215]: W0317 19:41:14.345146 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.345470 kubelet[2215]: E0317 19:41:14.345437 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.345803 kubelet[2215]: E0317 19:41:14.345540 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.345803 kubelet[2215]: W0317 19:41:14.345794 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.346207 kubelet[2215]: E0317 19:41:14.346158 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.346207 kubelet[2215]: W0317 19:41:14.346190 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.346365 kubelet[2215]: E0317 19:41:14.346214 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.346552 kubelet[2215]: E0317 19:41:14.346504 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.346552 kubelet[2215]: W0317 19:41:14.346535 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.346705 kubelet[2215]: E0317 19:41:14.346555 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:14.346871 kubelet[2215]: E0317 19:41:14.346830 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.346871 kubelet[2215]: W0317 19:41:14.346861 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.347079 kubelet[2215]: E0317 19:41:14.346882 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.347360 kubelet[2215]: E0317 19:41:14.347328 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.347604 kubelet[2215]: E0317 19:41:14.347370 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.347604 kubelet[2215]: W0317 19:41:14.347600 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.347760 kubelet[2215]: E0317 19:41:14.347629 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.348325 kubelet[2215]: E0317 19:41:14.348300 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.348487 kubelet[2215]: W0317 19:41:14.348461 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.348675 kubelet[2215]: E0317 19:41:14.348646 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.349066 kubelet[2215]: E0317 19:41:14.349033 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.349066 kubelet[2215]: W0317 19:41:14.349063 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.349257 kubelet[2215]: E0317 19:41:14.349098 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:14.349511 kubelet[2215]: E0317 19:41:14.349476 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.349511 kubelet[2215]: W0317 19:41:14.349506 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.349774 kubelet[2215]: E0317 19:41:14.349539 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.349921 kubelet[2215]: E0317 19:41:14.349844 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.349921 kubelet[2215]: W0317 19:41:14.349865 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.349921 kubelet[2215]: E0317 19:41:14.349885 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.350424 kubelet[2215]: E0317 19:41:14.350226 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.350424 kubelet[2215]: W0317 19:41:14.350257 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.350424 kubelet[2215]: E0317 19:41:14.350277 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.351521 kubelet[2215]: E0317 19:41:14.351142 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.351521 kubelet[2215]: W0317 19:41:14.351177 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.351521 kubelet[2215]: E0317 19:41:14.351200 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 19:41:14.351836 kubelet[2215]: E0317 19:41:14.351788 2215 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 19:41:14.351836 kubelet[2215]: W0317 19:41:14.351820 2215 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 19:41:14.352181 kubelet[2215]: E0317 19:41:14.351842 2215 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 19:41:15.074037 env[1250]: time="2025-03-17T19:41:15.073923692Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:15.077443 env[1250]: time="2025-03-17T19:41:15.077374948Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:15.081002 env[1250]: time="2025-03-17T19:41:15.080846353Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:15.086725 env[1250]: time="2025-03-17T19:41:15.086665872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Mar 17 19:41:15.086939 env[1250]: time="2025-03-17T19:41:15.086892067Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:15.090615 env[1250]: time="2025-03-17T19:41:15.089537802Z" level=info msg="CreateContainer within sandbox \"b0fa39a337560e0ba3a063bce0d5bfd9ef3286da2f35b1bb2ee8bec8c90071c1\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 19:41:15.109635 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2630899993.mount: Deactivated successfully. 
Mar 17 19:41:15.119913 env[1250]: time="2025-03-17T19:41:15.119820046Z" level=info msg="CreateContainer within sandbox \"b0fa39a337560e0ba3a063bce0d5bfd9ef3286da2f35b1bb2ee8bec8c90071c1\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3b55c629681a101723686ad4915d67d6b9cb77bdfca5599003933fdff3897fab\"" Mar 17 19:41:15.122718 env[1250]: time="2025-03-17T19:41:15.120709976Z" level=info msg="StartContainer for \"3b55c629681a101723686ad4915d67d6b9cb77bdfca5599003933fdff3897fab\"" Mar 17 19:41:15.197982 env[1250]: time="2025-03-17T19:41:15.193326811Z" level=info msg="StartContainer for \"3b55c629681a101723686ad4915d67d6b9cb77bdfca5599003933fdff3897fab\" returns successfully" Mar 17 19:41:15.914131 env[1250]: time="2025-03-17T19:41:15.914001876Z" level=info msg="shim disconnected" id=3b55c629681a101723686ad4915d67d6b9cb77bdfca5599003933fdff3897fab Mar 17 19:41:15.914590 env[1250]: time="2025-03-17T19:41:15.914492076Z" level=warning msg="cleaning up after shim disconnected" id=3b55c629681a101723686ad4915d67d6b9cb77bdfca5599003933fdff3897fab namespace=k8s.io Mar 17 19:41:15.914753 env[1250]: time="2025-03-17T19:41:15.914719603Z" level=info msg="cleaning up dead shim" Mar 17 19:41:15.933085 env[1250]: time="2025-03-17T19:41:15.933013461Z" level=warning msg="cleanup warnings time=\"2025-03-17T19:41:15Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=2901 runtime=io.containerd.runc.v2\n" Mar 17 19:41:16.032995 kubelet[2215]: E0317 19:41:16.030808 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" Mar 17 19:41:16.108658 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3b55c629681a101723686ad4915d67d6b9cb77bdfca5599003933fdff3897fab-rootfs.mount: Deactivated successfully. 
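The recurring "network is not ready ... cni plugin not initialized" condition in these entries means the container runtime has no CNI network configuration to load yet, so the kubelet keeps skipping pods such as calico-system/csi-node-driver-ptzbm. A rough sketch of that readiness condition, assuming the common containerd defaults (configs read from /etc/cni/net.d with .conf, .conflist or .json extensions) rather than anything taken from this host's runtime configuration:

# Illustrative check mirroring why the kubelet keeps reporting
# "NetworkReady=false ... cni plugin not initialized": the runtime has no
# network config to load yet. Directory and extensions are assumed defaults.
from pathlib import Path

CNI_CONF_DIR = Path("/etc/cni/net.d")

def cni_config_present() -> bool:
    # True once at least one CNI config file exists in the conf directory.
    if not CNI_CONF_DIR.is_dir():
        return False
    return any(
        p.suffix in {".conf", ".conflist", ".json"}
        for p in CNI_CONF_DIR.iterdir()
        if p.is_file()
    )

if __name__ == "__main__":
    print("CNI config found" if cni_config_present() else "no network config found in /etc/cni/net.d")

The calico/cni image pulled in the next entries carries the install-cni step that is expected to populate this directory, so the skipped pods keep being retried until it completes.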
Mar 17 19:41:16.264481 env[1250]: time="2025-03-17T19:41:16.261099122Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Mar 17 19:41:18.033920 kubelet[2215]: E0317 19:41:18.032157 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" Mar 17 19:41:20.032003 kubelet[2215]: E0317 19:41:20.031422 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" Mar 17 19:41:22.033575 kubelet[2215]: E0317 19:41:22.033500 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" Mar 17 19:41:23.546124 env[1250]: time="2025-03-17T19:41:23.546088805Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:23.549384 env[1250]: time="2025-03-17T19:41:23.549360854Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:23.552252 env[1250]: time="2025-03-17T19:41:23.552230047Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/cni:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:23.555158 env[1250]: time="2025-03-17T19:41:23.555133294Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:23.556182 env[1250]: time="2025-03-17T19:41:23.556158799Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Mar 17 19:41:23.562532 env[1250]: time="2025-03-17T19:41:23.562461204Z" level=info msg="CreateContainer within sandbox \"b0fa39a337560e0ba3a063bce0d5bfd9ef3286da2f35b1bb2ee8bec8c90071c1\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 19:41:23.581116 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1632569527.mount: Deactivated successfully. 
Mar 17 19:41:23.591329 env[1250]: time="2025-03-17T19:41:23.591275719Z" level=info msg="CreateContainer within sandbox \"b0fa39a337560e0ba3a063bce0d5bfd9ef3286da2f35b1bb2ee8bec8c90071c1\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"b5c50e4f507f769a849da3fb0381b81b1b0875d83061a882b9277dbbd5333d91\"" Mar 17 19:41:23.594162 env[1250]: time="2025-03-17T19:41:23.594104898Z" level=info msg="StartContainer for \"b5c50e4f507f769a849da3fb0381b81b1b0875d83061a882b9277dbbd5333d91\"" Mar 17 19:41:23.681451 env[1250]: time="2025-03-17T19:41:23.680472791Z" level=info msg="StartContainer for \"b5c50e4f507f769a849da3fb0381b81b1b0875d83061a882b9277dbbd5333d91\" returns successfully" Mar 17 19:41:24.031277 kubelet[2215]: E0317 19:41:24.031219 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" Mar 17 19:41:24.915606 env[1250]: time="2025-03-17T19:41:24.915502960Z" level=error msg="failed to reload cni configuration after receiving fs change event(\"/etc/cni/net.d/calico-kubeconfig\": WRITE)" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 19:41:24.971373 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-b5c50e4f507f769a849da3fb0381b81b1b0875d83061a882b9277dbbd5333d91-rootfs.mount: Deactivated successfully. Mar 17 19:41:24.989286 kubelet[2215]: I0317 19:41:24.989252 2215 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 17 19:41:25.586297 kubelet[2215]: I0317 19:41:25.586224 2215 topology_manager.go:215] "Topology Admit Handler" podUID="88261017-01e9-4199-99b8-fa205595cc28" podNamespace="kube-system" podName="coredns-7db6d8ff4d-tm2v7" Mar 17 19:41:25.615105 kubelet[2215]: I0317 19:41:25.615049 2215 topology_manager.go:215] "Topology Admit Handler" podUID="7d670e19-a00c-4881-a611-273cacfe43fc" podNamespace="kube-system" podName="coredns-7db6d8ff4d-j947x" Mar 17 19:41:25.615729 kubelet[2215]: I0317 19:41:25.615689 2215 topology_manager.go:215] "Topology Admit Handler" podUID="0357a441-9a17-439b-997f-9e0244a116fd" podNamespace="calico-system" podName="calico-kube-controllers-959455b4-2tg7d" Mar 17 19:41:25.617772 kubelet[2215]: I0317 19:41:25.617730 2215 topology_manager.go:215] "Topology Admit Handler" podUID="7ae34508-787e-4967-b222-10f79da4690c" podNamespace="calico-apiserver" podName="calico-apiserver-6657945b8c-w9rz5" Mar 17 19:41:25.618233 kubelet[2215]: I0317 19:41:25.618193 2215 topology_manager.go:215] "Topology Admit Handler" podUID="bb9bdd56-e015-48b4-a9cb-554b7a129746" podNamespace="calico-apiserver" podName="calico-apiserver-6657945b8c-wnzqx" Mar 17 19:41:25.631077 kubelet[2215]: I0317 19:41:25.626188 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/88261017-01e9-4199-99b8-fa205595cc28-config-volume\") pod \"coredns-7db6d8ff4d-tm2v7\" (UID: \"88261017-01e9-4199-99b8-fa205595cc28\") " pod="kube-system/coredns-7db6d8ff4d-tm2v7" Mar 17 19:41:25.631077 kubelet[2215]: I0317 19:41:25.626271 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58cp5\" (UniqueName: 
\"kubernetes.io/projected/88261017-01e9-4199-99b8-fa205595cc28-kube-api-access-58cp5\") pod \"coredns-7db6d8ff4d-tm2v7\" (UID: \"88261017-01e9-4199-99b8-fa205595cc28\") " pod="kube-system/coredns-7db6d8ff4d-tm2v7" Mar 17 19:41:25.727528 kubelet[2215]: I0317 19:41:25.727431 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d670e19-a00c-4881-a611-273cacfe43fc-config-volume\") pod \"coredns-7db6d8ff4d-j947x\" (UID: \"7d670e19-a00c-4881-a611-273cacfe43fc\") " pod="kube-system/coredns-7db6d8ff4d-j947x" Mar 17 19:41:26.036001 kubelet[2215]: I0317 19:41:25.727740 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwkjj\" (UniqueName: \"kubernetes.io/projected/7d670e19-a00c-4881-a611-273cacfe43fc-kube-api-access-fwkjj\") pod \"coredns-7db6d8ff4d-j947x\" (UID: \"7d670e19-a00c-4881-a611-273cacfe43fc\") " pod="kube-system/coredns-7db6d8ff4d-j947x" Mar 17 19:41:26.036001 kubelet[2215]: I0317 19:41:25.728633 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49chd\" (UniqueName: \"kubernetes.io/projected/7ae34508-787e-4967-b222-10f79da4690c-kube-api-access-49chd\") pod \"calico-apiserver-6657945b8c-w9rz5\" (UID: \"7ae34508-787e-4967-b222-10f79da4690c\") " pod="calico-apiserver/calico-apiserver-6657945b8c-w9rz5" Mar 17 19:41:26.036001 kubelet[2215]: I0317 19:41:25.728812 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xnpd\" (UniqueName: \"kubernetes.io/projected/bb9bdd56-e015-48b4-a9cb-554b7a129746-kube-api-access-2xnpd\") pod \"calico-apiserver-6657945b8c-wnzqx\" (UID: \"bb9bdd56-e015-48b4-a9cb-554b7a129746\") " pod="calico-apiserver/calico-apiserver-6657945b8c-wnzqx" Mar 17 19:41:26.036001 kubelet[2215]: I0317 19:41:25.729061 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0357a441-9a17-439b-997f-9e0244a116fd-tigera-ca-bundle\") pod \"calico-kube-controllers-959455b4-2tg7d\" (UID: \"0357a441-9a17-439b-997f-9e0244a116fd\") " pod="calico-system/calico-kube-controllers-959455b4-2tg7d" Mar 17 19:41:26.036001 kubelet[2215]: I0317 19:41:25.729234 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2vcz\" (UniqueName: \"kubernetes.io/projected/0357a441-9a17-439b-997f-9e0244a116fd-kube-api-access-v2vcz\") pod \"calico-kube-controllers-959455b4-2tg7d\" (UID: \"0357a441-9a17-439b-997f-9e0244a116fd\") " pod="calico-system/calico-kube-controllers-959455b4-2tg7d" Mar 17 19:41:26.036502 kubelet[2215]: I0317 19:41:25.729359 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/bb9bdd56-e015-48b4-a9cb-554b7a129746-calico-apiserver-certs\") pod \"calico-apiserver-6657945b8c-wnzqx\" (UID: \"bb9bdd56-e015-48b4-a9cb-554b7a129746\") " pod="calico-apiserver/calico-apiserver-6657945b8c-wnzqx" Mar 17 19:41:26.036502 kubelet[2215]: I0317 19:41:25.729463 2215 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7ae34508-787e-4967-b222-10f79da4690c-calico-apiserver-certs\") pod \"calico-apiserver-6657945b8c-w9rz5\" (UID: 
\"7ae34508-787e-4967-b222-10f79da4690c\") " pod="calico-apiserver/calico-apiserver-6657945b8c-w9rz5" Mar 17 19:41:26.109420 env[1250]: time="2025-03-17T19:41:26.108890499Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ptzbm,Uid:b69f8cef-aedd-412b-a964-af34b5f74525,Namespace:calico-system,Attempt:0,}" Mar 17 19:41:26.162047 env[1250]: time="2025-03-17T19:41:26.161905578Z" level=info msg="shim disconnected" id=b5c50e4f507f769a849da3fb0381b81b1b0875d83061a882b9277dbbd5333d91 Mar 17 19:41:26.162196 env[1250]: time="2025-03-17T19:41:26.162042485Z" level=warning msg="cleaning up after shim disconnected" id=b5c50e4f507f769a849da3fb0381b81b1b0875d83061a882b9277dbbd5333d91 namespace=k8s.io Mar 17 19:41:26.162196 env[1250]: time="2025-03-17T19:41:26.162088182Z" level=info msg="cleaning up dead shim" Mar 17 19:41:26.174730 env[1250]: time="2025-03-17T19:41:26.174673354Z" level=warning msg="cleanup warnings time=\"2025-03-17T19:41:26Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=2980 runtime=io.containerd.runc.v2\n" Mar 17 19:41:26.215802 env[1250]: time="2025-03-17T19:41:26.215720389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-tm2v7,Uid:88261017-01e9-4199-99b8-fa205595cc28,Namespace:kube-system,Attempt:0,}" Mar 17 19:41:26.222876 env[1250]: time="2025-03-17T19:41:26.222841180Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-j947x,Uid:7d670e19-a00c-4881-a611-273cacfe43fc,Namespace:kube-system,Attempt:0,}" Mar 17 19:41:26.226812 env[1250]: time="2025-03-17T19:41:26.226776343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6657945b8c-wnzqx,Uid:bb9bdd56-e015-48b4-a9cb-554b7a129746,Namespace:calico-apiserver,Attempt:0,}" Mar 17 19:41:26.246104 env[1250]: time="2025-03-17T19:41:26.246029157Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-959455b4-2tg7d,Uid:0357a441-9a17-439b-997f-9e0244a116fd,Namespace:calico-system,Attempt:0,}" Mar 17 19:41:26.246617 env[1250]: time="2025-03-17T19:41:26.246557578Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6657945b8c-w9rz5,Uid:7ae34508-787e-4967-b222-10f79da4690c,Namespace:calico-apiserver,Attempt:0,}" Mar 17 19:41:26.288312 env[1250]: time="2025-03-17T19:41:26.286582736Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Mar 17 19:41:26.472061 env[1250]: time="2025-03-17T19:41:26.471898967Z" level=error msg="Failed to destroy network for sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.473225 env[1250]: time="2025-03-17T19:41:26.473151638Z" level=error msg="encountered an error cleaning up failed sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.473520 env[1250]: time="2025-03-17T19:41:26.473445900Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ptzbm,Uid:b69f8cef-aedd-412b-a964-af34b5f74525,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox 
\"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.474225 kubelet[2215]: E0317 19:41:26.474130 2215 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.474376 kubelet[2215]: E0317 19:41:26.474250 2215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ptzbm" Mar 17 19:41:26.474376 kubelet[2215]: E0317 19:41:26.474307 2215 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-ptzbm" Mar 17 19:41:26.474532 kubelet[2215]: E0317 19:41:26.474357 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-ptzbm_calico-system(b69f8cef-aedd-412b-a964-af34b5f74525)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-ptzbm_calico-system(b69f8cef-aedd-412b-a964-af34b5f74525)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" Mar 17 19:41:26.742798 env[1250]: time="2025-03-17T19:41:26.742697767Z" level=error msg="Failed to destroy network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.744155 env[1250]: time="2025-03-17T19:41:26.744032622Z" level=error msg="encountered an error cleaning up failed sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.744491 env[1250]: time="2025-03-17T19:41:26.744416111Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-tm2v7,Uid:88261017-01e9-4199-99b8-fa205595cc28,Namespace:kube-system,Attempt:0,} failed, error" 
error="failed to setup network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.745249 kubelet[2215]: E0317 19:41:26.744993 2215 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.745249 kubelet[2215]: E0317 19:41:26.745100 2215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-tm2v7" Mar 17 19:41:26.746394 kubelet[2215]: E0317 19:41:26.745920 2215 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-tm2v7" Mar 17 19:41:26.746394 kubelet[2215]: E0317 19:41:26.746100 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-tm2v7_kube-system(88261017-01e9-4199-99b8-fa205595cc28)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-tm2v7_kube-system(88261017-01e9-4199-99b8-fa205595cc28)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-tm2v7" podUID="88261017-01e9-4199-99b8-fa205595cc28" Mar 17 19:41:26.963082 env[1250]: time="2025-03-17T19:41:26.963020390Z" level=error msg="Failed to destroy network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.963397 env[1250]: time="2025-03-17T19:41:26.963352433Z" level=error msg="encountered an error cleaning up failed sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.963455 env[1250]: time="2025-03-17T19:41:26.963404190Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-6657945b8c-wnzqx,Uid:bb9bdd56-e015-48b4-a9cb-554b7a129746,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.964523 kubelet[2215]: E0317 19:41:26.963662 2215 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:26.964523 kubelet[2215]: E0317 19:41:26.963719 2215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6657945b8c-wnzqx" Mar 17 19:41:26.964523 kubelet[2215]: E0317 19:41:26.963742 2215 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6657945b8c-wnzqx" Mar 17 19:41:26.964648 kubelet[2215]: E0317 19:41:26.963787 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6657945b8c-wnzqx_calico-apiserver(bb9bdd56-e015-48b4-a9cb-554b7a129746)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6657945b8c-wnzqx_calico-apiserver(bb9bdd56-e015-48b4-a9cb-554b7a129746)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6657945b8c-wnzqx" podUID="bb9bdd56-e015-48b4-a9cb-554b7a129746" Mar 17 19:41:27.019107 env[1250]: time="2025-03-17T19:41:27.018270183Z" level=error msg="Failed to destroy network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.019107 env[1250]: time="2025-03-17T19:41:27.018638984Z" level=error msg="encountered an error cleaning up failed sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 17 19:41:27.019107 env[1250]: time="2025-03-17T19:41:27.018683307Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-j947x,Uid:7d670e19-a00c-4881-a611-273cacfe43fc,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.020528 kubelet[2215]: E0317 19:41:27.018914 2215 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.020528 kubelet[2215]: E0317 19:41:27.019005 2215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-j947x" Mar 17 19:41:27.020528 kubelet[2215]: E0317 19:41:27.019031 2215 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-j947x" Mar 17 19:41:27.020689 kubelet[2215]: E0317 19:41:27.019091 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-j947x_kube-system(7d670e19-a00c-4881-a611-273cacfe43fc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-j947x_kube-system(7d670e19-a00c-4881-a611-273cacfe43fc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-j947x" podUID="7d670e19-a00c-4881-a611-273cacfe43fc" Mar 17 19:41:27.040152 env[1250]: time="2025-03-17T19:41:27.040071737Z" level=error msg="Failed to destroy network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.040617 env[1250]: time="2025-03-17T19:41:27.040588377Z" level=error msg="encountered an error cleaning up failed sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.040737 env[1250]: time="2025-03-17T19:41:27.040708403Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-959455b4-2tg7d,Uid:0357a441-9a17-439b-997f-9e0244a116fd,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.041155 kubelet[2215]: E0317 19:41:27.041103 2215 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.041224 kubelet[2215]: E0317 19:41:27.041180 2215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-959455b4-2tg7d" Mar 17 19:41:27.041224 kubelet[2215]: E0317 19:41:27.041204 2215 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-959455b4-2tg7d" Mar 17 19:41:27.041316 kubelet[2215]: E0317 19:41:27.041285 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-959455b4-2tg7d_calico-system(0357a441-9a17-439b-997f-9e0244a116fd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-959455b4-2tg7d_calico-system(0357a441-9a17-439b-997f-9e0244a116fd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-959455b4-2tg7d" podUID="0357a441-9a17-439b-997f-9e0244a116fd" Mar 17 19:41:27.067391 env[1250]: time="2025-03-17T19:41:27.067331214Z" level=error msg="Failed to destroy network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.069808 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41-shm.mount: Deactivated successfully. 
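Every sandbox failure in the entries above has the same root cause: before adding a pod network, the Calico CNI plugin stats /var/lib/calico/nodename on the host, and that file only exists once the calico/node container is running and has bind-mounted /var/lib/calico/. The sketch below (Python, written purely for illustration; the real check lives inside the Calico CNI plugin, which is Go) mirrors the precondition that the logged error message describes.

    #!/usr/bin/env python3
    # Illustrative sketch of the precondition the Calico CNI plugin reports on:
    # /var/lib/calico/nodename must exist and contain the node name written by
    # the calico/node container. Not the plugin's actual implementation.
    import sys

    NODENAME_FILE = "/var/lib/calico/nodename"

    def calico_node_ready() -> bool:
        try:
            with open(NODENAME_FILE) as f:
                nodename = f.read().strip()
        except FileNotFoundError:
            # Exactly the condition in the log:
            # "stat /var/lib/calico/nodename: no such file or directory"
            print(f"{NODENAME_FILE} missing: calico/node is not running or has "
                  "not mounted /var/lib/calico/", file=sys.stderr)
            return False
        print(f"calico/node has registered this host as {nodename!r}")
        return True

    if __name__ == "__main__":
        sys.exit(0 if calico_node_ready() else 1)

Once calico-node starts (see the PullImage and StartContainer entries later in the log), the file is written and the kubelet's periodic sandbox retries can pass this check.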
Mar 17 19:41:27.071091 env[1250]: time="2025-03-17T19:41:27.071054338Z" level=error msg="encountered an error cleaning up failed sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.071159 env[1250]: time="2025-03-17T19:41:27.071112908Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6657945b8c-w9rz5,Uid:7ae34508-787e-4967-b222-10f79da4690c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.071425 kubelet[2215]: E0317 19:41:27.071385 2215 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.071484 kubelet[2215]: E0317 19:41:27.071465 2215 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6657945b8c-w9rz5" Mar 17 19:41:27.071518 kubelet[2215]: E0317 19:41:27.071490 2215 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6657945b8c-w9rz5" Mar 17 19:41:27.071595 kubelet[2215]: E0317 19:41:27.071566 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6657945b8c-w9rz5_calico-apiserver(7ae34508-787e-4967-b222-10f79da4690c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6657945b8c-w9rz5_calico-apiserver(7ae34508-787e-4967-b222-10f79da4690c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6657945b8c-w9rz5" podUID="7ae34508-787e-4967-b222-10f79da4690c" Mar 17 19:41:27.293052 kubelet[2215]: I0317 19:41:27.289538 2215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:27.293326 env[1250]: 
time="2025-03-17T19:41:27.291537770Z" level=info msg="StopPodSandbox for \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\"" Mar 17 19:41:27.297394 kubelet[2215]: I0317 19:41:27.297334 2215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:27.300124 env[1250]: time="2025-03-17T19:41:27.299910729Z" level=info msg="StopPodSandbox for \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\"" Mar 17 19:41:27.302920 kubelet[2215]: I0317 19:41:27.302854 2215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:27.304491 env[1250]: time="2025-03-17T19:41:27.304280048Z" level=info msg="StopPodSandbox for \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\"" Mar 17 19:41:27.307922 kubelet[2215]: I0317 19:41:27.306269 2215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:27.307922 kubelet[2215]: I0317 19:41:27.307034 2215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:27.308184 env[1250]: time="2025-03-17T19:41:27.307770847Z" level=info msg="StopPodSandbox for \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\"" Mar 17 19:41:27.312676 env[1250]: time="2025-03-17T19:41:27.312155663Z" level=info msg="StopPodSandbox for \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\"" Mar 17 19:41:27.341560 kubelet[2215]: I0317 19:41:27.341380 2215 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:27.349997 env[1250]: time="2025-03-17T19:41:27.346226579Z" level=info msg="StopPodSandbox for \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\"" Mar 17 19:41:27.454222 env[1250]: time="2025-03-17T19:41:27.454143879Z" level=error msg="StopPodSandbox for \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\" failed" error="failed to destroy network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.454449 kubelet[2215]: E0317 19:41:27.454408 2215 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:27.454522 kubelet[2215]: E0317 19:41:27.454469 2215 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70"} Mar 17 19:41:27.454559 kubelet[2215]: E0317 19:41:27.454533 2215 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"bb9bdd56-e015-48b4-a9cb-554b7a129746\" 
with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 19:41:27.454635 kubelet[2215]: E0317 19:41:27.454562 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"bb9bdd56-e015-48b4-a9cb-554b7a129746\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6657945b8c-wnzqx" podUID="bb9bdd56-e015-48b4-a9cb-554b7a129746" Mar 17 19:41:27.493135 env[1250]: time="2025-03-17T19:41:27.493071166Z" level=error msg="StopPodSandbox for \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\" failed" error="failed to destroy network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.493347 kubelet[2215]: E0317 19:41:27.493306 2215 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:27.493437 kubelet[2215]: E0317 19:41:27.493360 2215 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1"} Mar 17 19:41:27.493437 kubelet[2215]: E0317 19:41:27.493403 2215 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"0357a441-9a17-439b-997f-9e0244a116fd\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 19:41:27.493560 kubelet[2215]: E0317 19:41:27.493431 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"0357a441-9a17-439b-997f-9e0244a116fd\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-959455b4-2tg7d" podUID="0357a441-9a17-439b-997f-9e0244a116fd" Mar 17 19:41:27.499548 env[1250]: time="2025-03-17T19:41:27.499471063Z" 
level=error msg="StopPodSandbox for \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\" failed" error="failed to destroy network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.499893 kubelet[2215]: E0317 19:41:27.499855 2215 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:27.499982 kubelet[2215]: E0317 19:41:27.499908 2215 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0"} Mar 17 19:41:27.499982 kubelet[2215]: E0317 19:41:27.499967 2215 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"88261017-01e9-4199-99b8-fa205595cc28\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 19:41:27.500091 kubelet[2215]: E0317 19:41:27.500011 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"88261017-01e9-4199-99b8-fa205595cc28\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-tm2v7" podUID="88261017-01e9-4199-99b8-fa205595cc28" Mar 17 19:41:27.507090 env[1250]: time="2025-03-17T19:41:27.507020428Z" level=error msg="StopPodSandbox for \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\" failed" error="failed to destroy network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.507263 kubelet[2215]: E0317 19:41:27.507222 2215 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:27.507332 kubelet[2215]: E0317 19:41:27.507271 2215 kuberuntime_manager.go:1375] "Failed to stop sandbox" 
podSandboxID={"Type":"containerd","ID":"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f"} Mar 17 19:41:27.507332 kubelet[2215]: E0317 19:41:27.507305 2215 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7d670e19-a00c-4881-a611-273cacfe43fc\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 19:41:27.507438 kubelet[2215]: E0317 19:41:27.507330 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7d670e19-a00c-4881-a611-273cacfe43fc\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-j947x" podUID="7d670e19-a00c-4881-a611-273cacfe43fc" Mar 17 19:41:27.514465 env[1250]: time="2025-03-17T19:41:27.514408741Z" level=error msg="StopPodSandbox for \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\" failed" error="failed to destroy network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.514694 kubelet[2215]: E0317 19:41:27.514653 2215 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:27.514773 kubelet[2215]: E0317 19:41:27.514698 2215 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41"} Mar 17 19:41:27.514773 kubelet[2215]: E0317 19:41:27.514733 2215 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7ae34508-787e-4967-b222-10f79da4690c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 19:41:27.514773 kubelet[2215]: E0317 19:41:27.514763 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7ae34508-787e-4967-b222-10f79da4690c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\\\": plugin type=\\\"calico\\\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6657945b8c-w9rz5" podUID="7ae34508-787e-4967-b222-10f79da4690c" Mar 17 19:41:27.525260 env[1250]: time="2025-03-17T19:41:27.525195249Z" level=error msg="StopPodSandbox for \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\" failed" error="failed to destroy network for sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 19:41:27.525493 kubelet[2215]: E0317 19:41:27.525434 2215 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:27.525493 kubelet[2215]: E0317 19:41:27.525483 2215 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4"} Mar 17 19:41:27.525599 kubelet[2215]: E0317 19:41:27.525517 2215 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"b69f8cef-aedd-412b-a964-af34b5f74525\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 19:41:27.525599 kubelet[2215]: E0317 19:41:27.525542 2215 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"b69f8cef-aedd-412b-a964-af34b5f74525\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-ptzbm" podUID="b69f8cef-aedd-412b-a964-af34b5f74525" Mar 17 19:41:34.302548 kubelet[2215]: I0317 19:41:34.302343 2215 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 19:41:34.350000 audit[3306]: NETFILTER_CFG table=filter:95 family=2 entries=17 op=nft_register_rule pid=3306 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:34.352643 kernel: kauditd_printk_skb: 8 callbacks suppressed Mar 17 19:41:34.352760 kernel: audit: type=1325 audit(1742240494.350:295): table=filter:95 family=2 entries=17 op=nft_register_rule pid=3306 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:34.350000 audit[3306]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffe60305920 a2=0 a3=7ffe6030590c items=0 ppid=2387 pid=3306 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:34.363863 kernel: audit: type=1300 audit(1742240494.350:295): arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffe60305920 a2=0 a3=7ffe6030590c items=0 ppid=2387 pid=3306 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:34.350000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:34.369033 kernel: audit: type=1327 audit(1742240494.350:295): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:34.368000 audit[3306]: NETFILTER_CFG table=nat:96 family=2 entries=19 op=nft_register_chain pid=3306 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:34.368000 audit[3306]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffe60305920 a2=0 a3=7ffe6030590c items=0 ppid=2387 pid=3306 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:34.380462 kernel: audit: type=1325 audit(1742240494.368:296): table=nat:96 family=2 entries=19 op=nft_register_chain pid=3306 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:34.380503 kernel: audit: type=1300 audit(1742240494.368:296): arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7ffe60305920 a2=0 a3=7ffe6030590c items=0 ppid=2387 pid=3306 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:34.368000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:34.384174 kernel: audit: type=1327 audit(1742240494.368:296): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:37.305320 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2891327991.mount: Deactivated successfully. 
Mar 17 19:41:37.366028 env[1250]: time="2025-03-17T19:41:37.365912388Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:37.369319 env[1250]: time="2025-03-17T19:41:37.369260359Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:37.372201 env[1250]: time="2025-03-17T19:41:37.372173925Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:37.374515 env[1250]: time="2025-03-17T19:41:37.374455616Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:37.374838 env[1250]: time="2025-03-17T19:41:37.374814780Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Mar 17 19:41:37.399060 env[1250]: time="2025-03-17T19:41:37.399025351Z" level=info msg="CreateContainer within sandbox \"b0fa39a337560e0ba3a063bce0d5bfd9ef3286da2f35b1bb2ee8bec8c90071c1\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 19:41:37.431847 env[1250]: time="2025-03-17T19:41:37.431787046Z" level=info msg="CreateContainer within sandbox \"b0fa39a337560e0ba3a063bce0d5bfd9ef3286da2f35b1bb2ee8bec8c90071c1\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4c1ef740ef47f5ff497973bc373f8c9106da486bac8ad0d98fcdbbc883538f45\"" Mar 17 19:41:37.432978 env[1250]: time="2025-03-17T19:41:37.432799395Z" level=info msg="StartContainer for \"4c1ef740ef47f5ff497973bc373f8c9106da486bac8ad0d98fcdbbc883538f45\"" Mar 17 19:41:37.510726 env[1250]: time="2025-03-17T19:41:37.510678354Z" level=info msg="StartContainer for \"4c1ef740ef47f5ff497973bc373f8c9106da486bac8ad0d98fcdbbc883538f45\" returns successfully" Mar 17 19:41:37.586482 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 19:41:37.586590 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 17 19:41:38.428902 kubelet[2215]: I0317 19:41:38.428683 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-2jq7c" podStartSLOduration=1.874808515 podStartE2EDuration="30.428648687s" podCreationTimestamp="2025-03-17 19:41:08 +0000 UTC" firstStartedPulling="2025-03-17 19:41:08.822042501 +0000 UTC m=+20.931883702" lastFinishedPulling="2025-03-17 19:41:37.375882683 +0000 UTC m=+49.485723874" observedRunningTime="2025-03-17 19:41:38.427940409 +0000 UTC m=+50.537781720" watchObservedRunningTime="2025-03-17 19:41:38.428648687 +0000 UTC m=+50.538489928" Mar 17 19:41:38.441511 systemd[1]: run-containerd-runc-k8s.io-4c1ef740ef47f5ff497973bc373f8c9106da486bac8ad0d98fcdbbc883538f45-runc.agXylo.mount: Deactivated successfully. 
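The pod_startup_latency_tracker entry above can be reconciled from the monotonic offsets (m=+...) it logs: podStartE2EDuration is observedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that end-to-end time minus the time spent pulling the calico/node image (treating the subtraction of pull time as an inference from how the numbers line up, not a quote of kubelet source). A short worked check:

    # Reconcile the kubelet's startup figures for calico-node-2jq7c using the
    # offsets printed in the log line itself.
    first_started_pulling = 20.931883702   # firstStartedPulling, m=+ offset
    last_finished_pulling = 49.485723874   # lastFinishedPulling, m=+ offset

    # observedRunningTime 19:41:38.428648687 minus podCreationTimestamp 19:41:08,
    # expressed as seconds past 19:41:00.
    e2e = 38.428648687 - 8.0
    image_pull = last_finished_pulling - first_started_pulling
    slo = e2e - image_pull

    print(f"e2e={e2e:.9f}s pull={image_pull:.9f}s slo={slo:.9f}s")
    # e2e matches podStartE2EDuration=30.428648687s; slo matches
    # podStartSLOduration=1.874808515s (up to float rounding).

In other words, of the roughly 30.4 s between pod creation and the container being observed running, about 28.6 s was the ghcr.io/flatcar/calico/node:v3.29.1 image pull, which ends at the "returns image reference" entry logged at 19:41:37.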
Mar 17 19:41:38.909000 audit[3442]: AVC avc: denied { write } for pid=3442 comm="tee" name="fd" dev="proc" ino=26331 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 19:41:38.909000 audit[3442]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffd09f569e9 a2=241 a3=1b6 items=1 ppid=3406 pid=3442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:38.922545 kernel: audit: type=1400 audit(1742240498.909:297): avc: denied { write } for pid=3442 comm="tee" name="fd" dev="proc" ino=26331 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 19:41:38.922616 kernel: audit: type=1300 audit(1742240498.909:297): arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffd09f569e9 a2=241 a3=1b6 items=1 ppid=3406 pid=3442 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:38.909000 audit: CWD cwd="/etc/service/enabled/node-status-reporter/log" Mar 17 19:41:38.924973 kernel: audit: type=1307 audit(1742240498.909:297): cwd="/etc/service/enabled/node-status-reporter/log" Mar 17 19:41:38.909000 audit: PATH item=0 name="/dev/fd/63" inode=26753 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:41:38.933979 kernel: audit: type=1302 audit(1742240498.909:297): item=0 name="/dev/fd/63" inode=26753 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:41:38.909000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 19:41:38.924000 audit[3431]: AVC avc: denied { write } for pid=3431 comm="tee" name="fd" dev="proc" ino=26758 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 19:41:38.924000 audit[3431]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7fff1bc2b9f8 a2=241 a3=1b6 items=1 ppid=3413 pid=3431 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:38.924000 audit: CWD cwd="/etc/service/enabled/felix/log" Mar 17 19:41:38.924000 audit: PATH item=0 name="/dev/fd/63" inode=26741 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:41:38.924000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 19:41:38.958000 audit[3455]: AVC avc: denied { write } for pid=3455 comm="tee" name="fd" dev="proc" ino=26774 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 19:41:38.958000 audit[3455]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffd22f5e9e8 a2=241 a3=1b6 items=1 ppid=3409 pid=3455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:38.958000 audit: CWD cwd="/etc/service/enabled/allocate-tunnel-addrs/log" Mar 17 19:41:38.958000 audit: PATH item=0 name="/dev/fd/63" inode=26760 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:41:38.958000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 19:41:38.962000 audit[3470]: AVC avc: denied { write } for pid=3470 comm="tee" name="fd" dev="proc" ino=26361 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 19:41:38.962000 audit[3470]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7fff662a79f8 a2=241 a3=1b6 items=1 ppid=3412 pid=3470 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:38.962000 audit: CWD cwd="/etc/service/enabled/confd/log" Mar 17 19:41:38.962000 audit: PATH item=0 name="/dev/fd/63" inode=26769 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:41:38.962000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 19:41:38.974000 audit[3473]: AVC avc: denied { write } for pid=3473 comm="tee" name="fd" dev="proc" ino=26366 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 19:41:38.974000 audit[3473]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffdb8f089f8 a2=241 a3=1b6 items=1 ppid=3407 pid=3473 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:38.974000 audit: CWD cwd="/etc/service/enabled/bird6/log" Mar 17 19:41:38.974000 audit: PATH item=0 name="/dev/fd/63" inode=26771 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:41:38.974000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 19:41:38.989000 audit[3458]: AVC avc: denied { write } for pid=3458 comm="tee" name="fd" dev="proc" ino=26778 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 19:41:38.989000 audit[3458]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffeeee989f9 a2=241 a3=1b6 items=1 ppid=3422 pid=3458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:38.989000 audit: CWD cwd="/etc/service/enabled/bird/log" Mar 17 19:41:38.989000 audit: PATH item=0 name="/dev/fd/63" inode=26347 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 
17 19:41:38.989000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 19:41:39.005000 audit[3479]: AVC avc: denied { write } for pid=3479 comm="tee" name="fd" dev="proc" ino=26373 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 19:41:39.005000 audit[3479]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffe69fe29fa a2=241 a3=1b6 items=1 ppid=3417 pid=3479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.005000 audit: CWD cwd="/etc/service/enabled/cni/log" Mar 17 19:41:39.005000 audit: PATH item=0 name="/dev/fd/63" inode=26370 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 19:41:39.005000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 19:41:39.236000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.236000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.236000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.236000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.236000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.236000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.236000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.236000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.236000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.236000 audit: BPF prog-id=10 op=LOAD Mar 17 19:41:39.236000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe733a7250 a2=98 a3=3 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 
19:41:39.236000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.237000 audit: BPF prog-id=10 op=UNLOAD Mar 17 19:41:39.238000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.238000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.238000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.238000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.238000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.238000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.238000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.238000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.238000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.238000 audit: BPF prog-id=11 op=LOAD Mar 17 19:41:39.238000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe733a7030 a2=74 a3=540051 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.238000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.240000 audit: BPF prog-id=11 op=UNLOAD Mar 17 19:41:39.240000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.240000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.240000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.240000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.240000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 
comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.240000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.240000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.240000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.240000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.240000 audit: BPF prog-id=12 op=LOAD Mar 17 19:41:39.240000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe733a7060 a2=94 a3=2 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.240000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.244000 audit: BPF prog-id=12 op=UNLOAD Mar 17 19:41:39.416370 systemd[1]: run-containerd-runc-k8s.io-4c1ef740ef47f5ff497973bc373f8c9106da486bac8ad0d98fcdbbc883538f45-runc.pdXPLx.mount: Deactivated successfully. Mar 17 19:41:39.421000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.423929 kernel: kauditd_printk_skb: 70 callbacks suppressed Mar 17 19:41:39.423998 kernel: audit: type=1400 audit(1742240499.421:310): avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.421000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.437969 kernel: audit: type=1400 audit(1742240499.421:310): avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.421000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.446003 kernel: audit: type=1400 audit(1742240499.421:310): avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.421000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.452986 kernel: audit: type=1400 audit(1742240499.421:310): avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.421000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.461993 kernel: audit: type=1400 audit(1742240499.421:310): avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.421000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.421000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.473527 kernel: audit: type=1400 audit(1742240499.421:310): avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.473648 kernel: audit: type=1400 audit(1742240499.421:310): avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.487476 kernel: audit: audit_backlog=65 > audit_backlog_limit=64 Mar 17 19:41:39.487605 kernel: audit: audit_lost=1 audit_rate_limit=0 audit_backlog_limit=64 Mar 17 19:41:39.487630 kernel: audit: backlog limit exceeded Mar 17 19:41:39.421000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.421000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.421000 audit: BPF prog-id=13 op=LOAD Mar 17 19:41:39.421000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe733a6f20 a2=40 a3=1 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.421000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.427000 audit: BPF prog-id=13 op=UNLOAD Mar 17 19:41:39.427000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.427000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=0 a1=7ffe733a6ff0 a2=50 a3=7ffe733a70d0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.427000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.482000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 19:41:39.482000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe733a6f30 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.482000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.482000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.482000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe733a6f60 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.482000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.482000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.482000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe733a6e70 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.482000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.482000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.482000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe733a6f80 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.482000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.482000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.482000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe733a6f60 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.482000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.482000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.482000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe733a6f50 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 
key=(null) Mar 17 19:41:39.482000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.482000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.482000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe733a6f80 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.482000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe733a6f60 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.483000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe733a6f80 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.483000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe733a6f50 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.483000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe733a6fc0 a2=28 a3=0 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.483000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 
audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffe733a6d70 a2=50 a3=1 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.483000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.483000 audit: BPF prog-id=14 op=LOAD Mar 17 19:41:39.483000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe733a6d70 a2=94 a3=5 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.483000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.488000 audit: BPF prog-id=14 op=UNLOAD Mar 17 19:41:39.488000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.488000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffe733a6e20 a2=50 a3=1 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.488000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.488000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.488000 audit[3515]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=0 a0=16 a1=7ffe733a6f40 a2=4 a3=38 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.488000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.489000 audit[3515]: AVC avc: denied { confidentiality } for pid=3515 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 19:41:39.489000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffe733a6f90 a2=94 a3=6 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.489000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.497000 audit[3515]: AVC avc: denied { confidentiality } for pid=3515 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 19:41:39.497000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffe733a6740 a2=94 a3=83 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.497000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { perfmon } for pid=3515 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { bpf } for pid=3515 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.499000 audit[3515]: AVC avc: denied { confidentiality } for pid=3515 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 19:41:39.499000 audit[3515]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffe733a6740 a2=94 a3=83 items=0 ppid=3415 pid=3515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.499000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 19:41:39.511000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.511000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.511000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.511000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.511000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.511000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.511000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.511000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.511000 audit[3540]: AVC avc: 
denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.511000 audit: BPF prog-id=15 op=LOAD Mar 17 19:41:39.511000 audit[3540]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffccd4ae880 a2=98 a3=1999999999999999 items=0 ppid=3415 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.511000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 19:41:39.512000 audit: BPF prog-id=15 op=UNLOAD Mar 17 19:41:39.512000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.512000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.512000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.512000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.512000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.512000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.512000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.512000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.512000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.512000 audit: BPF prog-id=16 op=LOAD Mar 17 19:41:39.512000 audit[3540]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffccd4ae760 a2=74 a3=ffff items=0 ppid=3415 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.512000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 
19:41:39.513000 audit: BPF prog-id=16 op=UNLOAD Mar 17 19:41:39.513000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.513000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.513000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.513000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.513000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.513000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.513000 audit[3540]: AVC avc: denied { perfmon } for pid=3540 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.513000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.513000 audit[3540]: AVC avc: denied { bpf } for pid=3540 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.513000 audit: BPF prog-id=17 op=LOAD Mar 17 19:41:39.513000 audit[3540]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffccd4ae7a0 a2=40 a3=7ffccd4ae980 items=0 ppid=3415 pid=3540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.513000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 19:41:39.514000 audit: BPF prog-id=17 op=UNLOAD Mar 17 19:41:39.617499 systemd-networkd[1042]: vxlan.calico: Link UP Mar 17 19:41:39.617507 systemd-networkd[1042]: vxlan.calico: Gained carrier Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: 
AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit: BPF prog-id=18 op=LOAD Mar 17 19:41:39.682000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd0736c760 a2=98 a3=100 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.682000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.682000 audit: BPF prog-id=18 op=UNLOAD Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" 
capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.682000 audit: BPF prog-id=19 op=LOAD Mar 17 19:41:39.682000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd0736c570 a2=74 a3=540051 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.682000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit: BPF prog-id=19 op=UNLOAD Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit: BPF prog-id=20 op=LOAD Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd0736c5a0 a2=94 a3=2 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit: BPF prog-id=20 op=UNLOAD Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd0736c470 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd0736c4a0 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd0736c3b0 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd0736c4c0 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for 
pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd0736c4a0 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd0736c490 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd0736c4c0 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd0736c4a0 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd0736c4c0 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd0736c490 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd0736c500 a2=28 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: 
denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit: BPF prog-id=21 op=LOAD Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd0736c370 a2=40 a3=0 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.684000 audit: BPF prog-id=21 op=UNLOAD Mar 17 19:41:39.684000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.684000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=0 a1=7ffd0736c360 a2=50 a3=2800 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.684000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=0 a1=7ffd0736c360 a2=50 a3=2800 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.685000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 
19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit: BPF prog-id=22 op=LOAD Mar 17 19:41:39.685000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd0736bb80 a2=94 a3=2 items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.685000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.685000 audit: BPF prog-id=22 op=UNLOAD Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { perfmon } for pid=3569 
comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit[3569]: AVC avc: denied { bpf } for pid=3569 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.685000 audit: BPF prog-id=23 op=LOAD Mar 17 19:41:39.685000 audit[3569]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd0736bc80 a2=94 a3=2d items=0 ppid=3415 pid=3569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.685000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit: BPF prog-id=24 op=LOAD Mar 17 19:41:39.692000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc9d0bcab0 a2=98 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.692000 audit: 
PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.692000 audit: BPF prog-id=24 op=UNLOAD Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.692000 audit: BPF prog-id=25 op=LOAD Mar 17 19:41:39.692000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc9d0bc890 a2=74 a3=540051 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.692000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.693000 audit: BPF prog-id=25 op=UNLOAD Mar 17 19:41:39.693000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.693000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.693000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.693000 audit[3572]: AVC avc: denied { 
perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.693000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.693000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.693000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.693000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.693000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.693000 audit: BPF prog-id=26 op=LOAD Mar 17 19:41:39.693000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc9d0bc8c0 a2=94 a3=2 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.693000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.693000 audit: BPF prog-id=26 op=UNLOAD Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit: BPF prog-id=27 op=LOAD Mar 17 19:41:39.846000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc9d0bc780 a2=40 a3=1 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.846000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.846000 audit: BPF prog-id=27 op=UNLOAD Mar 17 19:41:39.846000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.846000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=0 a1=7ffc9d0bc850 a2=50 a3=7ffc9d0bc930 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.846000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.861000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.861000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc9d0bc790 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.861000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc9d0bc7c0 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e 
syscall=321 success=no exit=-22 a0=12 a1=7ffc9d0bc6d0 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc9d0bc7e0 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc9d0bc7c0 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc9d0bc7b0 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc9d0bc7e0 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc9d0bc7c0 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc9d0bc7e0 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc9d0bc7b0 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc9d0bc820 a2=28 a3=0 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e 
syscall=321 success=yes exit=5 a0=0 a1=7ffc9d0bc5d0 a2=50 a3=1 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit: BPF prog-id=28 op=LOAD Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc9d0bc5d0 a2=94 a3=5 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit: BPF prog-id=28 op=UNLOAD Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffc9d0bc680 a2=50 a3=1 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=16 a1=7ffc9d0bc7a0 a2=4 a3=38 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.862000 audit[3572]: AVC avc: denied { confidentiality } for pid=3572 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 19:41:39.862000 audit[3572]: SYSCALL 
arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffc9d0bc7f0 a2=94 a3=6 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { confidentiality } for pid=3572 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 19:41:39.863000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffc9d0bbfa0 a2=94 a3=83 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.863000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { bpf } for pid=3572 
comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { perfmon } for pid=3572 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.863000 audit[3572]: AVC avc: denied { confidentiality } for pid=3572 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 19:41:39.863000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffc9d0bbfa0 a2=94 a3=83 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.863000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.864000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.864000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffc9d0bd9e0 a2=10 a3=f0f1 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.864000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.864000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.864000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffc9d0bd880 a2=10 a3=3 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.864000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.864000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.864000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffc9d0bd820 a2=10 a3=3 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.864000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.864000 audit[3572]: AVC avc: denied { bpf } for pid=3572 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 19:41:39.864000 audit[3572]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffc9d0bd820 a2=10 a3=7 items=0 ppid=3415 pid=3572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.864000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 19:41:39.870000 audit: BPF prog-id=23 op=UNLOAD Mar 17 19:41:39.981000 audit[3598]: NETFILTER_CFG table=filter:97 family=2 entries=39 op=nft_register_chain pid=3598 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:39.981000 audit[3598]: SYSCALL arch=c000003e syscall=46 success=yes exit=18968 a0=3 a1=7fff6581c2b0 a2=0 a3=7fff6581c29c items=0 ppid=3415 pid=3598 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.981000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:39.985000 audit[3597]: NETFILTER_CFG table=mangle:98 family=2 entries=16 op=nft_register_chain pid=3597 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:39.985000 audit[3597]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 
a0=3 a1=7ffe90309640 a2=0 a3=0 items=0 ppid=3415 pid=3597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.985000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:39.987000 audit[3600]: NETFILTER_CFG table=nat:99 family=2 entries=15 op=nft_register_chain pid=3600 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:39.987000 audit[3600]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffef2e5f4f0 a2=0 a3=7ffef2e5f4dc items=0 ppid=3415 pid=3600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.987000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:39.998000 audit[3596]: NETFILTER_CFG table=raw:100 family=2 entries=21 op=nft_register_chain pid=3596 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:39.998000 audit[3596]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffc0065e170 a2=0 a3=7ffc0065e15c items=0 ppid=3415 pid=3596 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:39.998000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:40.033482 env[1250]: time="2025-03-17T19:41:40.031488412Z" level=info msg="StopPodSandbox for \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\"" Mar 17 19:41:40.033482 env[1250]: time="2025-03-17T19:41:40.032022323Z" level=info msg="StopPodSandbox for \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\"" Mar 17 19:41:40.033482 env[1250]: time="2025-03-17T19:41:40.032283533Z" level=info msg="StopPodSandbox for \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\"" Mar 17 19:41:40.033482 env[1250]: time="2025-03-17T19:41:40.032609967Z" level=info msg="StopPodSandbox for \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\"" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.192 [INFO][3660] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.195 [INFO][3660] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" iface="eth0" netns="/var/run/netns/cni-84c0e0f3-cf75-bf0d-1a08-e37a311b5448" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.197 [INFO][3660] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" iface="eth0" netns="/var/run/netns/cni-84c0e0f3-cf75-bf0d-1a08-e37a311b5448" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.202 [INFO][3660] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. 
Nothing to do. ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" iface="eth0" netns="/var/run/netns/cni-84c0e0f3-cf75-bf0d-1a08-e37a311b5448" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.202 [INFO][3660] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.202 [INFO][3660] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.267 [INFO][3687] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" HandleID="k8s-pod-network.2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.267 [INFO][3687] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.267 [INFO][3687] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.274 [WARNING][3687] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" HandleID="k8s-pod-network.2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.274 [INFO][3687] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" HandleID="k8s-pod-network.2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.279 [INFO][3687] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:40.287883 env[1250]: 2025-03-17 19:41:40.286 [INFO][3660] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:40.291116 systemd[1]: run-netns-cni\x2d84c0e0f3\x2dcf75\x2dbf0d\x2d1a08\x2de37a311b5448.mount: Deactivated successfully. Mar 17 19:41:40.292442 env[1250]: time="2025-03-17T19:41:40.292400281Z" level=info msg="TearDown network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\" successfully" Mar 17 19:41:40.292532 env[1250]: time="2025-03-17T19:41:40.292513393Z" level=info msg="StopPodSandbox for \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\" returns successfully" Mar 17 19:41:40.293477 env[1250]: time="2025-03-17T19:41:40.293451824Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-j947x,Uid:7d670e19-a00c-4881-a611-273cacfe43fc,Namespace:kube-system,Attempt:1,}" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.195 [INFO][3673] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.197 [INFO][3673] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" iface="eth0" netns="/var/run/netns/cni-cb989622-cef9-a9e6-f43b-8234f37f2056" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.198 [INFO][3673] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" iface="eth0" netns="/var/run/netns/cni-cb989622-cef9-a9e6-f43b-8234f37f2056" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.202 [INFO][3673] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" iface="eth0" netns="/var/run/netns/cni-cb989622-cef9-a9e6-f43b-8234f37f2056" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.202 [INFO][3673] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.202 [INFO][3673] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.295 [INFO][3689] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" HandleID="k8s-pod-network.79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.296 [INFO][3689] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.296 [INFO][3689] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.304 [WARNING][3689] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" HandleID="k8s-pod-network.79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.304 [INFO][3689] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" HandleID="k8s-pod-network.79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.306 [INFO][3689] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:40.312722 env[1250]: 2025-03-17 19:41:40.311 [INFO][3673] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:40.316224 systemd[1]: run-netns-cni\x2dcb989622\x2dcef9\x2da9e6\x2df43b\x2d8234f37f2056.mount: Deactivated successfully. 
Mar 17 19:41:40.317498 env[1250]: time="2025-03-17T19:41:40.317455988Z" level=info msg="TearDown network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\" successfully" Mar 17 19:41:40.317587 env[1250]: time="2025-03-17T19:41:40.317567557Z" level=info msg="StopPodSandbox for \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\" returns successfully" Mar 17 19:41:40.318692 env[1250]: time="2025-03-17T19:41:40.318665717Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-959455b4-2tg7d,Uid:0357a441-9a17-439b-997f-9e0244a116fd,Namespace:calico-system,Attempt:1,}" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.199 [INFO][3668] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.199 [INFO][3668] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" iface="eth0" netns="/var/run/netns/cni-44a3a745-e848-4cf1-a3b4-5807ac4c0fff" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.200 [INFO][3668] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" iface="eth0" netns="/var/run/netns/cni-44a3a745-e848-4cf1-a3b4-5807ac4c0fff" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.202 [INFO][3668] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" iface="eth0" netns="/var/run/netns/cni-44a3a745-e848-4cf1-a3b4-5807ac4c0fff" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.202 [INFO][3668] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.202 [INFO][3668] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.303 [INFO][3688] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" HandleID="k8s-pod-network.b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.304 [INFO][3688] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.306 [INFO][3688] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.321 [WARNING][3688] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" HandleID="k8s-pod-network.b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.321 [INFO][3688] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" HandleID="k8s-pod-network.b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.329 [INFO][3688] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:40.335572 env[1250]: 2025-03-17 19:41:40.334 [INFO][3668] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:40.336970 env[1250]: time="2025-03-17T19:41:40.336921486Z" level=info msg="TearDown network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\" successfully" Mar 17 19:41:40.337076 env[1250]: time="2025-03-17T19:41:40.337053123Z" level=info msg="StopPodSandbox for \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\" returns successfully" Mar 17 19:41:40.338241 env[1250]: time="2025-03-17T19:41:40.338173034Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6657945b8c-wnzqx,Uid:bb9bdd56-e015-48b4-a9cb-554b7a129746,Namespace:calico-apiserver,Attempt:1,}" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.235 [INFO][3678] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.235 [INFO][3678] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" iface="eth0" netns="/var/run/netns/cni-ec788916-aa5f-9cc9-7e4f-9dcb117bdffe" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.235 [INFO][3678] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" iface="eth0" netns="/var/run/netns/cni-ec788916-aa5f-9cc9-7e4f-9dcb117bdffe" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.235 [INFO][3678] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" iface="eth0" netns="/var/run/netns/cni-ec788916-aa5f-9cc9-7e4f-9dcb117bdffe" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.235 [INFO][3678] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.235 [INFO][3678] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.305 [INFO][3701] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" HandleID="k8s-pod-network.856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.305 [INFO][3701] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.329 [INFO][3701] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.338 [WARNING][3701] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" HandleID="k8s-pod-network.856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.338 [INFO][3701] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" HandleID="k8s-pod-network.856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.340 [INFO][3701] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:40.343457 env[1250]: 2025-03-17 19:41:40.342 [INFO][3678] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:40.344219 env[1250]: time="2025-03-17T19:41:40.344186867Z" level=info msg="TearDown network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\" successfully" Mar 17 19:41:40.344332 env[1250]: time="2025-03-17T19:41:40.344310619Z" level=info msg="StopPodSandbox for \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\" returns successfully" Mar 17 19:41:40.345504 env[1250]: time="2025-03-17T19:41:40.345433065Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-tm2v7,Uid:88261017-01e9-4199-99b8-fa205595cc28,Namespace:kube-system,Attempt:1,}" Mar 17 19:41:40.403471 systemd[1]: run-netns-cni\x2d44a3a745\x2de848\x2d4cf1\x2da3b4\x2d5807ac4c0fff.mount: Deactivated successfully. Mar 17 19:41:40.403597 systemd[1]: run-netns-cni\x2dec788916\x2daa5f\x2d9cc9\x2d7e4f\x2d9dcb117bdffe.mount: Deactivated successfully. 
Mar 17 19:41:40.571133 systemd-networkd[1042]: calicfd0fa57c56: Link UP Mar 17 19:41:40.576326 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): calicfd0fa57c56: link becomes ready Mar 17 19:41:40.575923 systemd-networkd[1042]: calicfd0fa57c56: Gained carrier Mar 17 19:41:40.618000 audit[3791]: NETFILTER_CFG table=filter:101 family=2 entries=34 op=nft_register_chain pid=3791 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.404 [INFO][3713] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0 coredns-7db6d8ff4d- kube-system 7d670e19-a00c-4881-a611-273cacfe43fc 772 0 2025-03-17 19:41:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-3510-3-7-8-c8b8528301.novalocal coredns-7db6d8ff4d-j947x eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calicfd0fa57c56 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-j947x" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.406 [INFO][3713] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-j947x" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.490 [INFO][3747] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" HandleID="k8s-pod-network.db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.508 [INFO][3747] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" HandleID="k8s-pod-network.db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000051ca0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-3510-3-7-8-c8b8528301.novalocal", "pod":"coredns-7db6d8ff4d-j947x", "timestamp":"2025-03-17 19:41:40.4900148 +0000 UTC"}, Hostname:"ci-3510-3-7-8-c8b8528301.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.508 [INFO][3747] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.508 [INFO][3747] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.508 [INFO][3747] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510-3-7-8-c8b8528301.novalocal' Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.512 [INFO][3747] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.528 [INFO][3747] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.535 [INFO][3747] ipam/ipam.go 489: Trying affinity for 192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.538 [INFO][3747] ipam/ipam.go 155: Attempting to load block cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.541 [INFO][3747] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.541 [INFO][3747] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.106.128/26 handle="k8s-pod-network.db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.543 [INFO][3747] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8 Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.548 [INFO][3747] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.106.128/26 handle="k8s-pod-network.db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.559 [INFO][3747] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.106.129/26] block=192.168.106.128/26 handle="k8s-pod-network.db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.559 [INFO][3747] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.106.129/26] handle="k8s-pod-network.db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.559 [INFO][3747] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
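The ipam/ipam.go records above trace Calico's per-node block affinity: the host already holds an affinity for 192.168.106.128/26, the block is loaded under the host-wide IPAM lock, and the first free address (192.168.106.129) is claimed and written back with a per-container handle. As a rough illustration of only the "next free address in an affine block" step — a simplified stand-in, not Calico's actual allocator, which also persists handles, reservations and affinities in the datastore:

package main

import (
	"fmt"
	"net"
)

// nextFreeIP returns the first address in block that is neither the network
// address nor already allocated, or nil if the block is exhausted.
func nextFreeIP(block *net.IPNet, allocated map[string]bool) net.IP {
	ip := make(net.IP, 4)
	copy(ip, block.IP.To4())
	for block.Contains(ip) {
		if !ip.Equal(block.IP) && !allocated[ip.String()] {
			return ip
		}
		for i := 3; i >= 0; i-- { // increment the IPv4 address by one
			ip[i]++
			if ip[i] != 0 {
				break
			}
		}
	}
	return nil
}

func main() {
	_, block, _ := net.ParseCIDR("192.168.106.128/26")
	used := map[string]bool{} // nothing assigned yet on this node
	fmt.Println(nextFreeIP(block, used)) // 192.168.106.129, as claimed in the record above
}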
Mar 17 19:41:40.619753 env[1250]: 2025-03-17 19:41:40.559 [INFO][3747] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.106.129/26] IPv6=[] ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" HandleID="k8s-pod-network.db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.620435 env[1250]: 2025-03-17 19:41:40.561 [INFO][3713] cni-plugin/k8s.go 386: Populated endpoint ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-j947x" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7d670e19-a00c-4881-a611-273cacfe43fc", ResourceVersion:"772", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-j947x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfd0fa57c56", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:40.620435 env[1250]: 2025-03-17 19:41:40.561 [INFO][3713] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.106.129/32] ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-j947x" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.620435 env[1250]: 2025-03-17 19:41:40.561 [INFO][3713] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calicfd0fa57c56 ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-j947x" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.620435 env[1250]: 2025-03-17 19:41:40.579 [INFO][3713] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Namespace="kube-system" 
Pod="coredns-7db6d8ff4d-j947x" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.620435 env[1250]: 2025-03-17 19:41:40.588 [INFO][3713] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-j947x" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7d670e19-a00c-4881-a611-273cacfe43fc", ResourceVersion:"772", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8", Pod:"coredns-7db6d8ff4d-j947x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfd0fa57c56", MAC:"ba:06:1d:8c:11:71", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:40.620435 env[1250]: 2025-03-17 19:41:40.608 [INFO][3713] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8" Namespace="kube-system" Pod="coredns-7db6d8ff4d-j947x" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:40.618000 audit[3791]: SYSCALL arch=c000003e syscall=46 success=yes exit=19148 a0=3 a1=7ffc826ac4a0 a2=0 a3=7ffc826ac48c items=0 ppid=3415 pid=3791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:40.618000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:40.655614 env[1250]: time="2025-03-17T19:41:40.655519549Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:41:40.657087 env[1250]: time="2025-03-17T19:41:40.657060179Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:41:40.657198 env[1250]: time="2025-03-17T19:41:40.657175596Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:41:40.658851 env[1250]: time="2025-03-17T19:41:40.658822575Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8 pid=3810 runtime=io.containerd.runc.v2 Mar 17 19:41:40.660136 systemd-networkd[1042]: vxlan.calico: Gained IPv6LL Mar 17 19:41:40.679034 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali28555e71a88: link becomes ready Mar 17 19:41:40.679576 systemd-networkd[1042]: cali28555e71a88: Link UP Mar 17 19:41:40.679724 systemd-networkd[1042]: cali28555e71a88: Gained carrier Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.458 [INFO][3723] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0 calico-kube-controllers-959455b4- calico-system 0357a441-9a17-439b-997f-9e0244a116fd 773 0 2025-03-17 19:41:08 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:959455b4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-3510-3-7-8-c8b8528301.novalocal calico-kube-controllers-959455b4-2tg7d eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali28555e71a88 [] []}} ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Namespace="calico-system" Pod="calico-kube-controllers-959455b4-2tg7d" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.460 [INFO][3723] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Namespace="calico-system" Pod="calico-kube-controllers-959455b4-2tg7d" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.553 [INFO][3765] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" HandleID="k8s-pod-network.5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.595 [INFO][3765] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" HandleID="k8s-pod-network.5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000318960), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-3510-3-7-8-c8b8528301.novalocal", 
"pod":"calico-kube-controllers-959455b4-2tg7d", "timestamp":"2025-03-17 19:41:40.553165455 +0000 UTC"}, Hostname:"ci-3510-3-7-8-c8b8528301.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.595 [INFO][3765] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.595 [INFO][3765] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.595 [INFO][3765] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510-3-7-8-c8b8528301.novalocal' Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.601 [INFO][3765] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.622 [INFO][3765] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.629 [INFO][3765] ipam/ipam.go 489: Trying affinity for 192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.637 [INFO][3765] ipam/ipam.go 155: Attempting to load block cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.641 [INFO][3765] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.641 [INFO][3765] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.106.128/26 handle="k8s-pod-network.5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.643 [INFO][3765] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.648 [INFO][3765] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.106.128/26 handle="k8s-pod-network.5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.658 [INFO][3765] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.106.130/26] block=192.168.106.128/26 handle="k8s-pod-network.5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.658 [INFO][3765] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.106.130/26] handle="k8s-pod-network.5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.658 [INFO][3765] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 19:41:40.707732 env[1250]: 2025-03-17 19:41:40.658 [INFO][3765] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.106.130/26] IPv6=[] ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" HandleID="k8s-pod-network.5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:40.708708 env[1250]: 2025-03-17 19:41:40.659 [INFO][3723] cni-plugin/k8s.go 386: Populated endpoint ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Namespace="calico-system" Pod="calico-kube-controllers-959455b4-2tg7d" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0", GenerateName:"calico-kube-controllers-959455b4-", Namespace:"calico-system", SelfLink:"", UID:"0357a441-9a17-439b-997f-9e0244a116fd", ResourceVersion:"773", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"959455b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"", Pod:"calico-kube-controllers-959455b4-2tg7d", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.106.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali28555e71a88", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:40.708708 env[1250]: 2025-03-17 19:41:40.659 [INFO][3723] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.106.130/32] ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Namespace="calico-system" Pod="calico-kube-controllers-959455b4-2tg7d" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:40.708708 env[1250]: 2025-03-17 19:41:40.659 [INFO][3723] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali28555e71a88 ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Namespace="calico-system" Pod="calico-kube-controllers-959455b4-2tg7d" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:40.708708 env[1250]: 2025-03-17 19:41:40.662 [INFO][3723] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Namespace="calico-system" Pod="calico-kube-controllers-959455b4-2tg7d" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 
17 19:41:40.708708 env[1250]: 2025-03-17 19:41:40.663 [INFO][3723] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Namespace="calico-system" Pod="calico-kube-controllers-959455b4-2tg7d" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0", GenerateName:"calico-kube-controllers-959455b4-", Namespace:"calico-system", SelfLink:"", UID:"0357a441-9a17-439b-997f-9e0244a116fd", ResourceVersion:"773", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"959455b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd", Pod:"calico-kube-controllers-959455b4-2tg7d", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.106.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali28555e71a88", MAC:"22:b2:2a:74:b3:68", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:40.708708 env[1250]: 2025-03-17 19:41:40.705 [INFO][3723] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd" Namespace="calico-system" Pod="calico-kube-controllers-959455b4-2tg7d" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:40.709000 audit[3833]: NETFILTER_CFG table=filter:102 family=2 entries=38 op=nft_register_chain pid=3833 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:40.709000 audit[3833]: SYSCALL arch=c000003e syscall=46 success=yes exit=20336 a0=3 a1=7fff8ea7ae10 a2=0 a3=7fff8ea7adfc items=0 ppid=3415 pid=3833 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:40.709000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:40.779919 systemd-networkd[1042]: cali501ccc5f3da: Link UP Mar 17 19:41:40.786729 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 19:41:40.786827 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali501ccc5f3da: link becomes ready Mar 17 19:41:40.789586 systemd-networkd[1042]: cali501ccc5f3da: Gained carrier Mar 17 19:41:40.803091 env[1250]: time="2025-03-17T19:41:40.803026692Z" 
level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-j947x,Uid:7d670e19-a00c-4881-a611-273cacfe43fc,Namespace:kube-system,Attempt:1,} returns sandbox id \"db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8\"" Mar 17 19:41:40.809996 env[1250]: time="2025-03-17T19:41:40.809932157Z" level=info msg="CreateContainer within sandbox \"db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 19:41:40.825039 systemd-networkd[1042]: cali1275f2547f5: Link UP Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.492 [INFO][3735] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0 coredns-7db6d8ff4d- kube-system 88261017-01e9-4199-99b8-fa205595cc28 775 0 2025-03-17 19:41:01 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-3510-3-7-8-c8b8528301.novalocal coredns-7db6d8ff4d-tm2v7 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali501ccc5f3da [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-tm2v7" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.492 [INFO][3735] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-tm2v7" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.624 [INFO][3775] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" HandleID="k8s-pod-network.b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.640 [INFO][3775] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" HandleID="k8s-pod-network.b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000310af0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-3510-3-7-8-c8b8528301.novalocal", "pod":"coredns-7db6d8ff4d-tm2v7", "timestamp":"2025-03-17 19:41:40.624543786 +0000 UTC"}, Hostname:"ci-3510-3-7-8-c8b8528301.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.640 [INFO][3775] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.658 [INFO][3775] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.658 [INFO][3775] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510-3-7-8-c8b8528301.novalocal' Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.674 [INFO][3775] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.682 [INFO][3775] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.706 [INFO][3775] ipam/ipam.go 489: Trying affinity for 192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.709 [INFO][3775] ipam/ipam.go 155: Attempting to load block cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.711 [INFO][3775] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.711 [INFO][3775] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.106.128/26 handle="k8s-pod-network.b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.726 [INFO][3775] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4 Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.735 [INFO][3775] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.106.128/26 handle="k8s-pod-network.b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.751 [INFO][3775] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.106.131/26] block=192.168.106.128/26 handle="k8s-pod-network.b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.751 [INFO][3775] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.106.131/26] handle="k8s-pod-network.b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.751 [INFO][3775] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
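The WorkloadEndpoint dumps in this section print port numbers in hex (Port:0x35, Port:0x23c1). For readability, these are just the usual CoreDNS ports in decimal; a trivial check:

package main

import "fmt"

func main() {
	// Hex port values as printed in the WorkloadEndpointPort dumps above.
	fmt.Println(0x35)   // 53   -> dns / dns-tcp
	fmt.Println(0x23c1) // 9153 -> metrics
}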
Mar 17 19:41:40.827220 env[1250]: 2025-03-17 19:41:40.751 [INFO][3775] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.106.131/26] IPv6=[] ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" HandleID="k8s-pod-network.b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.827787 env[1250]: 2025-03-17 19:41:40.757 [INFO][3735] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-tm2v7" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"88261017-01e9-4199-99b8-fa205595cc28", ResourceVersion:"775", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"", Pod:"coredns-7db6d8ff4d-tm2v7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali501ccc5f3da", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:40.827787 env[1250]: 2025-03-17 19:41:40.758 [INFO][3735] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.106.131/32] ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-tm2v7" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.827787 env[1250]: 2025-03-17 19:41:40.758 [INFO][3735] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali501ccc5f3da ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-tm2v7" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.827787 env[1250]: 2025-03-17 19:41:40.781 [INFO][3735] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Namespace="kube-system" 
Pod="coredns-7db6d8ff4d-tm2v7" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.827787 env[1250]: 2025-03-17 19:41:40.789 [INFO][3735] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-tm2v7" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"88261017-01e9-4199-99b8-fa205595cc28", ResourceVersion:"775", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4", Pod:"coredns-7db6d8ff4d-tm2v7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali501ccc5f3da", MAC:"4a:af:e2:3b:35:75", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:40.827787 env[1250]: 2025-03-17 19:41:40.805 [INFO][3735] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4" Namespace="kube-system" Pod="coredns-7db6d8ff4d-tm2v7" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:40.832095 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali1275f2547f5: link becomes ready Mar 17 19:41:40.831781 systemd-networkd[1042]: cali1275f2547f5: Gained carrier Mar 17 19:41:40.835137 env[1250]: time="2025-03-17T19:41:40.833492749Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:41:40.835137 env[1250]: time="2025-03-17T19:41:40.833542753Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:41:40.835137 env[1250]: time="2025-03-17T19:41:40.833556779Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:41:40.835137 env[1250]: time="2025-03-17T19:41:40.833744030Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd pid=3864 runtime=io.containerd.runc.v2 Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.505 [INFO][3743] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0 calico-apiserver-6657945b8c- calico-apiserver bb9bdd56-e015-48b4-a9cb-554b7a129746 774 0 2025-03-17 19:41:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6657945b8c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-3510-3-7-8-c8b8528301.novalocal calico-apiserver-6657945b8c-wnzqx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali1275f2547f5 [] []}} ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-wnzqx" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.505 [INFO][3743] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-wnzqx" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.628 [INFO][3780] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" HandleID="k8s-pod-network.c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.643 [INFO][3780] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" HandleID="k8s-pod-network.c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003127b0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-3510-3-7-8-c8b8528301.novalocal", "pod":"calico-apiserver-6657945b8c-wnzqx", "timestamp":"2025-03-17 19:41:40.628199855 +0000 UTC"}, Hostname:"ci-3510-3-7-8-c8b8528301.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.643 [INFO][3780] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.751 [INFO][3780] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.752 [INFO][3780] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510-3-7-8-c8b8528301.novalocal' Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.757 [INFO][3780] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.762 [INFO][3780] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.769 [INFO][3780] ipam/ipam.go 489: Trying affinity for 192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.771 [INFO][3780] ipam/ipam.go 155: Attempting to load block cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.775 [INFO][3780] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.775 [INFO][3780] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.106.128/26 handle="k8s-pod-network.c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.777 [INFO][3780] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887 Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.790 [INFO][3780] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.106.128/26 handle="k8s-pod-network.c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.802 [INFO][3780] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.106.132/26] block=192.168.106.128/26 handle="k8s-pod-network.c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.802 [INFO][3780] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.106.132/26] handle="k8s-pod-network.c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.802 [INFO][3780] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 19:41:40.867073 env[1250]: 2025-03-17 19:41:40.802 [INFO][3780] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.106.132/26] IPv6=[] ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" HandleID="k8s-pod-network.c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.867712 env[1250]: 2025-03-17 19:41:40.806 [INFO][3743] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-wnzqx" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0", GenerateName:"calico-apiserver-6657945b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"bb9bdd56-e015-48b4-a9cb-554b7a129746", ResourceVersion:"774", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6657945b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"", Pod:"calico-apiserver-6657945b8c-wnzqx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1275f2547f5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:40.867712 env[1250]: 2025-03-17 19:41:40.807 [INFO][3743] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.106.132/32] ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-wnzqx" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.867712 env[1250]: 2025-03-17 19:41:40.807 [INFO][3743] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1275f2547f5 ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-wnzqx" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.867712 env[1250]: 2025-03-17 19:41:40.830 [INFO][3743] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-wnzqx" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.867712 env[1250]: 2025-03-17 19:41:40.835 
[INFO][3743] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-wnzqx" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0", GenerateName:"calico-apiserver-6657945b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"bb9bdd56-e015-48b4-a9cb-554b7a129746", ResourceVersion:"774", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6657945b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887", Pod:"calico-apiserver-6657945b8c-wnzqx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1275f2547f5", MAC:"66:97:44:33:de:71", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:40.867712 env[1250]: 2025-03-17 19:41:40.860 [INFO][3743] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-wnzqx" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:40.877003 env[1250]: time="2025-03-17T19:41:40.876961880Z" level=info msg="CreateContainer within sandbox \"db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9f49b4e31b1087ae6789b24c026f1009667af69940500690301a0b530af3a3c8\"" Mar 17 19:41:40.877923 env[1250]: time="2025-03-17T19:41:40.877903276Z" level=info msg="StartContainer for \"9f49b4e31b1087ae6789b24c026f1009667af69940500690301a0b530af3a3c8\"" Mar 17 19:41:40.951324 env[1250]: time="2025-03-17T19:41:40.951245100Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:41:40.951459 env[1250]: time="2025-03-17T19:41:40.951312106Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:41:40.951459 env[1250]: time="2025-03-17T19:41:40.951326183Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:41:40.951561 env[1250]: time="2025-03-17T19:41:40.951485171Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4 pid=3926 runtime=io.containerd.runc.v2 Mar 17 19:41:40.959345 env[1250]: time="2025-03-17T19:41:40.959263324Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:41:40.959584 env[1250]: time="2025-03-17T19:41:40.959561092Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:41:40.959726 env[1250]: time="2025-03-17T19:41:40.959673994Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:41:40.960032 env[1250]: time="2025-03-17T19:41:40.960004824Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887 pid=3947 runtime=io.containerd.runc.v2 Mar 17 19:41:40.977000 audit[3952]: NETFILTER_CFG table=filter:103 family=2 entries=34 op=nft_register_chain pid=3952 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:40.977000 audit[3952]: SYSCALL arch=c000003e syscall=46 success=yes exit=18220 a0=3 a1=7ffc90ffff60 a2=0 a3=7ffc90ffff4c items=0 ppid=3415 pid=3952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:40.977000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:41.042108 env[1250]: time="2025-03-17T19:41:41.042069764Z" level=info msg="StartContainer for \"9f49b4e31b1087ae6789b24c026f1009667af69940500690301a0b530af3a3c8\" returns successfully" Mar 17 19:41:41.041000 audit[4003]: NETFILTER_CFG table=filter:104 family=2 entries=52 op=nft_register_chain pid=4003 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:41.041000 audit[4003]: SYSCALL arch=c000003e syscall=46 success=yes exit=27056 a0=3 a1=7fff0ae9a530 a2=0 a3=7fff0ae9a51c items=0 ppid=3415 pid=4003 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:41.041000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:41.060943 env[1250]: time="2025-03-17T19:41:41.060904098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-tm2v7,Uid:88261017-01e9-4199-99b8-fa205595cc28,Namespace:kube-system,Attempt:1,} returns sandbox id \"b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4\"" Mar 17 19:41:41.063779 env[1250]: time="2025-03-17T19:41:41.063751881Z" level=info msg="CreateContainer within sandbox \"b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 19:41:41.092537 env[1250]: time="2025-03-17T19:41:41.090511213Z" 
level=info msg="CreateContainer within sandbox \"b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4ae2c3a4e16d5af82c39b24ae58a40242ba935d8c7c712e2a0ac713fe2204357\"" Mar 17 19:41:41.099501 env[1250]: time="2025-03-17T19:41:41.099440976Z" level=info msg="StartContainer for \"4ae2c3a4e16d5af82c39b24ae58a40242ba935d8c7c712e2a0ac713fe2204357\"" Mar 17 19:41:41.142301 env[1250]: time="2025-03-17T19:41:41.142262441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-959455b4-2tg7d,Uid:0357a441-9a17-439b-997f-9e0244a116fd,Namespace:calico-system,Attempt:1,} returns sandbox id \"5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd\"" Mar 17 19:41:41.144694 env[1250]: time="2025-03-17T19:41:41.144667202Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Mar 17 19:41:41.188187 env[1250]: time="2025-03-17T19:41:41.188144619Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6657945b8c-wnzqx,Uid:bb9bdd56-e015-48b4-a9cb-554b7a129746,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887\"" Mar 17 19:41:41.209994 env[1250]: time="2025-03-17T19:41:41.209414413Z" level=info msg="StartContainer for \"4ae2c3a4e16d5af82c39b24ae58a40242ba935d8c7c712e2a0ac713fe2204357\" returns successfully" Mar 17 19:41:41.421367 kubelet[2215]: I0317 19:41:41.421043 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-tm2v7" podStartSLOduration=40.421024395 podStartE2EDuration="40.421024395s" podCreationTimestamp="2025-03-17 19:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 19:41:41.418640854 +0000 UTC m=+53.528482055" watchObservedRunningTime="2025-03-17 19:41:41.421024395 +0000 UTC m=+53.530865606" Mar 17 19:41:41.442172 kubelet[2215]: I0317 19:41:41.442071 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-j947x" podStartSLOduration=40.442037677 podStartE2EDuration="40.442037677s" podCreationTimestamp="2025-03-17 19:41:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 19:41:41.441232496 +0000 UTC m=+53.551073737" watchObservedRunningTime="2025-03-17 19:41:41.442037677 +0000 UTC m=+53.551878928" Mar 17 19:41:41.466000 audit[4081]: NETFILTER_CFG table=filter:105 family=2 entries=16 op=nft_register_rule pid=4081 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:41.466000 audit[4081]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7fffdcd893d0 a2=0 a3=7fffdcd893bc items=0 ppid=2387 pid=4081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:41.466000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:41.475000 audit[4081]: NETFILTER_CFG table=nat:106 family=2 entries=14 op=nft_register_rule pid=4081 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:41.475000 audit[4081]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7fffdcd893d0 
a2=0 a3=0 items=0 ppid=2387 pid=4081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:41.475000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:41.488000 audit[4083]: NETFILTER_CFG table=filter:107 family=2 entries=13 op=nft_register_rule pid=4083 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:41.488000 audit[4083]: SYSCALL arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7ffffa3ca480 a2=0 a3=7ffffa3ca46c items=0 ppid=2387 pid=4083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:41.488000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:41.499000 audit[4083]: NETFILTER_CFG table=nat:108 family=2 entries=47 op=nft_register_chain pid=4083 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:41.499000 audit[4083]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffffa3ca480 a2=0 a3=7ffffa3ca46c items=0 ppid=2387 pid=4083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:41.499000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:41.811147 systemd-networkd[1042]: cali28555e71a88: Gained IPv6LL Mar 17 19:41:42.032368 env[1250]: time="2025-03-17T19:41:42.032329992Z" level=info msg="StopPodSandbox for \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\"" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.096 [INFO][4101] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.096 [INFO][4101] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" iface="eth0" netns="/var/run/netns/cni-30b95ae2-4b5d-5053-d323-2f3d4dd61946" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.097 [INFO][4101] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" iface="eth0" netns="/var/run/netns/cni-30b95ae2-4b5d-5053-d323-2f3d4dd61946" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.097 [INFO][4101] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" iface="eth0" netns="/var/run/netns/cni-30b95ae2-4b5d-5053-d323-2f3d4dd61946" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.097 [INFO][4101] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.097 [INFO][4101] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.139 [INFO][4107] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" HandleID="k8s-pod-network.d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.139 [INFO][4107] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.140 [INFO][4107] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.147 [WARNING][4107] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" HandleID="k8s-pod-network.d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.148 [INFO][4107] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" HandleID="k8s-pod-network.d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.150 [INFO][4107] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:42.156567 env[1250]: 2025-03-17 19:41:42.151 [INFO][4101] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:42.155677 systemd[1]: run-netns-cni\x2d30b95ae2\x2d4b5d\x2d5053\x2dd323\x2d2f3d4dd61946.mount: Deactivated successfully. 
Mar 17 19:41:42.158299 env[1250]: time="2025-03-17T19:41:42.156858884Z" level=info msg="TearDown network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\" successfully" Mar 17 19:41:42.158299 env[1250]: time="2025-03-17T19:41:42.156893468Z" level=info msg="StopPodSandbox for \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\" returns successfully" Mar 17 19:41:42.158299 env[1250]: time="2025-03-17T19:41:42.157861164Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6657945b8c-w9rz5,Uid:7ae34508-787e-4967-b222-10f79da4690c,Namespace:calico-apiserver,Attempt:1,}" Mar 17 19:41:42.341577 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 19:41:42.341668 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): caliab3fd211749: link becomes ready Mar 17 19:41:42.341478 systemd-networkd[1042]: caliab3fd211749: Link UP Mar 17 19:41:42.341943 systemd-networkd[1042]: caliab3fd211749: Gained carrier Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.239 [INFO][4113] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0 calico-apiserver-6657945b8c- calico-apiserver 7ae34508-787e-4967-b222-10f79da4690c 813 0 2025-03-17 19:41:08 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6657945b8c projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-3510-3-7-8-c8b8528301.novalocal calico-apiserver-6657945b8c-w9rz5 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliab3fd211749 [] []}} ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-w9rz5" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.239 [INFO][4113] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-w9rz5" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.283 [INFO][4125] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" HandleID="k8s-pod-network.e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.295 [INFO][4125] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" HandleID="k8s-pod-network.e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e6da0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-3510-3-7-8-c8b8528301.novalocal", "pod":"calico-apiserver-6657945b8c-w9rz5", "timestamp":"2025-03-17 19:41:42.283077526 +0000 UTC"}, 
Hostname:"ci-3510-3-7-8-c8b8528301.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.295 [INFO][4125] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.295 [INFO][4125] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.295 [INFO][4125] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510-3-7-8-c8b8528301.novalocal' Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.297 [INFO][4125] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.301 [INFO][4125] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.306 [INFO][4125] ipam/ipam.go 489: Trying affinity for 192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.308 [INFO][4125] ipam/ipam.go 155: Attempting to load block cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.311 [INFO][4125] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.313 [INFO][4125] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.106.128/26 handle="k8s-pod-network.e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.315 [INFO][4125] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.320 [INFO][4125] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.106.128/26 handle="k8s-pod-network.e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.332 [INFO][4125] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.106.133/26] block=192.168.106.128/26 handle="k8s-pod-network.e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.332 [INFO][4125] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.106.133/26] handle="k8s-pod-network.e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.332 [INFO][4125] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 19:41:42.361316 env[1250]: 2025-03-17 19:41:42.332 [INFO][4125] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.106.133/26] IPv6=[] ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" HandleID="k8s-pod-network.e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.361930 env[1250]: 2025-03-17 19:41:42.334 [INFO][4113] cni-plugin/k8s.go 386: Populated endpoint ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-w9rz5" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0", GenerateName:"calico-apiserver-6657945b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"7ae34508-787e-4967-b222-10f79da4690c", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6657945b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"", Pod:"calico-apiserver-6657945b8c-w9rz5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliab3fd211749", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:42.361930 env[1250]: 2025-03-17 19:41:42.334 [INFO][4113] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.106.133/32] ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-w9rz5" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.361930 env[1250]: 2025-03-17 19:41:42.334 [INFO][4113] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliab3fd211749 ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-w9rz5" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.361930 env[1250]: 2025-03-17 19:41:42.341 [INFO][4113] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-w9rz5" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.361930 env[1250]: 2025-03-17 19:41:42.342 
[INFO][4113] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-w9rz5" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0", GenerateName:"calico-apiserver-6657945b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"7ae34508-787e-4967-b222-10f79da4690c", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6657945b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d", Pod:"calico-apiserver-6657945b8c-w9rz5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliab3fd211749", MAC:"a2:ff:8a:37:8e:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:42.361930 env[1250]: 2025-03-17 19:41:42.355 [INFO][4113] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d" Namespace="calico-apiserver" Pod="calico-apiserver-6657945b8c-w9rz5" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:42.381000 audit[4143]: NETFILTER_CFG table=filter:109 family=2 entries=46 op=nft_register_chain pid=4143 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:42.381000 audit[4143]: SYSCALL arch=c000003e syscall=46 success=yes exit=23892 a0=3 a1=7ffc3a845730 a2=0 a3=7ffc3a84571c items=0 ppid=3415 pid=4143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:42.381000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:42.387152 systemd-networkd[1042]: calicfd0fa57c56: Gained IPv6LL Mar 17 19:41:42.390420 env[1250]: time="2025-03-17T19:41:42.390366005Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:41:42.390581 env[1250]: time="2025-03-17T19:41:42.390558065Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:41:42.390667 env[1250]: time="2025-03-17T19:41:42.390646080Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:41:42.390853 env[1250]: time="2025-03-17T19:41:42.390827751Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d pid=4153 runtime=io.containerd.runc.v2 Mar 17 19:41:42.444476 systemd[1]: run-containerd-runc-k8s.io-e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d-runc.0I5O9r.mount: Deactivated successfully. Mar 17 19:41:42.452395 systemd-networkd[1042]: cali1275f2547f5: Gained IPv6LL Mar 17 19:41:42.531361 env[1250]: time="2025-03-17T19:41:42.531322507Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6657945b8c-w9rz5,Uid:7ae34508-787e-4967-b222-10f79da4690c,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d\"" Mar 17 19:41:42.771236 systemd-networkd[1042]: cali501ccc5f3da: Gained IPv6LL Mar 17 19:41:43.038042 env[1250]: time="2025-03-17T19:41:43.037819310Z" level=info msg="StopPodSandbox for \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\"" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.157 [INFO][4203] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.157 [INFO][4203] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" iface="eth0" netns="/var/run/netns/cni-200180de-de52-f3ec-479e-549fc177f3b3" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.160 [INFO][4203] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" iface="eth0" netns="/var/run/netns/cni-200180de-de52-f3ec-479e-549fc177f3b3" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.160 [INFO][4203] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" iface="eth0" netns="/var/run/netns/cni-200180de-de52-f3ec-479e-549fc177f3b3" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.160 [INFO][4203] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.160 [INFO][4203] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.188 [INFO][4209] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" HandleID="k8s-pod-network.85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.188 [INFO][4209] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.188 [INFO][4209] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.196 [WARNING][4209] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" HandleID="k8s-pod-network.85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.196 [INFO][4209] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" HandleID="k8s-pod-network.85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.198 [INFO][4209] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:43.200923 env[1250]: 2025-03-17 19:41:43.199 [INFO][4203] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:43.204045 systemd[1]: run-netns-cni\x2d200180de\x2dde52\x2df3ec\x2d479e\x2d549fc177f3b3.mount: Deactivated successfully. Mar 17 19:41:43.206545 env[1250]: time="2025-03-17T19:41:43.206506012Z" level=info msg="TearDown network for sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\" successfully" Mar 17 19:41:43.206608 env[1250]: time="2025-03-17T19:41:43.206543252Z" level=info msg="StopPodSandbox for \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\" returns successfully" Mar 17 19:41:43.207337 env[1250]: time="2025-03-17T19:41:43.207306003Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ptzbm,Uid:b69f8cef-aedd-412b-a964-af34b5f74525,Namespace:calico-system,Attempt:1,}" Mar 17 19:41:43.346406 systemd-networkd[1042]: cali0efd90bee2a: Link UP Mar 17 19:41:43.350783 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 19:41:43.350840 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali0efd90bee2a: link becomes ready Mar 17 19:41:43.350620 systemd-networkd[1042]: cali0efd90bee2a: Gained carrier Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.266 [INFO][4215] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0 csi-node-driver- calico-system b69f8cef-aedd-412b-a964-af34b5f74525 821 0 2025-03-17 19:41:08 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-3510-3-7-8-c8b8528301.novalocal csi-node-driver-ptzbm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali0efd90bee2a [] []}} ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Namespace="calico-system" Pod="csi-node-driver-ptzbm" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.266 [INFO][4215] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Namespace="calico-system" Pod="csi-node-driver-ptzbm" 
WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.298 [INFO][4227] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" HandleID="k8s-pod-network.f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.310 [INFO][4227] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" HandleID="k8s-pod-network.f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002e0810), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-3510-3-7-8-c8b8528301.novalocal", "pod":"csi-node-driver-ptzbm", "timestamp":"2025-03-17 19:41:43.298595837 +0000 UTC"}, Hostname:"ci-3510-3-7-8-c8b8528301.novalocal", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.310 [INFO][4227] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.310 [INFO][4227] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.310 [INFO][4227] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510-3-7-8-c8b8528301.novalocal' Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.312 [INFO][4227] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.316 [INFO][4227] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.322 [INFO][4227] ipam/ipam.go 489: Trying affinity for 192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.324 [INFO][4227] ipam/ipam.go 155: Attempting to load block cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.327 [INFO][4227] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.106.128/26 host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.327 [INFO][4227] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.106.128/26 handle="k8s-pod-network.f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.329 [INFO][4227] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4 Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.333 [INFO][4227] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.106.128/26 handle="k8s-pod-network.f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 
17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.343 [INFO][4227] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.106.134/26] block=192.168.106.128/26 handle="k8s-pod-network.f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.343 [INFO][4227] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.106.134/26] handle="k8s-pod-network.f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" host="ci-3510-3-7-8-c8b8528301.novalocal" Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.343 [INFO][4227] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:43.380041 env[1250]: 2025-03-17 19:41:43.343 [INFO][4227] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.106.134/26] IPv6=[] ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" HandleID="k8s-pod-network.f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.380680 env[1250]: 2025-03-17 19:41:43.344 [INFO][4215] cni-plugin/k8s.go 386: Populated endpoint ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Namespace="calico-system" Pod="csi-node-driver-ptzbm" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b69f8cef-aedd-412b-a964-af34b5f74525", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"", Pod:"csi-node-driver-ptzbm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.106.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0efd90bee2a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:43.380680 env[1250]: 2025-03-17 19:41:43.344 [INFO][4215] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.106.134/32] ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Namespace="calico-system" Pod="csi-node-driver-ptzbm" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.380680 env[1250]: 2025-03-17 19:41:43.344 [INFO][4215] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0efd90bee2a ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" 
Namespace="calico-system" Pod="csi-node-driver-ptzbm" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.380680 env[1250]: 2025-03-17 19:41:43.351 [INFO][4215] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Namespace="calico-system" Pod="csi-node-driver-ptzbm" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.380680 env[1250]: 2025-03-17 19:41:43.358 [INFO][4215] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Namespace="calico-system" Pod="csi-node-driver-ptzbm" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b69f8cef-aedd-412b-a964-af34b5f74525", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4", Pod:"csi-node-driver-ptzbm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.106.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0efd90bee2a", MAC:"46:d6:9d:3c:f5:e4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:43.380680 env[1250]: 2025-03-17 19:41:43.377 [INFO][4215] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4" Namespace="calico-system" Pod="csi-node-driver-ptzbm" WorkloadEndpoint="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:43.400000 audit[4241]: NETFILTER_CFG table=filter:110 family=2 entries=50 op=nft_register_chain pid=4241 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 19:41:43.400000 audit[4241]: SYSCALL arch=c000003e syscall=46 success=yes exit=23392 a0=3 a1=7ffd6f21f410 a2=0 a3=7ffd6f21f3fc items=0 ppid=3415 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:43.400000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 19:41:43.442196 env[1250]: time="2025-03-17T19:41:43.442106989Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 19:41:43.442196 env[1250]: time="2025-03-17T19:41:43.442162102Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 19:41:43.442399 env[1250]: time="2025-03-17T19:41:43.442177291Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 19:41:43.442668 env[1250]: time="2025-03-17T19:41:43.442625001Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4 pid=4254 runtime=io.containerd.runc.v2 Mar 17 19:41:43.566028 env[1250]: time="2025-03-17T19:41:43.565984378Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-ptzbm,Uid:b69f8cef-aedd-412b-a964-af34b5f74525,Namespace:calico-system,Attempt:1,} returns sandbox id \"f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4\"" Mar 17 19:41:43.987326 systemd-networkd[1042]: caliab3fd211749: Gained IPv6LL Mar 17 19:41:45.246217 env[1250]: time="2025-03-17T19:41:45.246177773Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:45.250176 env[1250]: time="2025-03-17T19:41:45.250154594Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:45.252721 env[1250]: time="2025-03-17T19:41:45.252699317Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:45.255419 env[1250]: time="2025-03-17T19:41:45.255396438Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:45.256162 env[1250]: time="2025-03-17T19:41:45.256138980Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Mar 17 19:41:45.261009 env[1250]: time="2025-03-17T19:41:45.260978801Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Mar 17 19:41:45.272084 systemd-networkd[1042]: cali0efd90bee2a: Gained IPv6LL Mar 17 19:41:45.291558 env[1250]: time="2025-03-17T19:41:45.291519786Z" level=info msg="CreateContainer within sandbox \"5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 17 19:41:45.320432 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2791194239.mount: Deactivated successfully. 
Mar 17 19:41:45.324546 env[1250]: time="2025-03-17T19:41:45.324465413Z" level=info msg="CreateContainer within sandbox \"5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"580214302fa232511f8f62ada0ed878cc5412f7f8e2bd23794bee0ecc2d066fa\"" Mar 17 19:41:45.328212 env[1250]: time="2025-03-17T19:41:45.328182486Z" level=info msg="StartContainer for \"580214302fa232511f8f62ada0ed878cc5412f7f8e2bd23794bee0ecc2d066fa\"" Mar 17 19:41:45.422972 env[1250]: time="2025-03-17T19:41:45.420660797Z" level=info msg="StartContainer for \"580214302fa232511f8f62ada0ed878cc5412f7f8e2bd23794bee0ecc2d066fa\" returns successfully" Mar 17 19:41:45.459722 kubelet[2215]: I0317 19:41:45.459629 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-959455b4-2tg7d" podStartSLOduration=33.343342431 podStartE2EDuration="37.459610587s" podCreationTimestamp="2025-03-17 19:41:08 +0000 UTC" firstStartedPulling="2025-03-17 19:41:41.144134573 +0000 UTC m=+53.253975764" lastFinishedPulling="2025-03-17 19:41:45.260402709 +0000 UTC m=+57.370243920" observedRunningTime="2025-03-17 19:41:45.457421521 +0000 UTC m=+57.567262722" watchObservedRunningTime="2025-03-17 19:41:45.459610587 +0000 UTC m=+57.569451788" Mar 17 19:41:48.055470 env[1250]: time="2025-03-17T19:41:48.055407538Z" level=info msg="StopPodSandbox for \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\"" Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.130 [WARNING][4368] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7d670e19-a00c-4881-a611-273cacfe43fc", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8", Pod:"coredns-7db6d8ff4d-j947x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfd0fa57c56", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, 
Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.130 [INFO][4368] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.130 [INFO][4368] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" iface="eth0" netns="" Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.131 [INFO][4368] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.131 [INFO][4368] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.158 [INFO][4373] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" HandleID="k8s-pod-network.2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.159 [INFO][4373] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.159 [INFO][4373] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.170 [WARNING][4373] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" HandleID="k8s-pod-network.2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.170 [INFO][4373] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" HandleID="k8s-pod-network.2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.172 [INFO][4373] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:48.180813 env[1250]: 2025-03-17 19:41:48.177 [INFO][4368] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:48.181515 env[1250]: time="2025-03-17T19:41:48.181473867Z" level=info msg="TearDown network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\" successfully" Mar 17 19:41:48.181599 env[1250]: time="2025-03-17T19:41:48.181581179Z" level=info msg="StopPodSandbox for \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\" returns successfully" Mar 17 19:41:48.226665 env[1250]: time="2025-03-17T19:41:48.226633738Z" level=info msg="RemovePodSandbox for \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\"" Mar 17 19:41:48.226847 env[1250]: time="2025-03-17T19:41:48.226810048Z" level=info msg="Forcibly stopping sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\"" Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.302 [WARNING][4392] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7d670e19-a00c-4881-a611-273cacfe43fc", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"db1a6ce1ddfee7a055787c692504b16084269b1276ba107825f79f070c2693c8", Pod:"coredns-7db6d8ff4d-j947x", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calicfd0fa57c56", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.302 [INFO][4392] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.302 [INFO][4392] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" iface="eth0" netns="" Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.302 [INFO][4392] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.302 [INFO][4392] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.327 [INFO][4398] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" HandleID="k8s-pod-network.2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.328 [INFO][4398] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.328 [INFO][4398] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.335 [WARNING][4398] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" HandleID="k8s-pod-network.2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.335 [INFO][4398] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" HandleID="k8s-pod-network.2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--j947x-eth0" Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.337 [INFO][4398] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:48.339615 env[1250]: 2025-03-17 19:41:48.338 [INFO][4392] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f" Mar 17 19:41:48.340450 env[1250]: time="2025-03-17T19:41:48.340419489Z" level=info msg="TearDown network for sandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\" successfully" Mar 17 19:41:48.345017 env[1250]: time="2025-03-17T19:41:48.344989792Z" level=info msg="RemovePodSandbox \"2597ee24c45473c8be7c63abfec5a89ae2f0a0dd77f150210d2aca223feb393f\" returns successfully" Mar 17 19:41:48.345565 env[1250]: time="2025-03-17T19:41:48.345544955Z" level=info msg="StopPodSandbox for \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\"" Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.431 [WARNING][4418] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b69f8cef-aedd-412b-a964-af34b5f74525", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4", Pod:"csi-node-driver-ptzbm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.106.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0efd90bee2a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.431 [INFO][4418] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.431 [INFO][4418] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" iface="eth0" netns="" Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.431 [INFO][4418] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.431 [INFO][4418] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.470 [INFO][4426] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" HandleID="k8s-pod-network.85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.470 [INFO][4426] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.470 [INFO][4426] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.480 [WARNING][4426] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" HandleID="k8s-pod-network.85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.480 [INFO][4426] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" HandleID="k8s-pod-network.85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.482 [INFO][4426] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:48.484522 env[1250]: 2025-03-17 19:41:48.483 [INFO][4418] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:48.485054 env[1250]: time="2025-03-17T19:41:48.484549826Z" level=info msg="TearDown network for sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\" successfully" Mar 17 19:41:48.485054 env[1250]: time="2025-03-17T19:41:48.484578510Z" level=info msg="StopPodSandbox for \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\" returns successfully" Mar 17 19:41:48.485118 env[1250]: time="2025-03-17T19:41:48.485058381Z" level=info msg="RemovePodSandbox for \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\"" Mar 17 19:41:48.485118 env[1250]: time="2025-03-17T19:41:48.485088467Z" level=info msg="Forcibly stopping sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\"" Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.549 [WARNING][4448] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"b69f8cef-aedd-412b-a964-af34b5f74525", ResourceVersion:"824", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4", Pod:"csi-node-driver-ptzbm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.106.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali0efd90bee2a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.549 [INFO][4448] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.549 [INFO][4448] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" iface="eth0" netns="" Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.549 [INFO][4448] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.549 [INFO][4448] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.581 [INFO][4454] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" HandleID="k8s-pod-network.85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.581 [INFO][4454] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.581 [INFO][4454] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.588 [WARNING][4454] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" HandleID="k8s-pod-network.85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.588 [INFO][4454] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" HandleID="k8s-pod-network.85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-csi--node--driver--ptzbm-eth0" Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.590 [INFO][4454] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:48.592759 env[1250]: 2025-03-17 19:41:48.591 [INFO][4448] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4" Mar 17 19:41:48.592759 env[1250]: time="2025-03-17T19:41:48.592702430Z" level=info msg="TearDown network for sandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\" successfully" Mar 17 19:41:48.598471 env[1250]: time="2025-03-17T19:41:48.598440034Z" level=info msg="RemovePodSandbox \"85ec283af42056fbb4e4083eb28eba43047554e46a69ee95074f68fed64de2c4\" returns successfully" Mar 17 19:41:48.598828 env[1250]: time="2025-03-17T19:41:48.598803846Z" level=info msg="StopPodSandbox for \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\"" Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.669 [WARNING][4473] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0", GenerateName:"calico-apiserver-6657945b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"bb9bdd56-e015-48b4-a9cb-554b7a129746", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6657945b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887", Pod:"calico-apiserver-6657945b8c-wnzqx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1275f2547f5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.670 [INFO][4473] cni-plugin/k8s.go 608: Cleaning up netns 
ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.670 [INFO][4473] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" iface="eth0" netns="" Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.670 [INFO][4473] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.670 [INFO][4473] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.702 [INFO][4479] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" HandleID="k8s-pod-network.b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.702 [INFO][4479] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.702 [INFO][4479] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.709 [WARNING][4479] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" HandleID="k8s-pod-network.b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.709 [INFO][4479] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" HandleID="k8s-pod-network.b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.710 [INFO][4479] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:48.712749 env[1250]: 2025-03-17 19:41:48.711 [INFO][4473] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:48.713278 env[1250]: time="2025-03-17T19:41:48.712774946Z" level=info msg="TearDown network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\" successfully" Mar 17 19:41:48.713278 env[1250]: time="2025-03-17T19:41:48.712805073Z" level=info msg="StopPodSandbox for \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\" returns successfully" Mar 17 19:41:48.713278 env[1250]: time="2025-03-17T19:41:48.713230591Z" level=info msg="RemovePodSandbox for \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\"" Mar 17 19:41:48.713368 env[1250]: time="2025-03-17T19:41:48.713258192Z" level=info msg="Forcibly stopping sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\"" Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.793 [WARNING][4500] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0", GenerateName:"calico-apiserver-6657945b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"bb9bdd56-e015-48b4-a9cb-554b7a129746", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6657945b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887", Pod:"calico-apiserver-6657945b8c-wnzqx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali1275f2547f5", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.793 [INFO][4500] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.793 [INFO][4500] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" iface="eth0" netns="" Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.793 [INFO][4500] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.793 [INFO][4500] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.820 [INFO][4506] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" HandleID="k8s-pod-network.b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.820 [INFO][4506] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.820 [INFO][4506] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.827 [WARNING][4506] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" HandleID="k8s-pod-network.b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.827 [INFO][4506] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" HandleID="k8s-pod-network.b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--wnzqx-eth0" Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.829 [INFO][4506] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:48.831645 env[1250]: 2025-03-17 19:41:48.830 [INFO][4500] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70" Mar 17 19:41:48.832205 env[1250]: time="2025-03-17T19:41:48.832170551Z" level=info msg="TearDown network for sandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\" successfully" Mar 17 19:41:48.837067 env[1250]: time="2025-03-17T19:41:48.837043593Z" level=info msg="RemovePodSandbox \"b8762995594ac4997b2ad9993e4499788a17037ced5a1280d74dba3676cfbc70\" returns successfully" Mar 17 19:41:48.837617 env[1250]: time="2025-03-17T19:41:48.837595197Z" level=info msg="StopPodSandbox for \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\"" Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.907 [WARNING][4524] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0", GenerateName:"calico-apiserver-6657945b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"7ae34508-787e-4967-b222-10f79da4690c", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6657945b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d", Pod:"calico-apiserver-6657945b8c-w9rz5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliab3fd211749", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.907 [INFO][4524] cni-plugin/k8s.go 608: Cleaning up netns 
ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.907 [INFO][4524] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" iface="eth0" netns="" Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.907 [INFO][4524] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.907 [INFO][4524] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.942 [INFO][4530] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" HandleID="k8s-pod-network.d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.942 [INFO][4530] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.942 [INFO][4530] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.949 [WARNING][4530] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" HandleID="k8s-pod-network.d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.949 [INFO][4530] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" HandleID="k8s-pod-network.d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.951 [INFO][4530] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:48.953219 env[1250]: 2025-03-17 19:41:48.952 [INFO][4524] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:48.954021 env[1250]: time="2025-03-17T19:41:48.953989553Z" level=info msg="TearDown network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\" successfully" Mar 17 19:41:48.954187 env[1250]: time="2025-03-17T19:41:48.954119416Z" level=info msg="StopPodSandbox for \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\" returns successfully" Mar 17 19:41:48.955365 env[1250]: time="2025-03-17T19:41:48.955344244Z" level=info msg="RemovePodSandbox for \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\"" Mar 17 19:41:48.955528 env[1250]: time="2025-03-17T19:41:48.955487633Z" level=info msg="Forcibly stopping sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\"" Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.025 [WARNING][4550] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0", GenerateName:"calico-apiserver-6657945b8c-", Namespace:"calico-apiserver", SelfLink:"", UID:"7ae34508-787e-4967-b222-10f79da4690c", ResourceVersion:"817", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6657945b8c", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d", Pod:"calico-apiserver-6657945b8c-w9rz5", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.106.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliab3fd211749", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.025 [INFO][4550] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.025 [INFO][4550] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" iface="eth0" netns="" Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.025 [INFO][4550] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.025 [INFO][4550] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.070 [INFO][4556] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" HandleID="k8s-pod-network.d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.070 [INFO][4556] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.071 [INFO][4556] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.077 [WARNING][4556] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" HandleID="k8s-pod-network.d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.077 [INFO][4556] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" HandleID="k8s-pod-network.d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--apiserver--6657945b8c--w9rz5-eth0" Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.079 [INFO][4556] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:49.081108 env[1250]: 2025-03-17 19:41:49.080 [INFO][4550] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41" Mar 17 19:41:49.081882 env[1250]: time="2025-03-17T19:41:49.081850569Z" level=info msg="TearDown network for sandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\" successfully" Mar 17 19:41:49.088218 env[1250]: time="2025-03-17T19:41:49.088190482Z" level=info msg="RemovePodSandbox \"d7189b78d602046d657e66e874d72dc8d0ca0c30c0d240d2609b4bf4b61a1e41\" returns successfully" Mar 17 19:41:49.088749 env[1250]: time="2025-03-17T19:41:49.088729634Z" level=info msg="StopPodSandbox for \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\"" Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.164 [WARNING][4576] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"88261017-01e9-4199-99b8-fa205595cc28", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4", Pod:"coredns-7db6d8ff4d-tm2v7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali501ccc5f3da", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", 
Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.164 [INFO][4576] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.164 [INFO][4576] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" iface="eth0" netns="" Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.164 [INFO][4576] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.164 [INFO][4576] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.209 [INFO][4583] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" HandleID="k8s-pod-network.856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.210 [INFO][4583] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.210 [INFO][4583] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.216 [WARNING][4583] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" HandleID="k8s-pod-network.856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.216 [INFO][4583] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" HandleID="k8s-pod-network.856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.218 [INFO][4583] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:49.220674 env[1250]: 2025-03-17 19:41:49.219 [INFO][4576] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:49.221252 env[1250]: time="2025-03-17T19:41:49.220691843Z" level=info msg="TearDown network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\" successfully" Mar 17 19:41:49.221252 env[1250]: time="2025-03-17T19:41:49.220721759Z" level=info msg="StopPodSandbox for \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\" returns successfully" Mar 17 19:41:49.221702 env[1250]: time="2025-03-17T19:41:49.221669147Z" level=info msg="RemovePodSandbox for \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\"" Mar 17 19:41:49.221757 env[1250]: time="2025-03-17T19:41:49.221703351Z" level=info msg="Forcibly stopping sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\"" Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.291 [WARNING][4602] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"88261017-01e9-4199-99b8-fa205595cc28", ResourceVersion:"804", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"b78ca69f38c29986f34b4209e38e62689f608ec39ee3a439c08d3b49632ecbd4", Pod:"coredns-7db6d8ff4d-tm2v7", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.106.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali501ccc5f3da", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.291 [INFO][4602] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.291 [INFO][4602] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" iface="eth0" netns="" Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.291 [INFO][4602] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.291 [INFO][4602] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.323 [INFO][4608] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" HandleID="k8s-pod-network.856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.323 [INFO][4608] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.323 [INFO][4608] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.330 [WARNING][4608] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" HandleID="k8s-pod-network.856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.330 [INFO][4608] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" HandleID="k8s-pod-network.856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-coredns--7db6d8ff4d--tm2v7-eth0" Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.333 [INFO][4608] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:49.335171 env[1250]: 2025-03-17 19:41:49.334 [INFO][4602] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0" Mar 17 19:41:49.335740 env[1250]: time="2025-03-17T19:41:49.335708091Z" level=info msg="TearDown network for sandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\" successfully" Mar 17 19:41:49.340216 env[1250]: time="2025-03-17T19:41:49.340189068Z" level=info msg="RemovePodSandbox \"856cc988cb4acb6cfa1466c59b059fe6c8de93595faeb876b556ac0a255801c0\" returns successfully" Mar 17 19:41:49.340702 env[1250]: time="2025-03-17T19:41:49.340681813Z" level=info msg="StopPodSandbox for \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\"" Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.398 [WARNING][4627] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0", GenerateName:"calico-kube-controllers-959455b4-", Namespace:"calico-system", SelfLink:"", UID:"0357a441-9a17-439b-997f-9e0244a116fd", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"959455b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd", Pod:"calico-kube-controllers-959455b4-2tg7d", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.106.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali28555e71a88", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.398 [INFO][4627] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.398 [INFO][4627] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" iface="eth0" netns="" Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.398 [INFO][4627] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.399 [INFO][4627] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.423 [INFO][4633] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" HandleID="k8s-pod-network.79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.423 [INFO][4633] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.423 [INFO][4633] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.430 [WARNING][4633] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" HandleID="k8s-pod-network.79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.430 [INFO][4633] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" HandleID="k8s-pod-network.79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.432 [INFO][4633] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:49.434234 env[1250]: 2025-03-17 19:41:49.433 [INFO][4627] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:49.434776 env[1250]: time="2025-03-17T19:41:49.434260514Z" level=info msg="TearDown network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\" successfully" Mar 17 19:41:49.434776 env[1250]: time="2025-03-17T19:41:49.434289208Z" level=info msg="StopPodSandbox for \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\" returns successfully" Mar 17 19:41:49.435214 env[1250]: time="2025-03-17T19:41:49.435190560Z" level=info msg="RemovePodSandbox for \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\"" Mar 17 19:41:49.435273 env[1250]: time="2025-03-17T19:41:49.435221848Z" level=info msg="Forcibly stopping sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\"" Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.505 [WARNING][4653] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0", GenerateName:"calico-kube-controllers-959455b4-", Namespace:"calico-system", SelfLink:"", UID:"0357a441-9a17-439b-997f-9e0244a116fd", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 19, 41, 8, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"959455b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510-3-7-8-c8b8528301.novalocal", ContainerID:"5090dcce1d53eb8e368d7d7c169a60dda6d0bea01eb1bca161c03912c52a63fd", Pod:"calico-kube-controllers-959455b4-2tg7d", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.106.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali28555e71a88", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.506 [INFO][4653] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.506 [INFO][4653] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" iface="eth0" netns="" Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.506 [INFO][4653] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.506 [INFO][4653] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.537 [INFO][4659] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" HandleID="k8s-pod-network.79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.537 [INFO][4659] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.537 [INFO][4659] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.544 [WARNING][4659] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" HandleID="k8s-pod-network.79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.544 [INFO][4659] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" HandleID="k8s-pod-network.79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Workload="ci--3510--3--7--8--c8b8528301.novalocal-k8s-calico--kube--controllers--959455b4--2tg7d-eth0" Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.546 [INFO][4659] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 19:41:49.550079 env[1250]: 2025-03-17 19:41:49.548 [INFO][4653] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1" Mar 17 19:41:49.551966 env[1250]: time="2025-03-17T19:41:49.551910695Z" level=info msg="TearDown network for sandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\" successfully" Mar 17 19:41:49.556292 env[1250]: time="2025-03-17T19:41:49.556267869Z" level=info msg="RemovePodSandbox \"79b932f9ecee2f0d949c5b5f6fdc968a36cef036f58057e43d5253ae0df7c5e1\" returns successfully" Mar 17 19:41:49.815765 env[1250]: time="2025-03-17T19:41:49.815584834Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:49.818604 env[1250]: time="2025-03-17T19:41:49.818552460Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:49.820698 env[1250]: time="2025-03-17T19:41:49.820647070Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:49.824139 env[1250]: time="2025-03-17T19:41:49.824091231Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:49.825679 env[1250]: time="2025-03-17T19:41:49.825629287Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Mar 17 19:41:49.828733 env[1250]: time="2025-03-17T19:41:49.828681773Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Mar 17 19:41:49.830215 env[1250]: time="2025-03-17T19:41:49.830176618Z" level=info msg="CreateContainer within sandbox \"c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 19:41:49.848710 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1637884574.mount: Deactivated successfully. 
Mar 17 19:41:49.865193 env[1250]: time="2025-03-17T19:41:49.857370352Z" level=info msg="CreateContainer within sandbox \"c5d24e0ac427026365456fd7825e087b64ae28f6f2fc4070338eb44ba5489887\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"2d3affb2086066b7addf6334edc181d4f4b7bf71a597946b4a504e120f8eaf6f\"" Mar 17 19:41:49.865193 env[1250]: time="2025-03-17T19:41:49.858183859Z" level=info msg="StartContainer for \"2d3affb2086066b7addf6334edc181d4f4b7bf71a597946b4a504e120f8eaf6f\"" Mar 17 19:41:49.893450 systemd[1]: run-containerd-runc-k8s.io-2d3affb2086066b7addf6334edc181d4f4b7bf71a597946b4a504e120f8eaf6f-runc.LpdR4f.mount: Deactivated successfully. Mar 17 19:41:49.946196 env[1250]: time="2025-03-17T19:41:49.946130416Z" level=info msg="StartContainer for \"2d3affb2086066b7addf6334edc181d4f4b7bf71a597946b4a504e120f8eaf6f\" returns successfully" Mar 17 19:41:50.376930 env[1250]: time="2025-03-17T19:41:50.376810071Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:50.381206 env[1250]: time="2025-03-17T19:41:50.381158509Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:50.384923 env[1250]: time="2025-03-17T19:41:50.384882685Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:50.388423 env[1250]: time="2025-03-17T19:41:50.388380127Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:50.390219 env[1250]: time="2025-03-17T19:41:50.390152712Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Mar 17 19:41:50.399050 env[1250]: time="2025-03-17T19:41:50.398900132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Mar 17 19:41:50.401661 env[1250]: time="2025-03-17T19:41:50.401603583Z" level=info msg="CreateContainer within sandbox \"e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 19:41:50.429202 env[1250]: time="2025-03-17T19:41:50.429120335Z" level=info msg="CreateContainer within sandbox \"e9510337c65e3fdb7b1954cd87bc14738805a4e0ee17313c627840ba635d601d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"e2cccb00a4b1abd4bb571046c1cbe696b70b1cabfd2402dc08e1ff3f954d1e1f\"" Mar 17 19:41:50.434024 env[1250]: time="2025-03-17T19:41:50.433906593Z" level=info msg="StartContainer for \"e2cccb00a4b1abd4bb571046c1cbe696b70b1cabfd2402dc08e1ff3f954d1e1f\"" Mar 17 19:41:50.549000 audit[4739]: NETFILTER_CFG table=filter:111 family=2 entries=10 op=nft_register_rule pid=4739 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:50.551584 kernel: kauditd_printk_skb: 463 callbacks suppressed Mar 17 19:41:50.551638 kernel: audit: type=1325 audit(1742240510.549:409): table=filter:111 family=2 entries=10 op=nft_register_rule pid=4739 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Mar 17 19:41:50.549000 audit[4739]: SYSCALL arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7ffc0d1bebb0 a2=0 a3=7ffc0d1beb9c items=0 ppid=2387 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:50.561782 kernel: audit: type=1300 audit(1742240510.549:409): arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7ffc0d1bebb0 a2=0 a3=7ffc0d1beb9c items=0 ppid=2387 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:50.549000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:50.566968 kernel: audit: type=1327 audit(1742240510.549:409): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:50.571674 kernel: audit: type=1325 audit(1742240510.566:410): table=nat:112 family=2 entries=20 op=nft_register_rule pid=4739 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:50.571733 kernel: audit: type=1300 audit(1742240510.566:410): arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffc0d1bebb0 a2=0 a3=7ffc0d1beb9c items=0 ppid=2387 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:50.566000 audit[4739]: NETFILTER_CFG table=nat:112 family=2 entries=20 op=nft_register_rule pid=4739 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:50.566000 audit[4739]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffc0d1bebb0 a2=0 a3=7ffc0d1beb9c items=0 ppid=2387 pid=4739 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:50.566000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:50.582995 kernel: audit: type=1327 audit(1742240510.566:410): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:50.584292 env[1250]: time="2025-03-17T19:41:50.584256196Z" level=info msg="StartContainer for \"e2cccb00a4b1abd4bb571046c1cbe696b70b1cabfd2402dc08e1ff3f954d1e1f\" returns successfully" Mar 17 19:41:51.363491 kubelet[2215]: I0317 19:41:51.363435 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6657945b8c-wnzqx" podStartSLOduration=34.726285587 podStartE2EDuration="43.363401813s" podCreationTimestamp="2025-03-17 19:41:08 +0000 UTC" firstStartedPulling="2025-03-17 19:41:41.190358583 +0000 UTC m=+53.300199784" lastFinishedPulling="2025-03-17 19:41:49.827474769 +0000 UTC m=+61.937316010" observedRunningTime="2025-03-17 19:41:50.499559556 +0000 UTC m=+62.609400748" watchObservedRunningTime="2025-03-17 19:41:51.363401813 +0000 UTC m=+63.473243014" Mar 17 19:41:51.387000 audit[4749]: NETFILTER_CFG table=filter:113 family=2 entries=9 op=nft_register_rule pid=4749 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:51.387000 audit[4749]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffe6a772d90 a2=0 a3=7ffe6a772d7c items=0 ppid=2387 pid=4749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:51.399513 kernel: audit: type=1325 audit(1742240511.387:411): table=filter:113 family=2 entries=9 op=nft_register_rule pid=4749 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:51.399563 kernel: audit: type=1300 audit(1742240511.387:411): arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffe6a772d90 a2=0 a3=7ffe6a772d7c items=0 ppid=2387 pid=4749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:51.387000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:51.401000 audit[4749]: NETFILTER_CFG table=nat:114 family=2 entries=27 op=nft_register_chain pid=4749 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:51.411302 kernel: audit: type=1327 audit(1742240511.387:411): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:51.411344 kernel: audit: type=1325 audit(1742240511.401:412): table=nat:114 family=2 entries=27 op=nft_register_chain pid=4749 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:51.401000 audit[4749]: SYSCALL arch=c000003e syscall=46 success=yes exit=9348 a0=3 a1=7ffe6a772d90 a2=0 a3=7ffe6a772d7c items=0 ppid=2387 pid=4749 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:51.401000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:52.427000 audit[4751]: NETFILTER_CFG table=filter:115 family=2 entries=8 op=nft_register_rule pid=4751 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:52.427000 audit[4751]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffc622ecd70 a2=0 a3=7ffc622ecd5c items=0 ppid=2387 pid=4751 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:52.427000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:52.431760 env[1250]: time="2025-03-17T19:41:52.431720583Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:52.434823 env[1250]: time="2025-03-17T19:41:52.434795262Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:52.435000 audit[4751]: NETFILTER_CFG table=nat:116 family=2 entries=30 
op=nft_register_rule pid=4751 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:52.435000 audit[4751]: SYSCALL arch=c000003e syscall=46 success=yes exit=9348 a0=3 a1=7ffc622ecd70 a2=0 a3=7ffc622ecd5c items=0 ppid=2387 pid=4751 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:52.435000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:52.437820 env[1250]: time="2025-03-17T19:41:52.437790340Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/csi:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:52.440814 env[1250]: time="2025-03-17T19:41:52.440790839Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:52.441670 env[1250]: time="2025-03-17T19:41:52.441645843Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Mar 17 19:41:52.444521 env[1250]: time="2025-03-17T19:41:52.444494326Z" level=info msg="CreateContainer within sandbox \"f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 19:41:52.465413 env[1250]: time="2025-03-17T19:41:52.465374397Z" level=info msg="CreateContainer within sandbox \"f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"128a6ca189b3f088ac8384aa1f7e19d82445835c2a14ee706e6b73a45a37aca3\"" Mar 17 19:41:52.466248 env[1250]: time="2025-03-17T19:41:52.466227998Z" level=info msg="StartContainer for \"128a6ca189b3f088ac8384aa1f7e19d82445835c2a14ee706e6b73a45a37aca3\"" Mar 17 19:41:52.495009 systemd[1]: run-containerd-runc-k8s.io-128a6ca189b3f088ac8384aa1f7e19d82445835c2a14ee706e6b73a45a37aca3-runc.zdf6wp.mount: Deactivated successfully. 
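The PROCTITLE field in the audit records above is the audited process's command line, hex-encoded with NUL bytes separating the arguments; the value repeated in these entries decodes to "iptables-restore -w 5 -W 100000 --noflush --counters". A minimal Python decode helper, for reference only:

def decode_proctitle(hex_value: str) -> list[str]:
    """Decode an audit PROCTITLE hex string into the original argv list."""
    raw = bytes.fromhex(hex_value)
    return [arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00") if arg]

# The value that appears in the audit records above:
proctitle = ("69707461626C65732D726573746F7265002D770035002D5700313030303030"
             "002D2D6E6F666C757368002D2D636F756E74657273")
print(" ".join(decode_proctitle(proctitle)))
# prints: iptables-restore -w 5 -W 100000 --noflush --counters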
Mar 17 19:41:52.553443 env[1250]: time="2025-03-17T19:41:52.553395712Z" level=info msg="StartContainer for \"128a6ca189b3f088ac8384aa1f7e19d82445835c2a14ee706e6b73a45a37aca3\" returns successfully" Mar 17 19:41:52.554882 env[1250]: time="2025-03-17T19:41:52.554860198Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Mar 17 19:41:52.628873 kubelet[2215]: I0317 19:41:52.628775 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6657945b8c-w9rz5" podStartSLOduration=36.769455807 podStartE2EDuration="44.628756055s" podCreationTimestamp="2025-03-17 19:41:08 +0000 UTC" firstStartedPulling="2025-03-17 19:41:42.532672238 +0000 UTC m=+54.642513429" lastFinishedPulling="2025-03-17 19:41:50.391972436 +0000 UTC m=+62.501813677" observedRunningTime="2025-03-17 19:41:51.48491656 +0000 UTC m=+63.594757751" watchObservedRunningTime="2025-03-17 19:41:52.628756055 +0000 UTC m=+64.738597246" Mar 17 19:41:52.646000 audit[4782]: NETFILTER_CFG table=filter:117 family=2 entries=8 op=nft_register_rule pid=4782 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:52.646000 audit[4782]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffc31eed590 a2=0 a3=7ffc31eed57c items=0 ppid=2387 pid=4782 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:52.646000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:52.650000 audit[4782]: NETFILTER_CFG table=nat:118 family=2 entries=34 op=nft_register_chain pid=4782 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:41:52.650000 audit[4782]: SYSCALL arch=c000003e syscall=46 success=yes exit=11236 a0=3 a1=7ffc31eed590 a2=0 a3=7ffc31eed57c items=0 ppid=2387 pid=4782 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:41:52.650000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:41:54.920761 env[1250]: time="2025-03-17T19:41:54.920674267Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:54.924536 env[1250]: time="2025-03-17T19:41:54.924487050Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:54.927541 env[1250]: time="2025-03-17T19:41:54.927477249Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:54.929995 env[1250]: time="2025-03-17T19:41:54.929903851Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 19:41:54.930364 env[1250]: 
time="2025-03-17T19:41:54.930321404Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Mar 17 19:41:54.936330 env[1250]: time="2025-03-17T19:41:54.936247913Z" level=info msg="CreateContainer within sandbox \"f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 19:41:54.969018 env[1250]: time="2025-03-17T19:41:54.968888715Z" level=info msg="CreateContainer within sandbox \"f47129dcad22209c4e8538bb1918856ef8c0b760daa733006790aaf78d1157e4\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"7818d8a1e01477b8b280232b6787802977cb83d56f2ecc791b28d05f488eeff6\"" Mar 17 19:41:54.971274 env[1250]: time="2025-03-17T19:41:54.971159736Z" level=info msg="StartContainer for \"7818d8a1e01477b8b280232b6787802977cb83d56f2ecc791b28d05f488eeff6\"" Mar 17 19:41:54.976348 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3657009236.mount: Deactivated successfully. Mar 17 19:41:55.053691 env[1250]: time="2025-03-17T19:41:55.053646315Z" level=info msg="StartContainer for \"7818d8a1e01477b8b280232b6787802977cb83d56f2ecc791b28d05f488eeff6\" returns successfully" Mar 17 19:41:55.226181 kubelet[2215]: I0317 19:41:55.226160 2215 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 19:41:55.226571 kubelet[2215]: I0317 19:41:55.226560 2215 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 19:42:06.647458 kubelet[2215]: I0317 19:42:06.647343 2215 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-ptzbm" podStartSLOduration=47.283193496 podStartE2EDuration="58.647298498s" podCreationTimestamp="2025-03-17 19:41:08 +0000 UTC" firstStartedPulling="2025-03-17 19:41:43.567544184 +0000 UTC m=+55.677385375" lastFinishedPulling="2025-03-17 19:41:54.931649186 +0000 UTC m=+67.041490377" observedRunningTime="2025-03-17 19:41:55.542256519 +0000 UTC m=+67.652097760" watchObservedRunningTime="2025-03-17 19:42:06.647298498 +0000 UTC m=+78.757139739" Mar 17 19:42:56.297308 systemd[1]: run-containerd-runc-k8s.io-580214302fa232511f8f62ada0ed878cc5412f7f8e2bd23794bee0ecc2d066fa-runc.YqJnF0.mount: Deactivated successfully. Mar 17 19:43:06.553510 systemd[1]: run-containerd-runc-k8s.io-4c1ef740ef47f5ff497973bc373f8c9106da486bac8ad0d98fcdbbc883538f45-runc.wrw1gd.mount: Deactivated successfully. Mar 17 19:43:14.738087 systemd[1]: Started sshd@9-172.24.4.218:22-172.24.4.1:51162.service. Mar 17 19:43:14.738000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-172.24.4.218:22-172.24.4.1:51162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:14.744025 kernel: kauditd_printk_skb: 14 callbacks suppressed Mar 17 19:43:14.744158 kernel: audit: type=1130 audit(1742240594.738:417): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-172.24.4.218:22-172.24.4.1:51162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:43:16.136000 audit[5018]: USER_ACCT pid=5018 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:16.138203 sshd[5018]: Accepted publickey for core from 172.24.4.1 port 51162 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:43:16.151000 audit[5018]: CRED_ACQ pid=5018 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:16.154451 sshd[5018]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:43:16.167293 kernel: audit: type=1101 audit(1742240596.136:418): pid=5018 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:16.167442 kernel: audit: type=1103 audit(1742240596.151:419): pid=5018 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:16.179010 kernel: audit: type=1006 audit(1742240596.152:420): pid=5018 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Mar 17 19:43:16.152000 audit[5018]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd03ec35f0 a2=3 a3=0 items=0 ppid=1 pid=5018 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:16.205613 kernel: audit: type=1300 audit(1742240596.152:420): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd03ec35f0 a2=3 a3=0 items=0 ppid=1 pid=5018 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:16.205868 kernel: audit: type=1327 audit(1742240596.152:420): proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:16.152000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:16.215092 systemd-logind[1236]: New session 10 of user core. Mar 17 19:43:16.215648 systemd[1]: Started session-10.scope. 
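The kubelet pod_startup_latency_tracker entries above (calico-apiserver-6657945b8c-w9rz5 and csi-node-driver-ptzbm) report two durations that can be re-derived from the timestamps they quote: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration appears to be that value minus the image-pull window (lastFinishedPulling minus firstStartedPulling). The sketch below checks this for the csi-node-driver-ptzbm entry; it is an illustration of the arithmetic only, not kubelet code.

# Re-derive the durations reported for pod csi-node-driver-ptzbm from the
# timestamps printed in its log entry (all values quoted from the log).
from decimal import Decimal

# Wall-clock seconds after 19:41:00 UTC.
created          = Decimal("8")              # podCreationTimestamp  19:41:08
observed_running = Decimal("66.647298498")   # watchObservedRunningTime 19:42:06.647298498

# Monotonic readings (the m=+... values) bounding the image pull.
first_pull_start = Decimal("55.677385375")   # firstStartedPulling
last_pull_finish = Decimal("67.041490377")   # lastFinishedPulling

e2e = observed_running - created                    # podStartE2EDuration
slo = e2e - (last_pull_finish - first_pull_start)   # podStartSLOduration
print(e2e, slo)   # 58.647298498 47.283193496 -- matching the logged values

The calico-apiserver entry checks out the same way when its m=+ monotonic readings are used for the pull window (44.628756055 − 7.859300248 = 36.769455807).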
Mar 17 19:43:16.223000 audit[5018]: USER_START pid=5018 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:16.233973 kernel: audit: type=1105 audit(1742240596.223:421): pid=5018 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:16.233000 audit[5021]: CRED_ACQ pid=5021 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:16.242059 kernel: audit: type=1103 audit(1742240596.233:422): pid=5021 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:17.037724 sshd[5018]: pam_unix(sshd:session): session closed for user core Mar 17 19:43:17.038000 audit[5018]: USER_END pid=5018 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:17.042255 systemd[1]: sshd@9-172.24.4.218:22-172.24.4.1:51162.service: Deactivated successfully. Mar 17 19:43:17.043804 systemd[1]: session-10.scope: Deactivated successfully. Mar 17 19:43:17.046972 kernel: audit: type=1106 audit(1742240597.038:423): pid=5018 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:17.047431 systemd-logind[1236]: Session 10 logged out. Waiting for processes to exit. Mar 17 19:43:17.048493 systemd-logind[1236]: Removed session 10. Mar 17 19:43:17.038000 audit[5018]: CRED_DISP pid=5018 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:17.041000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-172.24.4.218:22-172.24.4.1:51162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:17.056973 kernel: audit: type=1104 audit(1742240597.038:424): pid=5018 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:22.048000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-172.24.4.218:22-172.24.4.1:51178 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:43:22.048782 systemd[1]: Started sshd@10-172.24.4.218:22-172.24.4.1:51178.service. Mar 17 19:43:22.066086 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 19:43:22.066279 kernel: audit: type=1130 audit(1742240602.048:426): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-172.24.4.218:22-172.24.4.1:51178 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:23.465000 audit[5051]: USER_ACCT pid=5051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:23.468912 sshd[5051]: Accepted publickey for core from 172.24.4.1 port 51178 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:43:23.481997 kernel: audit: type=1101 audit(1742240603.465:427): pid=5051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:23.481000 audit[5051]: CRED_ACQ pid=5051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:23.483754 sshd[5051]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:43:23.506806 kernel: audit: type=1103 audit(1742240603.481:428): pid=5051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:23.514345 kernel: audit: type=1006 audit(1742240603.482:429): pid=5051 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Mar 17 19:43:23.519401 kernel: audit: type=1300 audit(1742240603.482:429): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5b4794d0 a2=3 a3=0 items=0 ppid=1 pid=5051 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:23.482000 audit[5051]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff5b4794d0 a2=3 a3=0 items=0 ppid=1 pid=5051 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:23.482000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:23.539994 kernel: audit: type=1327 audit(1742240603.482:429): proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:23.549576 systemd[1]: Started session-11.scope. Mar 17 19:43:23.551014 systemd-logind[1236]: New session 11 of user core. 
Mar 17 19:43:23.557000 audit[5051]: USER_START pid=5051 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:23.565197 kernel: audit: type=1105 audit(1742240603.557:430): pid=5051 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:23.564000 audit[5054]: CRED_ACQ pid=5054 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:23.573795 kernel: audit: type=1103 audit(1742240603.564:431): pid=5054 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:24.326374 sshd[5051]: pam_unix(sshd:session): session closed for user core Mar 17 19:43:24.327000 audit[5051]: USER_END pid=5051 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:24.345991 kernel: audit: type=1106 audit(1742240604.327:432): pid=5051 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:24.347769 systemd[1]: sshd@10-172.24.4.218:22-172.24.4.1:51178.service: Deactivated successfully. Mar 17 19:43:24.349076 systemd-logind[1236]: Session 11 logged out. Waiting for processes to exit. Mar 17 19:43:24.350944 systemd[1]: session-11.scope: Deactivated successfully. Mar 17 19:43:24.352255 systemd-logind[1236]: Removed session 11. Mar 17 19:43:24.327000 audit[5051]: CRED_DISP pid=5051 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:24.366122 kernel: audit: type=1104 audit(1742240604.327:433): pid=5051 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:24.347000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-172.24.4.218:22-172.24.4.1:51178 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:26.309406 systemd[1]: run-containerd-runc-k8s.io-580214302fa232511f8f62ada0ed878cc5412f7f8e2bd23794bee0ecc2d066fa-runc.D7SJOM.mount: Deactivated successfully. 
Mar 17 19:43:29.333000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-172.24.4.218:22-172.24.4.1:60806 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:29.333613 systemd[1]: Started sshd@11-172.24.4.218:22-172.24.4.1:60806.service. Mar 17 19:43:29.337670 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 19:43:29.337794 kernel: audit: type=1130 audit(1742240609.333:435): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-172.24.4.218:22-172.24.4.1:60806 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:30.517000 audit[5085]: USER_ACCT pid=5085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:30.533626 sshd[5085]: Accepted publickey for core from 172.24.4.1 port 60806 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:43:30.534325 kernel: audit: type=1101 audit(1742240610.517:436): pid=5085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:30.534329 sshd[5085]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:43:30.532000 audit[5085]: CRED_ACQ pid=5085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:30.558272 kernel: audit: type=1103 audit(1742240610.532:437): pid=5085 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:30.558460 kernel: audit: type=1006 audit(1742240610.532:438): pid=5085 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Mar 17 19:43:30.532000 audit[5085]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef124c3f0 a2=3 a3=0 items=0 ppid=1 pid=5085 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:30.573900 kernel: audit: type=1300 audit(1742240610.532:438): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffef124c3f0 a2=3 a3=0 items=0 ppid=1 pid=5085 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:30.532000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:30.580013 kernel: audit: type=1327 audit(1742240610.532:438): proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:30.586047 systemd-logind[1236]: New session 12 of user core. Mar 17 19:43:30.587366 systemd[1]: Started session-12.scope. 
Mar 17 19:43:30.604345 kernel: audit: type=1105 audit(1742240610.595:439): pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:30.595000 audit[5085]: USER_START pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:30.603000 audit[5088]: CRED_ACQ pid=5088 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:30.612077 kernel: audit: type=1103 audit(1742240610.603:440): pid=5088 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:31.410230 sshd[5085]: pam_unix(sshd:session): session closed for user core Mar 17 19:43:31.413000 audit[5085]: USER_END pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:31.416835 systemd[1]: Started sshd@12-172.24.4.218:22-172.24.4.1:60810.service. Mar 17 19:43:31.432443 kernel: audit: type=1106 audit(1742240611.413:441): pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:31.413000 audit[5085]: CRED_DISP pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:31.436329 systemd[1]: sshd@11-172.24.4.218:22-172.24.4.1:60806.service: Deactivated successfully. Mar 17 19:43:31.438287 systemd[1]: session-12.scope: Deactivated successfully. Mar 17 19:43:31.447997 kernel: audit: type=1104 audit(1742240611.413:442): pid=5085 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:31.416000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-172.24.4.218:22-172.24.4.1:60810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:31.435000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-172.24.4.218:22-172.24.4.1:60806 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:31.449615 systemd-logind[1236]: Session 12 logged out. Waiting for processes to exit. 
Mar 17 19:43:31.451767 systemd-logind[1236]: Removed session 12. Mar 17 19:43:32.495274 sshd[5096]: Accepted publickey for core from 172.24.4.1 port 60810 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:43:32.494000 audit[5096]: USER_ACCT pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:32.497000 audit[5096]: CRED_ACQ pid=5096 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:32.497000 audit[5096]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc74bfd380 a2=3 a3=0 items=0 ppid=1 pid=5096 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:32.497000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:32.499291 sshd[5096]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:43:32.511078 systemd-logind[1236]: New session 13 of user core. Mar 17 19:43:32.512612 systemd[1]: Started session-13.scope. Mar 17 19:43:32.530000 audit[5096]: USER_START pid=5096 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:32.534000 audit[5103]: CRED_ACQ pid=5103 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:33.208000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-172.24.4.218:22-172.24.4.1:60826 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:33.209482 systemd[1]: Started sshd@13-172.24.4.218:22-172.24.4.1:60826.service. Mar 17 19:43:33.212686 sshd[5096]: pam_unix(sshd:session): session closed for user core Mar 17 19:43:33.213000 audit[5096]: USER_END pid=5096 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:33.213000 audit[5096]: CRED_DISP pid=5096 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:33.215653 systemd[1]: sshd@12-172.24.4.218:22-172.24.4.1:60810.service: Deactivated successfully. Mar 17 19:43:33.217000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-172.24.4.218:22-172.24.4.1:60810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:33.220475 systemd[1]: session-13.scope: Deactivated successfully. 
Mar 17 19:43:33.221317 systemd-logind[1236]: Session 13 logged out. Waiting for processes to exit. Mar 17 19:43:33.222660 systemd-logind[1236]: Removed session 13. Mar 17 19:43:34.512000 audit[5109]: USER_ACCT pid=5109 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:34.514263 sshd[5109]: Accepted publickey for core from 172.24.4.1 port 60826 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:43:34.532934 kernel: kauditd_printk_skb: 13 callbacks suppressed Mar 17 19:43:34.547186 kernel: audit: type=1101 audit(1742240614.512:454): pid=5109 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:34.547393 kernel: audit: type=1103 audit(1742240614.531:455): pid=5109 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:34.547459 kernel: audit: type=1006 audit(1742240614.531:456): pid=5109 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Mar 17 19:43:34.531000 audit[5109]: CRED_ACQ pid=5109 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:34.548320 sshd[5109]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:43:34.531000 audit[5109]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc77824840 a2=3 a3=0 items=0 ppid=1 pid=5109 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:34.570760 kernel: audit: type=1300 audit(1742240614.531:456): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc77824840 a2=3 a3=0 items=0 ppid=1 pid=5109 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:34.531000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:34.575054 kernel: audit: type=1327 audit(1742240614.531:456): proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:34.578920 systemd-logind[1236]: New session 14 of user core. Mar 17 19:43:34.579611 systemd[1]: Started session-14.scope. 
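Each SSH login above follows the same audit sequence per session: USER_ACCT and CRED_ACQ at connect, a type-1006 record assigning the session id, USER_START when the PAM session opens, then USER_END and CRED_DISP at close, bracketed by SERVICE_START/SERVICE_STOP for the per-connection sshd@... unit. Because these records share a ses= value (10, 11, 12, ... above), an excerpt like this can be regrouped per session with plain key=value parsing; the sketch below is purely illustrative and assumes input text in the format shown here.

# Group the audit records above by their ses= field so one SSH session's
# USER_ACCT -> CRED_ACQ -> USER_START -> USER_END -> CRED_DISP sequence can be
# read in order.
import re
from collections import defaultdict

REC = re.compile(r'audit\[\d+\]:\s+(?P<type>[A-Z_]+)\s+(?P<body>.*)')
SES = re.compile(r'\bses=(\d+)\b')

def group_by_session(lines):
    by_ses = defaultdict(list)
    for line in lines:
        m = REC.search(line)
        if not m:
            continue
        s = SES.search(m.group("body"))
        if s and s.group(1) != "4294967295":   # 4294967295 means "no session yet"
            by_ses[s.group(1)].append(m.group("type"))
    return dict(by_ses)

demo = [
    "audit[5018]: USER_START pid=5018 uid=0 auid=500 ses=10 ...",
    "audit[5018]: USER_END pid=5018 uid=0 auid=500 ses=10 ...",
    "audit[5018]: CRED_DISP pid=5018 uid=0 auid=500 ses=10 ...",
]
print(group_by_session(demo))   # {'10': ['USER_START', 'USER_END', 'CRED_DISP']}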
Mar 17 19:43:34.595000 audit[5109]: USER_START pid=5109 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:34.602000 audit[5114]: CRED_ACQ pid=5114 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:34.609315 kernel: audit: type=1105 audit(1742240614.595:457): pid=5109 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:34.609450 kernel: audit: type=1103 audit(1742240614.602:458): pid=5114 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:35.517985 sshd[5109]: pam_unix(sshd:session): session closed for user core Mar 17 19:43:35.519000 audit[5109]: USER_END pid=5109 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:35.537006 kernel: audit: type=1106 audit(1742240615.519:459): pid=5109 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:35.537018 systemd[1]: sshd@13-172.24.4.218:22-172.24.4.1:60826.service: Deactivated successfully. Mar 17 19:43:35.540061 systemd-logind[1236]: Session 14 logged out. Waiting for processes to exit. Mar 17 19:43:35.541259 systemd[1]: session-14.scope: Deactivated successfully. Mar 17 19:43:35.519000 audit[5109]: CRED_DISP pid=5109 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:35.555192 kernel: audit: type=1104 audit(1742240615.519:460): pid=5109 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:35.537000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-172.24.4.218:22-172.24.4.1:60826 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:35.568511 systemd-logind[1236]: Removed session 14. Mar 17 19:43:35.569009 kernel: audit: type=1131 audit(1742240615.537:461): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-172.24.4.218:22-172.24.4.1:60826 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:43:36.575519 systemd[1]: run-containerd-runc-k8s.io-4c1ef740ef47f5ff497973bc373f8c9106da486bac8ad0d98fcdbbc883538f45-runc.bqd7va.mount: Deactivated successfully. Mar 17 19:43:40.519432 systemd[1]: Started sshd@14-172.24.4.218:22-172.24.4.1:36206.service. Mar 17 19:43:40.519000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-172.24.4.218:22-172.24.4.1:36206 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:40.535311 kernel: audit: type=1130 audit(1742240620.519:462): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-172.24.4.218:22-172.24.4.1:36206 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:41.810000 audit[5145]: USER_ACCT pid=5145 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:41.812128 sshd[5145]: Accepted publickey for core from 172.24.4.1 port 36206 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:43:41.825000 audit[5145]: CRED_ACQ pid=5145 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:41.828433 sshd[5145]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:43:41.840757 kernel: audit: type=1101 audit(1742240621.810:463): pid=5145 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:41.840900 kernel: audit: type=1103 audit(1742240621.825:464): pid=5145 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:41.852321 kernel: audit: type=1006 audit(1742240621.826:465): pid=5145 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Mar 17 19:43:41.852432 kernel: audit: type=1300 audit(1742240621.826:465): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff84e926e0 a2=3 a3=0 items=0 ppid=1 pid=5145 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:41.826000 audit[5145]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff84e926e0 a2=3 a3=0 items=0 ppid=1 pid=5145 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:41.867276 kernel: audit: type=1327 audit(1742240621.826:465): proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:41.826000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:41.877513 systemd-logind[1236]: New session 15 of user core. Mar 17 19:43:41.878358 systemd[1]: Started session-15.scope. 
Mar 17 19:43:41.901000 audit[5145]: USER_START pid=5145 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:41.919076 kernel: audit: type=1105 audit(1742240621.901:466): pid=5145 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:41.904000 audit[5148]: CRED_ACQ pid=5148 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:41.928074 kernel: audit: type=1103 audit(1742240621.904:467): pid=5148 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:42.586396 sshd[5145]: pam_unix(sshd:session): session closed for user core Mar 17 19:43:42.607057 kernel: audit: type=1106 audit(1742240622.588:468): pid=5145 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:42.588000 audit[5145]: USER_END pid=5145 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:42.592461 systemd[1]: sshd@14-172.24.4.218:22-172.24.4.1:36206.service: Deactivated successfully. Mar 17 19:43:42.594367 systemd[1]: session-15.scope: Deactivated successfully. Mar 17 19:43:42.588000 audit[5145]: CRED_DISP pid=5145 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:42.608659 systemd-logind[1236]: Session 15 logged out. Waiting for processes to exit. Mar 17 19:43:42.611197 systemd-logind[1236]: Removed session 15. Mar 17 19:43:42.592000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-172.24.4.218:22-172.24.4.1:36206 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:42.622712 kernel: audit: type=1104 audit(1742240622.588:469): pid=5145 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:47.587000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-172.24.4.218:22-172.24.4.1:60620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:43:47.588597 systemd[1]: Started sshd@15-172.24.4.218:22-172.24.4.1:60620.service. Mar 17 19:43:47.591963 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 19:43:47.592037 kernel: audit: type=1130 audit(1742240627.587:471): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-172.24.4.218:22-172.24.4.1:60620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:48.948000 audit[5163]: USER_ACCT pid=5163 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:48.949647 sshd[5163]: Accepted publickey for core from 172.24.4.1 port 60620 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:43:48.965011 kernel: audit: type=1101 audit(1742240628.948:472): pid=5163 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:48.963000 audit[5163]: CRED_ACQ pid=5163 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:48.966264 sshd[5163]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:43:48.981099 kernel: audit: type=1103 audit(1742240628.963:473): pid=5163 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:48.981286 kernel: audit: type=1006 audit(1742240628.964:474): pid=5163 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Mar 17 19:43:48.980585 systemd[1]: Started session-16.scope. Mar 17 19:43:48.982341 systemd-logind[1236]: New session 16 of user core. 
Mar 17 19:43:48.964000 audit[5163]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc99d16000 a2=3 a3=0 items=0 ppid=1 pid=5163 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:48.964000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:49.020099 kernel: audit: type=1300 audit(1742240628.964:474): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc99d16000 a2=3 a3=0 items=0 ppid=1 pid=5163 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:49.020231 kernel: audit: type=1327 audit(1742240628.964:474): proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:49.020290 kernel: audit: type=1105 audit(1742240628.995:475): pid=5163 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:48.995000 audit[5163]: USER_START pid=5163 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:49.029882 kernel: audit: type=1103 audit(1742240628.999:476): pid=5168 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:48.999000 audit[5168]: CRED_ACQ pid=5168 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:49.708701 sshd[5163]: pam_unix(sshd:session): session closed for user core Mar 17 19:43:49.710000 audit[5163]: USER_END pid=5163 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:49.710000 audit[5163]: CRED_DISP pid=5163 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:49.738786 systemd[1]: sshd@15-172.24.4.218:22-172.24.4.1:60620.service: Deactivated successfully. Mar 17 19:43:49.740585 systemd[1]: session-16.scope: Deactivated successfully. Mar 17 19:43:49.745057 systemd-logind[1236]: Session 16 logged out. Waiting for processes to exit. Mar 17 19:43:49.747390 systemd-logind[1236]: Removed session 16. 
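A reading aid for the interleaved "kernel: audit: type=NNNN audit(...:serial)" lines above: each is the kernel echo of the userspace record carrying the same serial number, so the numeric types can be matched to record names directly from this log (1101 USER_ACCT, 1103 CRED_ACQ, 1104 CRED_DISP, 1105 USER_START, 1106 USER_END, 1130/1131 SERVICE_START/STOP, 1300 SYSCALL, 1325 NETFILTER_CFG, 1327 PROCTITLE; 1006 is the record that assigns auid/ses). The snippet below just collects those pairs for quick lookup while reading; it is an annotation aid, not part of any tool in the log.

# Numeric audit record types observed in this log, paired with the record
# names they echo (matched via the shared audit(...:serial) value).
import re

AUDIT_TYPES = {
    1006: "LOGIN",            # auid / session-id assignment
    1101: "USER_ACCT",
    1103: "CRED_ACQ",
    1104: "CRED_DISP",
    1105: "USER_START",
    1106: "USER_END",
    1130: "SERVICE_START",
    1131: "SERVICE_STOP",
    1300: "SYSCALL",
    1325: "NETFILTER_CFG",
    1327: "PROCTITLE",
}

def label(line: str) -> str:
    """Append the record name to a 'kernel: audit: type=NNNN ...' line."""
    m = re.search(r"type=(\d+)", line)
    return f"{line}  <{AUDIT_TYPES.get(int(m.group(1)), '?')}>" if m else line

print(label("kernel: audit: type=1105 audit(1742240596.223:421): pid=5018 ..."))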
Mar 17 19:43:49.751758 kernel: audit: type=1106 audit(1742240629.710:477): pid=5163 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:49.751917 kernel: audit: type=1104 audit(1742240629.710:478): pid=5163 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:49.737000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-172.24.4.218:22-172.24.4.1:60620 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:54.732918 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 19:43:54.733184 kernel: audit: type=1130 audit(1742240634.715:480): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-172.24.4.218:22-172.24.4.1:39072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:54.715000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-172.24.4.218:22-172.24.4.1:39072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:54.716224 systemd[1]: Started sshd@16-172.24.4.218:22-172.24.4.1:39072.service. Mar 17 19:43:56.013000 audit[5178]: USER_ACCT pid=5178 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.015456 sshd[5178]: Accepted publickey for core from 172.24.4.1 port 39072 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:43:56.030112 kernel: audit: type=1101 audit(1742240636.013:481): pid=5178 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.033911 sshd[5178]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:43:56.031000 audit[5178]: CRED_ACQ pid=5178 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.049555 kernel: audit: type=1103 audit(1742240636.031:482): pid=5178 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.049706 kernel: audit: type=1006 audit(1742240636.032:483): pid=5178 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Mar 17 19:43:56.032000 audit[5178]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffece05f9c0 a2=3 a3=0 items=0 ppid=1 pid=5178 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" 
exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:56.032000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:56.079800 kernel: audit: type=1300 audit(1742240636.032:483): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffece05f9c0 a2=3 a3=0 items=0 ppid=1 pid=5178 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:56.080013 kernel: audit: type=1327 audit(1742240636.032:483): proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:56.087363 systemd-logind[1236]: New session 17 of user core. Mar 17 19:43:56.091598 systemd[1]: Started session-17.scope. Mar 17 19:43:56.106000 audit[5178]: USER_START pid=5178 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.126006 kernel: audit: type=1105 audit(1742240636.106:484): pid=5178 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.126000 audit[5181]: CRED_ACQ pid=5181 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.139048 kernel: audit: type=1103 audit(1742240636.126:485): pid=5181 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.289298 systemd[1]: run-containerd-runc-k8s.io-580214302fa232511f8f62ada0ed878cc5412f7f8e2bd23794bee0ecc2d066fa-runc.bUiOWr.mount: Deactivated successfully. Mar 17 19:43:56.761679 sshd[5178]: pam_unix(sshd:session): session closed for user core Mar 17 19:43:56.762000 audit[5178]: USER_END pid=5178 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.765708 systemd[1]: sshd@16-172.24.4.218:22-172.24.4.1:39072.service: Deactivated successfully. Mar 17 19:43:56.768533 systemd[1]: Started sshd@17-172.24.4.218:22-172.24.4.1:39082.service. Mar 17 19:43:56.768910 systemd[1]: session-17.scope: Deactivated successfully. Mar 17 19:43:56.781011 kernel: audit: type=1106 audit(1742240636.762:486): pid=5178 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.781044 systemd-logind[1236]: Session 17 logged out. Waiting for processes to exit. 
Mar 17 19:43:56.788054 kernel: audit: type=1104 audit(1742240636.762:487): pid=5178 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.762000 audit[5178]: CRED_DISP pid=5178 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:56.788793 systemd-logind[1236]: Removed session 17. Mar 17 19:43:56.765000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-172.24.4.218:22-172.24.4.1:39072 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:56.767000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-172.24.4.218:22-172.24.4.1:39082 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:57.974000 audit[5210]: USER_ACCT pid=5210 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:57.976080 sshd[5210]: Accepted publickey for core from 172.24.4.1 port 39082 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:43:57.977000 audit[5210]: CRED_ACQ pid=5210 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:57.977000 audit[5210]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffda9eaa30 a2=3 a3=0 items=0 ppid=1 pid=5210 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:43:57.977000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:43:57.979881 sshd[5210]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:43:57.991072 systemd-logind[1236]: New session 18 of user core. Mar 17 19:43:57.994035 systemd[1]: Started session-18.scope. Mar 17 19:43:58.012000 audit[5210]: USER_START pid=5210 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:58.016000 audit[5214]: CRED_ACQ pid=5214 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:59.021010 sshd[5210]: pam_unix(sshd:session): session closed for user core Mar 17 19:43:59.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-172.24.4.218:22-172.24.4.1:39088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:43:59.023656 systemd[1]: Started sshd@18-172.24.4.218:22-172.24.4.1:39088.service. Mar 17 19:43:59.026000 audit[5210]: USER_END pid=5210 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:59.026000 audit[5210]: CRED_DISP pid=5210 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:43:59.029000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-172.24.4.218:22-172.24.4.1:39082 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:43:59.030173 systemd[1]: sshd@17-172.24.4.218:22-172.24.4.1:39082.service: Deactivated successfully. Mar 17 19:43:59.032129 systemd[1]: session-18.scope: Deactivated successfully. Mar 17 19:43:59.032262 systemd-logind[1236]: Session 18 logged out. Waiting for processes to exit. Mar 17 19:43:59.039242 systemd-logind[1236]: Removed session 18. Mar 17 19:44:00.382935 sshd[5220]: Accepted publickey for core from 172.24.4.1 port 39088 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:44:00.390278 kernel: kauditd_printk_skb: 13 callbacks suppressed Mar 17 19:44:00.390462 kernel: audit: type=1101 audit(1742240640.381:499): pid=5220 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:00.381000 audit[5220]: USER_ACCT pid=5220 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:00.388213 sshd[5220]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:44:00.386000 audit[5220]: CRED_ACQ pid=5220 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:00.416160 kernel: audit: type=1103 audit(1742240640.386:500): pid=5220 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:00.428014 kernel: audit: type=1006 audit(1742240640.386:501): pid=5220 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Mar 17 19:44:00.386000 audit[5220]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce55b6e50 a2=3 a3=0 items=0 ppid=1 pid=5220 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:00.386000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:00.449066 kernel: audit: type=1300 audit(1742240640.386:501): arch=c000003e 
syscall=1 success=yes exit=3 a0=5 a1=7ffce55b6e50 a2=3 a3=0 items=0 ppid=1 pid=5220 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:00.449190 kernel: audit: type=1327 audit(1742240640.386:501): proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:00.451084 systemd-logind[1236]: New session 19 of user core. Mar 17 19:44:00.451932 systemd[1]: Started session-19.scope. Mar 17 19:44:00.464000 audit[5220]: USER_START pid=5220 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:00.464000 audit[5225]: CRED_ACQ pid=5225 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:00.477823 kernel: audit: type=1105 audit(1742240640.464:502): pid=5220 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:00.477937 kernel: audit: type=1103 audit(1742240640.464:503): pid=5225 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:03.269000 audit[5238]: NETFILTER_CFG table=filter:119 family=2 entries=20 op=nft_register_rule pid=5238 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:44:03.269000 audit[5238]: SYSCALL arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffee0b26090 a2=0 a3=7ffee0b2607c items=0 ppid=2387 pid=5238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:03.282178 kernel: audit: type=1325 audit(1742240643.269:504): table=filter:119 family=2 entries=20 op=nft_register_rule pid=5238 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:44:03.282342 kernel: audit: type=1300 audit(1742240643.269:504): arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffee0b26090 a2=0 a3=7ffee0b2607c items=0 ppid=2387 pid=5238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:03.269000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:44:03.286211 kernel: audit: type=1327 audit(1742240643.269:504): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:44:03.287000 audit[5238]: NETFILTER_CFG table=nat:120 family=2 entries=22 op=nft_register_rule pid=5238 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:44:03.287000 audit[5238]: SYSCALL arch=c000003e syscall=46 success=yes exit=6540 a0=3 a1=7ffee0b26090 a2=0 a3=0 items=0 ppid=2387 pid=5238 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:03.287000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:44:03.308000 audit[5240]: NETFILTER_CFG table=filter:121 family=2 entries=32 op=nft_register_rule pid=5240 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:44:03.308000 audit[5240]: SYSCALL arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffc4e630420 a2=0 a3=7ffc4e63040c items=0 ppid=2387 pid=5240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:03.308000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:44:03.315000 audit[5240]: NETFILTER_CFG table=nat:122 family=2 entries=22 op=nft_register_rule pid=5240 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:44:03.315000 audit[5240]: SYSCALL arch=c000003e syscall=46 success=yes exit=6540 a0=3 a1=7ffc4e630420 a2=0 a3=0 items=0 ppid=2387 pid=5240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:03.315000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:44:03.535105 sshd[5220]: pam_unix(sshd:session): session closed for user core Mar 17 19:44:03.536281 systemd[1]: Started sshd@19-172.24.4.218:22-172.24.4.1:48386.service. Mar 17 19:44:03.535000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-172.24.4.218:22-172.24.4.1:48386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:03.538000 audit[5220]: USER_END pid=5220 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:03.538000 audit[5220]: CRED_DISP pid=5220 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:03.542000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-172.24.4.218:22-172.24.4.1:39088 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:03.543163 systemd[1]: sshd@18-172.24.4.218:22-172.24.4.1:39088.service: Deactivated successfully. Mar 17 19:44:03.547218 systemd[1]: session-19.scope: Deactivated successfully. Mar 17 19:44:03.549090 systemd-logind[1236]: Session 19 logged out. Waiting for processes to exit. Mar 17 19:44:03.552761 systemd-logind[1236]: Removed session 19. 
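The NETFILTER_CFG / SYSCALL / PROCTITLE triples above record the ruleset being rewritten through the iptables-nft compatibility layer: exe is /usr/sbin/xtables-nft-multi, family=2 is AF_INET, and syscall=46 on x86_64 is sendmsg, i.e. the netlink message that installs the filter and nat tables. The hex proctitle decodes to the full command line, with NUL bytes between the arguments, roughly as follows (same decoding idea as the sketch above):

raw = bytes.fromhex(
    "69707461626C65732D726573746F7265002D770035002D5700313030303030"
    "002D2D6E6F666C757368002D2D636F756E74657273"
)
# split on the NUL separators to recover the argv of the audited process
print(raw.split(b"\x00"))
# -> [b'iptables-restore', b'-w', b'5', b'-W', b'100000', b'--noflush', b'--counters']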
Mar 17 19:44:04.690000 audit[5241]: USER_ACCT pid=5241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:04.691477 sshd[5241]: Accepted publickey for core from 172.24.4.1 port 48386 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:44:04.691000 audit[5241]: CRED_ACQ pid=5241 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:04.692000 audit[5241]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff674b9000 a2=3 a3=0 items=0 ppid=1 pid=5241 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:04.692000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:04.693419 sshd[5241]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:44:04.702278 systemd-logind[1236]: New session 20 of user core. Mar 17 19:44:04.702847 systemd[1]: Started session-20.scope. Mar 17 19:44:04.719000 audit[5241]: USER_START pid=5241 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:04.721000 audit[5246]: CRED_ACQ pid=5246 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:05.727888 sshd[5241]: pam_unix(sshd:session): session closed for user core Mar 17 19:44:05.731000 audit[5241]: USER_END pid=5241 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:05.735648 kernel: kauditd_printk_skb: 20 callbacks suppressed Mar 17 19:44:05.735745 kernel: audit: type=1106 audit(1742240645.731:517): pid=5241 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:05.731000 audit[5241]: CRED_DISP pid=5241 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:05.753910 systemd[1]: Started sshd@20-172.24.4.218:22-172.24.4.1:48396.service. Mar 17 19:44:05.757768 systemd[1]: sshd@19-172.24.4.218:22-172.24.4.1:48386.service: Deactivated successfully. Mar 17 19:44:05.760605 systemd[1]: session-20.scope: Deactivated successfully. Mar 17 19:44:05.763052 systemd-logind[1236]: Session 20 logged out. Waiting for processes to exit. 
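Most audit events in this excerpt appear twice: once as a named userspace record (USER_START, CRED_ACQ, ...) and once as the kernel's numeric echo with the same serial number, e.g. serial 484 near the top shows up both as USER_START and as type=1105; the "kauditd_printk_skb: N callbacks suppressed" lines only indicate that N of those kernel echoes were rate-limited out of the kernel log. A small lookup for the record types that occur here (the numbers can be read off the type=/name pairs in the log itself; 1006 is the kernel LOGIN record that assigns auid/ses and has no named twin):

AUDIT_TYPES = {
    1006: "LOGIN",          # auid/ses assignment when sshd starts a session
    1101: "USER_ACCT",
    1103: "CRED_ACQ",
    1104: "CRED_DISP",
    1105: "USER_START",
    1106: "USER_END",
    1130: "SERVICE_START",
    1131: "SERVICE_STOP",
    1300: "SYSCALL",
    1325: "NETFILTER_CFG",
    1327: "PROCTITLE",
}
print(AUDIT_TYPES[1105])  # -> USER_START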
Mar 17 19:44:05.765652 kernel: audit: type=1104 audit(1742240645.731:518): pid=5241 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:05.754000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-172.24.4.218:22-172.24.4.1:48396 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:05.767226 systemd-logind[1236]: Removed session 20. Mar 17 19:44:05.757000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-172.24.4.218:22-172.24.4.1:48386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:05.792663 kernel: audit: type=1130 audit(1742240645.754:519): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-172.24.4.218:22-172.24.4.1:48396 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:05.792911 kernel: audit: type=1131 audit(1742240645.757:520): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-172.24.4.218:22-172.24.4.1:48386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:06.981287 sshd[5253]: Accepted publickey for core from 172.24.4.1 port 48396 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:44:06.980000 audit[5253]: USER_ACCT pid=5253 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:06.997070 kernel: audit: type=1101 audit(1742240646.980:521): pid=5253 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:06.996000 audit[5253]: CRED_ACQ pid=5253 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:06.998260 sshd[5253]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:44:07.012164 kernel: audit: type=1103 audit(1742240646.996:522): pid=5253 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:07.022055 kernel: audit: type=1006 audit(1742240646.996:523): pid=5253 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Mar 17 19:44:06.996000 audit[5253]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdfe205240 a2=3 a3=0 items=0 ppid=1 pid=5253 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:07.038008 kernel: audit: type=1300 audit(1742240646.996:523): arch=c000003e syscall=1 
success=yes exit=3 a0=5 a1=7ffdfe205240 a2=3 a3=0 items=0 ppid=1 pid=5253 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:06.996000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:07.045266 systemd-logind[1236]: New session 21 of user core. Mar 17 19:44:07.047225 kernel: audit: type=1327 audit(1742240646.996:523): proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:07.046560 systemd[1]: Started session-21.scope. Mar 17 19:44:07.058000 audit[5253]: USER_START pid=5253 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:07.078236 kernel: audit: type=1105 audit(1742240647.058:524): pid=5253 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:07.076000 audit[5277]: CRED_ACQ pid=5277 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:07.892200 sshd[5253]: pam_unix(sshd:session): session closed for user core Mar 17 19:44:07.893000 audit[5253]: USER_END pid=5253 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:07.893000 audit[5253]: CRED_DISP pid=5253 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:07.897709 systemd[1]: sshd@20-172.24.4.218:22-172.24.4.1:48396.service: Deactivated successfully. Mar 17 19:44:07.897000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-172.24.4.218:22-172.24.4.1:48396 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:07.900638 systemd[1]: session-21.scope: Deactivated successfully. Mar 17 19:44:07.901380 systemd-logind[1236]: Session 21 logged out. Waiting for processes to exit. Mar 17 19:44:07.904443 systemd-logind[1236]: Removed session 21. 
Mar 17 19:44:11.711000 audit[5288]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=5288 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:44:11.716896 kernel: kauditd_printk_skb: 4 callbacks suppressed Mar 17 19:44:11.717043 kernel: audit: type=1325 audit(1742240651.711:529): table=filter:123 family=2 entries=20 op=nft_register_rule pid=5288 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:44:11.711000 audit[5288]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffd233ec130 a2=0 a3=7ffd233ec11c items=0 ppid=2387 pid=5288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:11.747874 kernel: audit: type=1300 audit(1742240651.711:529): arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffd233ec130 a2=0 a3=7ffd233ec11c items=0 ppid=2387 pid=5288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:11.711000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:44:11.756033 kernel: audit: type=1327 audit(1742240651.711:529): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:44:11.762000 audit[5288]: NETFILTER_CFG table=nat:124 family=2 entries=106 op=nft_register_chain pid=5288 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:44:11.762000 audit[5288]: SYSCALL arch=c000003e syscall=46 success=yes exit=49452 a0=3 a1=7ffd233ec130 a2=0 a3=7ffd233ec11c items=0 ppid=2387 pid=5288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:11.776071 kernel: audit: type=1325 audit(1742240651.762:530): table=nat:124 family=2 entries=106 op=nft_register_chain pid=5288 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 19:44:11.776190 kernel: audit: type=1300 audit(1742240651.762:530): arch=c000003e syscall=46 success=yes exit=49452 a0=3 a1=7ffd233ec130 a2=0 a3=7ffd233ec11c items=0 ppid=2387 pid=5288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:11.776256 kernel: audit: type=1327 audit(1742240651.762:530): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:44:11.762000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 19:44:12.901249 systemd[1]: Started sshd@21-172.24.4.218:22-172.24.4.1:48404.service. Mar 17 19:44:12.901000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-172.24.4.218:22-172.24.4.1:48404 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:44:12.917059 kernel: audit: type=1130 audit(1742240652.901:531): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-172.24.4.218:22-172.24.4.1:48404 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:14.111000 audit[5290]: USER_ACCT pid=5290 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:14.127938 sshd[5290]: Accepted publickey for core from 172.24.4.1 port 48404 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:44:14.128589 kernel: audit: type=1101 audit(1742240654.111:532): pid=5290 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:14.128884 sshd[5290]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:44:14.144137 kernel: audit: type=1103 audit(1742240654.126:533): pid=5290 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:14.126000 audit[5290]: CRED_ACQ pid=5290 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:14.148000 systemd[1]: Started session-22.scope. Mar 17 19:44:14.149063 systemd-logind[1236]: New session 22 of user core. 
Mar 17 19:44:14.154387 kernel: audit: type=1006 audit(1742240654.127:534): pid=5290 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Mar 17 19:44:14.127000 audit[5290]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd4ee569a0 a2=3 a3=0 items=0 ppid=1 pid=5290 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:14.127000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:14.166000 audit[5290]: USER_START pid=5290 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:14.170000 audit[5293]: CRED_ACQ pid=5293 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:14.977241 sshd[5290]: pam_unix(sshd:session): session closed for user core Mar 17 19:44:14.979000 audit[5290]: USER_END pid=5290 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:14.980000 audit[5290]: CRED_DISP pid=5290 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:14.982857 systemd[1]: sshd@21-172.24.4.218:22-172.24.4.1:48404.service: Deactivated successfully. Mar 17 19:44:14.982000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-172.24.4.218:22-172.24.4.1:48404 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:14.984127 systemd[1]: session-22.scope: Deactivated successfully. Mar 17 19:44:14.984608 systemd-logind[1236]: Session 22 logged out. Waiting for processes to exit. Mar 17 19:44:14.986284 systemd-logind[1236]: Removed session 22. Mar 17 19:44:19.985071 systemd[1]: Started sshd@22-172.24.4.218:22-172.24.4.1:48198.service. Mar 17 19:44:20.004059 kernel: kauditd_printk_skb: 7 callbacks suppressed Mar 17 19:44:20.004138 kernel: audit: type=1130 audit(1742240659.984:540): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-172.24.4.218:22-172.24.4.1:48198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:19.984000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-172.24.4.218:22-172.24.4.1:48198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 19:44:21.117000 audit[5322]: USER_ACCT pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.134117 kernel: audit: type=1101 audit(1742240661.117:541): pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.134199 sshd[5322]: Accepted publickey for core from 172.24.4.1 port 48198 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:44:21.133000 audit[5322]: CRED_ACQ pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.135286 sshd[5322]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:44:21.158866 kernel: audit: type=1103 audit(1742240661.133:542): pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.159038 kernel: audit: type=1006 audit(1742240661.133:543): pid=5322 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Mar 17 19:44:21.133000 audit[5322]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd3bb93010 a2=3 a3=0 items=0 ppid=1 pid=5322 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:21.170603 systemd[1]: Started session-23.scope. Mar 17 19:44:21.173458 systemd-logind[1236]: New session 23 of user core. 
Mar 17 19:44:21.185120 kernel: audit: type=1300 audit(1742240661.133:543): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd3bb93010 a2=3 a3=0 items=0 ppid=1 pid=5322 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:21.133000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:21.191000 audit[5322]: USER_START pid=5322 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.206356 kernel: audit: type=1327 audit(1742240661.133:543): proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:21.206480 kernel: audit: type=1105 audit(1742240661.191:544): pid=5322 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.206534 kernel: audit: type=1103 audit(1742240661.196:545): pid=5331 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.196000 audit[5331]: CRED_ACQ pid=5331 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.871499 sshd[5322]: pam_unix(sshd:session): session closed for user core Mar 17 19:44:21.873000 audit[5322]: USER_END pid=5322 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.894054 kernel: audit: type=1106 audit(1742240661.873:546): pid=5322 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.873000 audit[5322]: CRED_DISP pid=5322 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.895341 systemd[1]: sshd@22-172.24.4.218:22-172.24.4.1:48198.service: Deactivated successfully. Mar 17 19:44:21.913059 kernel: audit: type=1104 audit(1742240661.873:547): pid=5322 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:21.914449 systemd-logind[1236]: Session 23 logged out. Waiting for processes to exit. Mar 17 19:44:21.914772 systemd[1]: session-23.scope: Deactivated successfully. 
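Sessions 17 through 26 in this excerpt all follow the same shape: sshd accepts a publickey login for "core" from 172.24.4.1, PAM emits USER_ACCT / CRED_ACQ / USER_START, systemd-logind opens session-N.scope, and shortly afterwards the session closes and the per-connection sshd@... unit is stopped. A rough sketch of pulling that lifecycle out of a saved journal (one entry per line in a hypothetical journal.txt; the regexes only cover the message shapes seen above):

import re

accepted = re.compile(r"Accepted publickey for (\S+) from (\S+) port (\d+)")
opened = re.compile(r"New session (\d+) of user (\S+)\.")
closed = re.compile(r"Removed session (\d+)\.")

last_peer = None
sessions = {}
with open("journal.txt") as fh:          # hypothetical export of this journal
    for line in fh:
        if m := accepted.search(line):
            last_peer = f"{m.group(2)}:{m.group(3)}"
        elif m := opened.search(line):
            sessions[m.group(1)] = (m.group(2), last_peer)
        elif m := closed.search(line):
            user, peer = sessions.pop(m.group(1), (None, None))
            print(f"session {m.group(1)} for {user} from {peer} closed")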
Mar 17 19:44:21.893000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-172.24.4.218:22-172.24.4.1:48198 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:21.918815 systemd-logind[1236]: Removed session 23. Mar 17 19:44:26.881119 systemd[1]: Started sshd@23-172.24.4.218:22-172.24.4.1:51808.service. Mar 17 19:44:26.881000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-172.24.4.218:22-172.24.4.1:51808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:26.887135 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 19:44:26.887250 kernel: audit: type=1130 audit(1742240666.881:549): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-172.24.4.218:22-172.24.4.1:51808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:28.174000 audit[5363]: USER_ACCT pid=5363 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.175686 sshd[5363]: Accepted publickey for core from 172.24.4.1 port 51808 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:44:28.191008 kernel: audit: type=1101 audit(1742240668.174:550): pid=5363 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.189000 audit[5363]: CRED_ACQ pid=5363 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.192098 sshd[5363]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:44:28.214735 kernel: audit: type=1103 audit(1742240668.189:551): pid=5363 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.215165 kernel: audit: type=1006 audit(1742240668.190:552): pid=5363 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Mar 17 19:44:28.190000 audit[5363]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe06ab4240 a2=3 a3=0 items=0 ppid=1 pid=5363 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:28.228031 systemd[1]: Started session-24.scope. Mar 17 19:44:28.229938 systemd-logind[1236]: New session 24 of user core. 
Mar 17 19:44:28.232120 kernel: audit: type=1300 audit(1742240668.190:552): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe06ab4240 a2=3 a3=0 items=0 ppid=1 pid=5363 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:28.232239 kernel: audit: type=1327 audit(1742240668.190:552): proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:28.190000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:28.254000 audit[5363]: USER_START pid=5363 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.257000 audit[5366]: CRED_ACQ pid=5366 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.278014 kernel: audit: type=1105 audit(1742240668.254:553): pid=5363 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.278261 kernel: audit: type=1103 audit(1742240668.257:554): pid=5366 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.864569 sshd[5363]: pam_unix(sshd:session): session closed for user core Mar 17 19:44:28.866000 audit[5363]: USER_END pid=5363 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.885027 kernel: audit: type=1106 audit(1742240668.866:555): pid=5363 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.884936 systemd[1]: sshd@23-172.24.4.218:22-172.24.4.1:51808.service: Deactivated successfully. Mar 17 19:44:28.867000 audit[5363]: CRED_DISP pid=5363 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.898012 kernel: audit: type=1104 audit(1742240668.867:556): pid=5363 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:28.897072 systemd[1]: session-24.scope: Deactivated successfully. Mar 17 19:44:28.897269 systemd-logind[1236]: Session 24 logged out. Waiting for processes to exit. 
Mar 17 19:44:28.884000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-172.24.4.218:22-172.24.4.1:51808 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:28.901057 systemd-logind[1236]: Removed session 24. Mar 17 19:44:33.869000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-172.24.4.218:22-172.24.4.1:50946 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:33.870398 systemd[1]: Started sshd@24-172.24.4.218:22-172.24.4.1:50946.service. Mar 17 19:44:33.873997 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 19:44:33.874113 kernel: audit: type=1130 audit(1742240673.869:558): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-172.24.4.218:22-172.24.4.1:50946 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:35.068000 audit[5378]: USER_ACCT pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.070021 sshd[5378]: Accepted publickey for core from 172.24.4.1 port 50946 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:44:35.083000 audit[5378]: CRED_ACQ pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.086560 sshd[5378]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:44:35.100128 kernel: audit: type=1101 audit(1742240675.068:559): pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.100264 kernel: audit: type=1103 audit(1742240675.083:560): pid=5378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.109514 kernel: audit: type=1006 audit(1742240675.083:561): pid=5378 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Mar 17 19:44:35.083000 audit[5378]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcfc29a160 a2=3 a3=0 items=0 ppid=1 pid=5378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:35.127234 kernel: audit: type=1300 audit(1742240675.083:561): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcfc29a160 a2=3 a3=0 items=0 ppid=1 pid=5378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:35.127363 kernel: audit: type=1327 audit(1742240675.083:561): proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:35.083000 audit: PROCTITLE 
proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:35.138814 systemd[1]: Started session-25.scope. Mar 17 19:44:35.139838 systemd-logind[1236]: New session 25 of user core. Mar 17 19:44:35.154000 audit[5378]: USER_START pid=5378 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.158000 audit[5381]: CRED_ACQ pid=5381 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.181070 kernel: audit: type=1105 audit(1742240675.154:562): pid=5378 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.181167 kernel: audit: type=1103 audit(1742240675.158:563): pid=5381 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.862792 sshd[5378]: pam_unix(sshd:session): session closed for user core Mar 17 19:44:35.862000 audit[5378]: USER_END pid=5378 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.871995 kernel: audit: type=1106 audit(1742240675.862:564): pid=5378 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.872025 systemd[1]: sshd@24-172.24.4.218:22-172.24.4.1:50946.service: Deactivated successfully. Mar 17 19:44:35.872884 systemd[1]: session-25.scope: Deactivated successfully. Mar 17 19:44:35.862000 audit[5378]: CRED_DISP pid=5378 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.870000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-172.24.4.218:22-172.24.4.1:50946 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:35.879233 systemd-logind[1236]: Session 25 logged out. Waiting for processes to exit. Mar 17 19:44:35.880060 kernel: audit: type=1104 audit(1742240675.862:565): pid=5378 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:35.880341 systemd-logind[1236]: Removed session 25. 
Mar 17 19:44:36.574128 systemd[1]: run-containerd-runc-k8s.io-4c1ef740ef47f5ff497973bc373f8c9106da486bac8ad0d98fcdbbc883538f45-runc.nvQ1Sq.mount: Deactivated successfully. Mar 17 19:44:40.869000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-172.24.4.218:22-172.24.4.1:50948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:40.875752 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 19:44:40.875898 kernel: audit: type=1130 audit(1742240680.869:567): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-172.24.4.218:22-172.24.4.1:50948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:40.870741 systemd[1]: Started sshd@25-172.24.4.218:22-172.24.4.1:50948.service. Mar 17 19:44:42.087276 kernel: audit: type=1101 audit(1742240682.069:568): pid=5412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.069000 audit[5412]: USER_ACCT pid=5412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.087794 sshd[5412]: Accepted publickey for core from 172.24.4.1 port 50948 ssh2: RSA SHA256:0qismvO9/NycYojDPV3BgQur5FYKlC/KcDYVOn7KNLI Mar 17 19:44:42.088285 sshd[5412]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 19:44:42.085000 audit[5412]: CRED_ACQ pid=5412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.112909 kernel: audit: type=1103 audit(1742240682.085:569): pid=5412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.113224 kernel: audit: type=1006 audit(1742240682.085:570): pid=5412 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Mar 17 19:44:42.085000 audit[5412]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6402cf40 a2=3 a3=0 items=0 ppid=1 pid=5412 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:42.132229 kernel: audit: type=1300 audit(1742240682.085:570): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe6402cf40 a2=3 a3=0 items=0 ppid=1 pid=5412 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 19:44:42.132417 kernel: audit: type=1327 audit(1742240682.085:570): proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:42.085000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 19:44:42.142489 systemd-logind[1236]: New session 26 of user core. Mar 17 19:44:42.144137 systemd[1]: Started session-26.scope. 
Mar 17 19:44:42.150000 audit[5412]: USER_START pid=5412 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.161161 kernel: audit: type=1105 audit(1742240682.150:571): pid=5412 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.160000 audit[5418]: CRED_ACQ pid=5418 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.169217 kernel: audit: type=1103 audit(1742240682.160:572): pid=5418 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.871184 sshd[5412]: pam_unix(sshd:session): session closed for user core Mar 17 19:44:42.872000 audit[5412]: USER_END pid=5412 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.890674 systemd[1]: sshd@25-172.24.4.218:22-172.24.4.1:50948.service: Deactivated successfully. Mar 17 19:44:42.890997 kernel: audit: type=1106 audit(1742240682.872:573): pid=5412 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.872000 audit[5412]: CRED_DISP pid=5412 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.889000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-172.24.4.218:22-172.24.4.1:50948 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 19:44:42.906167 kernel: audit: type=1104 audit(1742240682.872:574): pid=5412 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=172.24.4.1 addr=172.24.4.1 terminal=ssh res=success' Mar 17 19:44:42.906206 systemd[1]: session-26.scope: Deactivated successfully. Mar 17 19:44:42.907057 systemd-logind[1236]: Session 26 logged out. Waiting for processes to exit. Mar 17 19:44:42.909411 systemd-logind[1236]: Removed session 26.
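The audit(...) field in every record above is a fractional Unix epoch plus a per-boot serial number. Converting the epoch reproduces the journal's own timestamp prefix, e.g. for the final USER_END record (audit(1742240682.872:573) against the Mar 17 19:44:42.872000 prefix), which suggests the journal here is rendered in UTC:

from datetime import datetime, timezone

# value copied from the kernel echo of the last USER_END record above
epoch, serial = "1742240682.872:573".split(":")
print(datetime.fromtimestamp(float(epoch), tz=timezone.utc), "serial", serial)
# -> 2025-03-17 19:44:42.872000+00:00 serial 573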