Mar 17 18:32:50.963039 kernel: Linux version 5.15.179-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 11.3.1_p20221209 p3) 11.3.1 20221209, GNU ld (Gentoo 2.39 p5) 2.39.0) #1 SMP Mon Mar 17 17:12:34 -00 2025 Mar 17 18:32:50.963060 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=249ccd113f901380672c0d31e18f792e8e0344094c0e39eedc449f039418b31a Mar 17 18:32:50.963070 kernel: BIOS-provided physical RAM map: Mar 17 18:32:50.963075 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Mar 17 18:32:50.963081 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000007fffff] usable Mar 17 18:32:50.963086 kernel: BIOS-e820: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Mar 17 18:32:50.963093 kernel: BIOS-e820: [mem 0x0000000000808000-0x000000000080afff] usable Mar 17 18:32:50.963099 kernel: BIOS-e820: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Mar 17 18:32:50.963104 kernel: BIOS-e820: [mem 0x000000000080c000-0x000000000080ffff] usable Mar 17 18:32:50.963112 kernel: BIOS-e820: [mem 0x0000000000810000-0x00000000008fffff] ACPI NVS Mar 17 18:32:50.963117 kernel: BIOS-e820: [mem 0x0000000000900000-0x000000009c8eefff] usable Mar 17 18:32:50.963123 kernel: BIOS-e820: [mem 0x000000009c8ef000-0x000000009cb6efff] reserved Mar 17 18:32:50.963128 kernel: BIOS-e820: [mem 0x000000009cb6f000-0x000000009cb7efff] ACPI data Mar 17 18:32:50.963134 kernel: BIOS-e820: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Mar 17 18:32:50.963141 kernel: BIOS-e820: [mem 0x000000009cbff000-0x000000009cf3ffff] usable Mar 17 18:32:50.963148 kernel: BIOS-e820: [mem 0x000000009cf40000-0x000000009cf5ffff] reserved Mar 17 18:32:50.963154 kernel: BIOS-e820: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Mar 17 18:32:50.963160 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Mar 17 18:32:50.963166 kernel: NX (Execute Disable) protection: active Mar 17 18:32:50.963172 kernel: e820: update [mem 0x9b475018-0x9b47ec57] usable ==> usable Mar 17 18:32:50.963178 kernel: e820: update [mem 0x9b475018-0x9b47ec57] usable ==> usable Mar 17 18:32:50.963184 kernel: e820: update [mem 0x9b438018-0x9b474e57] usable ==> usable Mar 17 18:32:50.963189 kernel: e820: update [mem 0x9b438018-0x9b474e57] usable ==> usable Mar 17 18:32:50.963195 kernel: extended physical RAM map: Mar 17 18:32:50.963201 kernel: reserve setup_data: [mem 0x0000000000000000-0x000000000009ffff] usable Mar 17 18:32:50.963208 kernel: reserve setup_data: [mem 0x0000000000100000-0x00000000007fffff] usable Mar 17 18:32:50.963214 kernel: reserve setup_data: [mem 0x0000000000800000-0x0000000000807fff] ACPI NVS Mar 17 18:32:50.963220 kernel: reserve setup_data: [mem 0x0000000000808000-0x000000000080afff] usable Mar 17 18:32:50.963226 kernel: reserve setup_data: [mem 0x000000000080b000-0x000000000080bfff] ACPI NVS Mar 17 18:32:50.963232 kernel: reserve setup_data: [mem 0x000000000080c000-0x000000000080ffff] usable Mar 17 18:32:50.963238 kernel: reserve setup_data: [mem 0x0000000000810000-0x00000000008fffff] ACPI NVS Mar 17 18:32:50.963243 kernel: reserve setup_data: [mem 0x0000000000900000-0x000000009b438017] usable Mar 17 18:32:50.963249 kernel: reserve setup_data: [mem 0x000000009b438018-0x000000009b474e57] usable Mar 17 18:32:50.963255 kernel: reserve setup_data: [mem 
0x000000009b474e58-0x000000009b475017] usable Mar 17 18:32:50.963261 kernel: reserve setup_data: [mem 0x000000009b475018-0x000000009b47ec57] usable Mar 17 18:32:50.963267 kernel: reserve setup_data: [mem 0x000000009b47ec58-0x000000009c8eefff] usable Mar 17 18:32:50.963274 kernel: reserve setup_data: [mem 0x000000009c8ef000-0x000000009cb6efff] reserved Mar 17 18:32:50.963280 kernel: reserve setup_data: [mem 0x000000009cb6f000-0x000000009cb7efff] ACPI data Mar 17 18:32:50.963286 kernel: reserve setup_data: [mem 0x000000009cb7f000-0x000000009cbfefff] ACPI NVS Mar 17 18:32:50.963292 kernel: reserve setup_data: [mem 0x000000009cbff000-0x000000009cf3ffff] usable Mar 17 18:32:50.963301 kernel: reserve setup_data: [mem 0x000000009cf40000-0x000000009cf5ffff] reserved Mar 17 18:32:50.963307 kernel: reserve setup_data: [mem 0x000000009cf60000-0x000000009cffffff] ACPI NVS Mar 17 18:32:50.963314 kernel: reserve setup_data: [mem 0x00000000b0000000-0x00000000bfffffff] reserved Mar 17 18:32:50.963321 kernel: efi: EFI v2.70 by EDK II Mar 17 18:32:50.963336 kernel: efi: SMBIOS=0x9c9ab000 ACPI=0x9cb7e000 ACPI 2.0=0x9cb7e014 MEMATTR=0x9b673018 RNG=0x9cb73018 Mar 17 18:32:50.963343 kernel: random: crng init done Mar 17 18:32:50.963350 kernel: SMBIOS 2.8 present. Mar 17 18:32:50.963356 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 0.0.0 02/06/2015 Mar 17 18:32:50.963363 kernel: Hypervisor detected: KVM Mar 17 18:32:50.963369 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 Mar 17 18:32:50.963375 kernel: kvm-clock: cpu 0, msr 5919a001, primary cpu clock Mar 17 18:32:50.963382 kernel: kvm-clock: using sched offset of 4683664608 cycles Mar 17 18:32:50.963393 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns Mar 17 18:32:50.963399 kernel: tsc: Detected 2794.748 MHz processor Mar 17 18:32:50.963406 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Mar 17 18:32:50.963413 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Mar 17 18:32:50.963419 kernel: last_pfn = 0x9cf40 max_arch_pfn = 0x400000000 Mar 17 18:32:50.963426 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Mar 17 18:32:50.963433 kernel: Using GB pages for direct mapping Mar 17 18:32:50.963439 kernel: Secure boot disabled Mar 17 18:32:50.963445 kernel: ACPI: Early table checksum verification disabled Mar 17 18:32:50.963453 kernel: ACPI: RSDP 0x000000009CB7E014 000024 (v02 BOCHS ) Mar 17 18:32:50.963460 kernel: ACPI: XSDT 0x000000009CB7D0E8 000054 (v01 BOCHS BXPC 00000001 01000013) Mar 17 18:32:50.963466 kernel: ACPI: FACP 0x000000009CB79000 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 18:32:50.963473 kernel: ACPI: DSDT 0x000000009CB7A000 0021A8 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 18:32:50.963479 kernel: ACPI: FACS 0x000000009CBDD000 000040 Mar 17 18:32:50.963486 kernel: ACPI: APIC 0x000000009CB78000 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 18:32:50.963492 kernel: ACPI: HPET 0x000000009CB77000 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 18:32:50.963499 kernel: ACPI: MCFG 0x000000009CB76000 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 18:32:50.963506 kernel: ACPI: WAET 0x000000009CB75000 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) Mar 17 18:32:50.963514 kernel: ACPI: BGRT 0x000000009CB74000 000038 (v01 INTEL EDK2 00000002 01000013) Mar 17 18:32:50.963520 kernel: ACPI: Reserving FACP table memory at [mem 0x9cb79000-0x9cb790f3] Mar 17 18:32:50.963527 kernel: ACPI: Reserving DSDT table memory at 
[mem 0x9cb7a000-0x9cb7c1a7] Mar 17 18:32:50.963533 kernel: ACPI: Reserving FACS table memory at [mem 0x9cbdd000-0x9cbdd03f] Mar 17 18:32:50.963540 kernel: ACPI: Reserving APIC table memory at [mem 0x9cb78000-0x9cb7808f] Mar 17 18:32:50.963546 kernel: ACPI: Reserving HPET table memory at [mem 0x9cb77000-0x9cb77037] Mar 17 18:32:50.963553 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cb76000-0x9cb7603b] Mar 17 18:32:50.963559 kernel: ACPI: Reserving WAET table memory at [mem 0x9cb75000-0x9cb75027] Mar 17 18:32:50.963566 kernel: ACPI: Reserving BGRT table memory at [mem 0x9cb74000-0x9cb74037] Mar 17 18:32:50.963574 kernel: No NUMA configuration found Mar 17 18:32:50.963580 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cf3ffff] Mar 17 18:32:50.963587 kernel: NODE_DATA(0) allocated [mem 0x9cea6000-0x9ceabfff] Mar 17 18:32:50.963594 kernel: Zone ranges: Mar 17 18:32:50.963600 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Mar 17 18:32:50.963607 kernel: DMA32 [mem 0x0000000001000000-0x000000009cf3ffff] Mar 17 18:32:50.963613 kernel: Normal empty Mar 17 18:32:50.963619 kernel: Movable zone start for each node Mar 17 18:32:50.963626 kernel: Early memory node ranges Mar 17 18:32:50.963633 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Mar 17 18:32:50.963640 kernel: node 0: [mem 0x0000000000100000-0x00000000007fffff] Mar 17 18:32:50.963646 kernel: node 0: [mem 0x0000000000808000-0x000000000080afff] Mar 17 18:32:50.963653 kernel: node 0: [mem 0x000000000080c000-0x000000000080ffff] Mar 17 18:32:50.963659 kernel: node 0: [mem 0x0000000000900000-0x000000009c8eefff] Mar 17 18:32:50.963666 kernel: node 0: [mem 0x000000009cbff000-0x000000009cf3ffff] Mar 17 18:32:50.963672 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cf3ffff] Mar 17 18:32:50.963679 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Mar 17 18:32:50.963685 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Mar 17 18:32:50.963691 kernel: On node 0, zone DMA: 8 pages in unavailable ranges Mar 17 18:32:50.963699 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Mar 17 18:32:50.963706 kernel: On node 0, zone DMA: 240 pages in unavailable ranges Mar 17 18:32:50.963712 kernel: On node 0, zone DMA32: 784 pages in unavailable ranges Mar 17 18:32:50.963719 kernel: On node 0, zone DMA32: 12480 pages in unavailable ranges Mar 17 18:32:50.963725 kernel: ACPI: PM-Timer IO Port: 0x608 Mar 17 18:32:50.963731 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) Mar 17 18:32:50.963738 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 Mar 17 18:32:50.963744 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) Mar 17 18:32:50.963751 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) Mar 17 18:32:50.963759 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Mar 17 18:32:50.963765 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) Mar 17 18:32:50.963772 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) Mar 17 18:32:50.963781 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Mar 17 18:32:50.963787 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 Mar 17 18:32:50.963794 kernel: TSC deadline timer available Mar 17 18:32:50.963800 kernel: smpboot: Allowing 4 CPUs, 0 hotplug CPUs Mar 17 18:32:50.963807 kernel: kvm-guest: KVM setup pv remote TLB flush Mar 17 18:32:50.963813 kernel: kvm-guest: setup PV sched yield Mar 17 18:32:50.963821 kernel: [mem 
0xc0000000-0xffffffff] available for PCI devices Mar 17 18:32:50.963827 kernel: Booting paravirtualized kernel on KVM Mar 17 18:32:50.963839 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Mar 17 18:32:50.963848 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:512 nr_cpu_ids:4 nr_node_ids:1 Mar 17 18:32:50.963855 kernel: percpu: Embedded 56 pages/cpu s188696 r8192 d32488 u524288 Mar 17 18:32:50.963862 kernel: pcpu-alloc: s188696 r8192 d32488 u524288 alloc=1*2097152 Mar 17 18:32:50.963869 kernel: pcpu-alloc: [0] 0 1 2 3 Mar 17 18:32:50.963886 kernel: kvm-guest: setup async PF for cpu 0 Mar 17 18:32:50.963893 kernel: kvm-guest: stealtime: cpu 0, msr 9b21c0c0 Mar 17 18:32:50.963900 kernel: kvm-guest: PV spinlocks enabled Mar 17 18:32:50.963907 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Mar 17 18:32:50.963914 kernel: Built 1 zonelists, mobility grouping on. Total pages: 629759 Mar 17 18:32:50.963922 kernel: Policy zone: DMA32 Mar 17 18:32:50.963930 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=249ccd113f901380672c0d31e18f792e8e0344094c0e39eedc449f039418b31a Mar 17 18:32:50.963938 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 17 18:32:50.963944 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 17 18:32:50.963953 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Mar 17 18:32:50.963960 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 18:32:50.963967 kernel: Memory: 2397432K/2567000K available (12294K kernel code, 2278K rwdata, 13724K rodata, 47472K init, 4108K bss, 169308K reserved, 0K cma-reserved) Mar 17 18:32:50.963974 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 Mar 17 18:32:50.963981 kernel: ftrace: allocating 34580 entries in 136 pages Mar 17 18:32:50.963987 kernel: ftrace: allocated 136 pages with 2 groups Mar 17 18:32:50.963994 kernel: rcu: Hierarchical RCU implementation. Mar 17 18:32:50.964002 kernel: rcu: RCU event tracing is enabled. Mar 17 18:32:50.964009 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. Mar 17 18:32:50.964017 kernel: Rude variant of Tasks RCU enabled. Mar 17 18:32:50.964024 kernel: Tracing variant of Tasks RCU enabled. Mar 17 18:32:50.964041 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Mar 17 18:32:50.964048 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 Mar 17 18:32:50.964055 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16 Mar 17 18:32:50.964062 kernel: Console: colour dummy device 80x25 Mar 17 18:32:50.964068 kernel: printk: console [ttyS0] enabled Mar 17 18:32:50.964075 kernel: ACPI: Core revision 20210730 Mar 17 18:32:50.964082 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns Mar 17 18:32:50.964092 kernel: APIC: Switch to symmetric I/O mode setup Mar 17 18:32:50.964098 kernel: x2apic enabled Mar 17 18:32:50.964105 kernel: Switched APIC routing to physical x2apic. 
Mar 17 18:32:50.964112 kernel: kvm-guest: setup PV IPIs Mar 17 18:32:50.964119 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 Mar 17 18:32:50.964126 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized Mar 17 18:32:50.964133 kernel: Calibrating delay loop (skipped) preset value.. 5589.49 BogoMIPS (lpj=2794748) Mar 17 18:32:50.964140 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated Mar 17 18:32:50.964147 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 Mar 17 18:32:50.964155 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 Mar 17 18:32:50.964162 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Mar 17 18:32:50.964169 kernel: Spectre V2 : Mitigation: Retpolines Mar 17 18:32:50.964175 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Mar 17 18:32:50.964182 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT Mar 17 18:32:50.964189 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls Mar 17 18:32:50.964196 kernel: RETBleed: Mitigation: untrained return thunk Mar 17 18:32:50.964205 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier Mar 17 18:32:50.964213 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl and seccomp Mar 17 18:32:50.964221 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Mar 17 18:32:50.964228 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Mar 17 18:32:50.964235 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Mar 17 18:32:50.964242 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Mar 17 18:32:50.964249 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'standard' format. Mar 17 18:32:50.964256 kernel: Freeing SMP alternatives memory: 32K Mar 17 18:32:50.964263 kernel: pid_max: default: 32768 minimum: 301 Mar 17 18:32:50.964269 kernel: LSM: Security Framework initializing Mar 17 18:32:50.964276 kernel: SELinux: Initializing. Mar 17 18:32:50.964284 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 18:32:50.964291 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Mar 17 18:32:50.964298 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0) Mar 17 18:32:50.964305 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. Mar 17 18:32:50.964312 kernel: ... version: 0 Mar 17 18:32:50.964318 kernel: ... bit width: 48 Mar 17 18:32:50.964333 kernel: ... generic registers: 6 Mar 17 18:32:50.964340 kernel: ... value mask: 0000ffffffffffff Mar 17 18:32:50.964346 kernel: ... max period: 00007fffffffffff Mar 17 18:32:50.964355 kernel: ... fixed-purpose events: 0 Mar 17 18:32:50.964362 kernel: ... event mask: 000000000000003f Mar 17 18:32:50.964369 kernel: signal: max sigframe size: 1776 Mar 17 18:32:50.964376 kernel: rcu: Hierarchical SRCU implementation. Mar 17 18:32:50.964383 kernel: smp: Bringing up secondary CPUs ... Mar 17 18:32:50.964390 kernel: x86: Booting SMP configuration: Mar 17 18:32:50.964396 kernel: .... 
node #0, CPUs: #1 Mar 17 18:32:50.964403 kernel: kvm-clock: cpu 1, msr 5919a041, secondary cpu clock Mar 17 18:32:50.964410 kernel: kvm-guest: setup async PF for cpu 1 Mar 17 18:32:50.964419 kernel: kvm-guest: stealtime: cpu 1, msr 9b29c0c0 Mar 17 18:32:50.964425 kernel: #2 Mar 17 18:32:50.964432 kernel: kvm-clock: cpu 2, msr 5919a081, secondary cpu clock Mar 17 18:32:50.964439 kernel: kvm-guest: setup async PF for cpu 2 Mar 17 18:32:50.964446 kernel: kvm-guest: stealtime: cpu 2, msr 9b31c0c0 Mar 17 18:32:50.964453 kernel: #3 Mar 17 18:32:50.964459 kernel: kvm-clock: cpu 3, msr 5919a0c1, secondary cpu clock Mar 17 18:32:50.964466 kernel: kvm-guest: setup async PF for cpu 3 Mar 17 18:32:50.964473 kernel: kvm-guest: stealtime: cpu 3, msr 9b39c0c0 Mar 17 18:32:50.964481 kernel: smp: Brought up 1 node, 4 CPUs Mar 17 18:32:50.964488 kernel: smpboot: Max logical packages: 1 Mar 17 18:32:50.964495 kernel: smpboot: Total of 4 processors activated (22357.98 BogoMIPS) Mar 17 18:32:50.964501 kernel: devtmpfs: initialized Mar 17 18:32:50.964508 kernel: x86/mm: Memory block size: 128MB Mar 17 18:32:50.964515 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00800000-0x00807fff] (32768 bytes) Mar 17 18:32:50.964522 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x0080b000-0x0080bfff] (4096 bytes) Mar 17 18:32:50.964529 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x00810000-0x008fffff] (983040 bytes) Mar 17 18:32:50.964536 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cb7f000-0x9cbfefff] (524288 bytes) Mar 17 18:32:50.964544 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x9cf60000-0x9cffffff] (655360 bytes) Mar 17 18:32:50.964551 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 18:32:50.964558 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) Mar 17 18:32:50.964565 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 18:32:50.964571 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 18:32:50.964578 kernel: audit: initializing netlink subsys (disabled) Mar 17 18:32:50.964585 kernel: audit: type=2000 audit(1742236370.374:1): state=initialized audit_enabled=0 res=1 Mar 17 18:32:50.964592 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 18:32:50.964599 kernel: thermal_sys: Registered thermal governor 'user_space' Mar 17 18:32:50.964607 kernel: cpuidle: using governor menu Mar 17 18:32:50.964614 kernel: ACPI: bus type PCI registered Mar 17 18:32:50.964621 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 18:32:50.964627 kernel: dca service started, version 1.12.1 Mar 17 18:32:50.964634 kernel: PCI: MMCONFIG for domain 0000 [bus 00-ff] at [mem 0xb0000000-0xbfffffff] (base 0xb0000000) Mar 17 18:32:50.964641 kernel: PCI: MMCONFIG at [mem 0xb0000000-0xbfffffff] reserved in E820 Mar 17 18:32:50.964648 kernel: PCI: Using configuration type 1 for base access Mar 17 18:32:50.964655 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Mar 17 18:32:50.964662 kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages Mar 17 18:32:50.964670 kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 18:32:50.964677 kernel: ACPI: Added _OSI(Module Device) Mar 17 18:32:50.964684 kernel: ACPI: Added _OSI(Processor Device) Mar 17 18:32:50.964691 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 18:32:50.964697 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 18:32:50.964704 kernel: ACPI: Added _OSI(Linux-Dell-Video) Mar 17 18:32:50.964711 kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio) Mar 17 18:32:50.964718 kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics) Mar 17 18:32:50.964725 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 18:32:50.964732 kernel: ACPI: Interpreter enabled Mar 17 18:32:50.964739 kernel: ACPI: PM: (supports S0 S3 S5) Mar 17 18:32:50.964746 kernel: ACPI: Using IOAPIC for interrupt routing Mar 17 18:32:50.964753 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Mar 17 18:32:50.964760 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F Mar 17 18:32:50.964767 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Mar 17 18:32:50.964918 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Mar 17 18:32:50.964994 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] Mar 17 18:32:50.965078 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] Mar 17 18:32:50.965088 kernel: PCI host bridge to bus 0000:00 Mar 17 18:32:50.965174 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] Mar 17 18:32:50.965242 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] Mar 17 18:32:50.965306 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] Mar 17 18:32:50.965380 kernel: pci_bus 0000:00: root bus resource [mem 0x9d000000-0xafffffff window] Mar 17 18:32:50.965444 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] Mar 17 18:32:50.965512 kernel: pci_bus 0000:00: root bus resource [mem 0x800000000-0xfffffffff window] Mar 17 18:32:50.965577 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Mar 17 18:32:50.965678 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 Mar 17 18:32:50.965782 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 Mar 17 18:32:50.965895 kernel: pci 0000:00:01.0: reg 0x10: [mem 0xc0000000-0xc0ffffff pref] Mar 17 18:32:50.965973 kernel: pci 0000:00:01.0: reg 0x18: [mem 0xc1044000-0xc1044fff] Mar 17 18:32:50.966049 kernel: pci 0000:00:01.0: reg 0x30: [mem 0xffff0000-0xffffffff pref] Mar 17 18:32:50.966122 kernel: pci 0000:00:01.0: BAR 0: assigned to efifb Mar 17 18:32:50.966196 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] Mar 17 18:32:50.966304 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 Mar 17 18:32:50.966392 kernel: pci 0000:00:02.0: reg 0x10: [io 0x6100-0x611f] Mar 17 18:32:50.966466 kernel: pci 0000:00:02.0: reg 0x14: [mem 0xc1043000-0xc1043fff] Mar 17 18:32:50.966539 kernel: pci 0000:00:02.0: reg 0x20: [mem 0x800000000-0x800003fff 64bit pref] Mar 17 18:32:50.966627 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 Mar 17 18:32:50.966700 kernel: pci 0000:00:03.0: reg 0x10: [io 0x6000-0x607f] Mar 17 18:32:50.966774 kernel: pci 0000:00:03.0: reg 0x14: [mem 0xc1042000-0xc1042fff] Mar 17 18:32:50.966845 kernel: pci 0000:00:03.0: reg 0x20: 
[mem 0x800004000-0x800007fff 64bit pref] Mar 17 18:32:50.966959 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 Mar 17 18:32:50.967034 kernel: pci 0000:00:04.0: reg 0x10: [io 0x60e0-0x60ff] Mar 17 18:32:50.967106 kernel: pci 0000:00:04.0: reg 0x14: [mem 0xc1041000-0xc1041fff] Mar 17 18:32:50.967182 kernel: pci 0000:00:04.0: reg 0x20: [mem 0x800008000-0x80000bfff 64bit pref] Mar 17 18:32:50.967254 kernel: pci 0000:00:04.0: reg 0x30: [mem 0xfffc0000-0xffffffff pref] Mar 17 18:32:50.967344 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 Mar 17 18:32:50.967418 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO Mar 17 18:32:50.967524 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 Mar 17 18:32:50.967607 kernel: pci 0000:00:1f.2: reg 0x20: [io 0x60c0-0x60df] Mar 17 18:32:50.967701 kernel: pci 0000:00:1f.2: reg 0x24: [mem 0xc1040000-0xc1040fff] Mar 17 18:32:50.967838 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 Mar 17 18:32:50.967972 kernel: pci 0000:00:1f.3: reg 0x20: [io 0x6080-0x60bf] Mar 17 18:32:50.967982 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 Mar 17 18:32:50.967990 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 Mar 17 18:32:50.968005 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 Mar 17 18:32:50.968012 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 Mar 17 18:32:50.968019 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 Mar 17 18:32:50.968037 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 Mar 17 18:32:50.968046 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 Mar 17 18:32:50.968065 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 Mar 17 18:32:50.968073 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 Mar 17 18:32:50.968086 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 Mar 17 18:32:50.968099 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 Mar 17 18:32:50.968112 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 Mar 17 18:32:50.968119 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 Mar 17 18:32:50.968126 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 Mar 17 18:32:50.968136 kernel: ACPI: PCI: Interrupt link GSIG configured for IRQ 22 Mar 17 18:32:50.968143 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 Mar 17 18:32:50.968152 kernel: iommu: Default domain type: Translated Mar 17 18:32:50.968163 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Mar 17 18:32:50.968332 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device Mar 17 18:32:50.968462 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none Mar 17 18:32:50.968572 kernel: pci 0000:00:01.0: vgaarb: bridge control possible Mar 17 18:32:50.968582 kernel: vgaarb: loaded Mar 17 18:32:50.968589 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 17 18:32:50.968602 kernel: pps_core: Software ver. 
5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 17 18:32:50.968612 kernel: PTP clock support registered Mar 17 18:32:50.968619 kernel: Registered efivars operations Mar 17 18:32:50.968626 kernel: PCI: Using ACPI for IRQ routing Mar 17 18:32:50.968633 kernel: PCI: pci_cache_line_size set to 64 bytes Mar 17 18:32:50.968639 kernel: e820: reserve RAM buffer [mem 0x0080b000-0x008fffff] Mar 17 18:32:50.968649 kernel: e820: reserve RAM buffer [mem 0x00810000-0x008fffff] Mar 17 18:32:50.968655 kernel: e820: reserve RAM buffer [mem 0x9b438018-0x9bffffff] Mar 17 18:32:50.968662 kernel: e820: reserve RAM buffer [mem 0x9b475018-0x9bffffff] Mar 17 18:32:50.968671 kernel: e820: reserve RAM buffer [mem 0x9c8ef000-0x9fffffff] Mar 17 18:32:50.968678 kernel: e820: reserve RAM buffer [mem 0x9cf40000-0x9fffffff] Mar 17 18:32:50.968685 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 Mar 17 18:32:50.968692 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter Mar 17 18:32:50.968699 kernel: clocksource: Switched to clocksource kvm-clock Mar 17 18:32:50.968706 kernel: VFS: Disk quotas dquot_6.6.0 Mar 17 18:32:50.968713 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 17 18:32:50.968719 kernel: pnp: PnP ACPI init Mar 17 18:32:50.968813 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved Mar 17 18:32:50.968826 kernel: pnp: PnP ACPI: found 6 devices Mar 17 18:32:50.968834 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 17 18:32:50.968841 kernel: NET: Registered PF_INET protocol family Mar 17 18:32:50.968847 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 17 18:32:50.968854 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Mar 17 18:32:50.968861 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 17 18:32:50.968868 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Mar 17 18:32:50.968899 kernel: TCP bind hash table entries: 32768 (order: 7, 524288 bytes, linear) Mar 17 18:32:50.968910 kernel: TCP: Hash tables configured (established 32768 bind 32768) Mar 17 18:32:50.968917 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 18:32:50.968924 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Mar 17 18:32:50.968931 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 17 18:32:50.968938 kernel: NET: Registered PF_XDP protocol family Mar 17 18:32:50.969017 kernel: pci 0000:00:04.0: can't claim BAR 6 [mem 0xfffc0000-0xffffffff pref]: no compatible bridge window Mar 17 18:32:50.969092 kernel: pci 0000:00:04.0: BAR 6: assigned [mem 0x9d000000-0x9d03ffff pref] Mar 17 18:32:50.969159 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] Mar 17 18:32:50.969227 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] Mar 17 18:32:50.969292 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] Mar 17 18:32:50.969382 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window] Mar 17 18:32:50.969473 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] Mar 17 18:32:50.969553 kernel: pci_bus 0000:00: resource 9 [mem 0x800000000-0xfffffffff window] Mar 17 18:32:50.969564 kernel: PCI: CLS 0 bytes, default 64 Mar 17 18:32:50.969571 kernel: Initialise system trusted keyrings Mar 17 18:32:50.969578 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Mar 17 18:32:50.969588 
kernel: Key type asymmetric registered Mar 17 18:32:50.969595 kernel: Asymmetric key parser 'x509' registered Mar 17 18:32:50.969603 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Mar 17 18:32:50.969619 kernel: io scheduler mq-deadline registered Mar 17 18:32:50.969628 kernel: io scheduler kyber registered Mar 17 18:32:50.969635 kernel: io scheduler bfq registered Mar 17 18:32:50.969642 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 17 18:32:50.969650 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 Mar 17 18:32:50.969657 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 Mar 17 18:32:50.969665 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 Mar 17 18:32:50.969673 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 18:32:50.969681 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 17 18:32:50.969688 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 Mar 17 18:32:50.969695 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 Mar 17 18:32:50.969702 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 Mar 17 18:32:50.969832 kernel: rtc_cmos 00:04: RTC can wake from S4 Mar 17 18:32:50.969844 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input0 Mar 17 18:32:50.969926 kernel: rtc_cmos 00:04: registered as rtc0 Mar 17 18:32:50.969999 kernel: rtc_cmos 00:04: setting system clock to 2025-03-17T18:32:50 UTC (1742236370) Mar 17 18:32:50.970066 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs Mar 17 18:32:50.970075 kernel: efifb: probing for efifb Mar 17 18:32:50.970083 kernel: efifb: framebuffer at 0xc0000000, using 4000k, total 4000k Mar 17 18:32:50.970090 kernel: efifb: mode is 1280x800x32, linelength=5120, pages=1 Mar 17 18:32:50.970097 kernel: efifb: scrolling: redraw Mar 17 18:32:50.970104 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 17 18:32:50.970112 kernel: Console: switching to colour frame buffer device 160x50 Mar 17 18:32:50.970121 kernel: fb0: EFI VGA frame buffer device Mar 17 18:32:50.970128 kernel: pstore: Registered efi as persistent store backend Mar 17 18:32:50.970135 kernel: NET: Registered PF_INET6 protocol family Mar 17 18:32:50.970143 kernel: Segment Routing with IPv6 Mar 17 18:32:50.970151 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 18:32:50.970170 kernel: NET: Registered PF_PACKET protocol family Mar 17 18:32:50.970187 kernel: Key type dns_resolver registered Mar 17 18:32:50.970201 kernel: IPI shorthand broadcast: enabled Mar 17 18:32:50.970208 kernel: sched_clock: Marking stable (468048213, 127242436)->(683673071, -88382422) Mar 17 18:32:50.970215 kernel: registered taskstats version 1 Mar 17 18:32:50.970223 kernel: Loading compiled-in X.509 certificates Mar 17 18:32:50.970230 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 5.15.179-flatcar: d5b956bbabb2d386c0246a969032c0de9eaa8220' Mar 17 18:32:50.970240 kernel: Key type .fscrypt registered Mar 17 18:32:50.970247 kernel: Key type fscrypt-provisioning registered Mar 17 18:32:50.970255 kernel: pstore: Using crash dump compression: deflate Mar 17 18:32:50.970264 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 17 18:32:50.970272 kernel: ima: Allocated hash algorithm: sha1 Mar 17 18:32:50.970279 kernel: ima: No architecture policies found Mar 17 18:32:50.970291 kernel: clk: Disabling unused clocks Mar 17 18:32:50.970298 kernel: Freeing unused kernel image (initmem) memory: 47472K Mar 17 18:32:50.970310 kernel: Write protecting the kernel read-only data: 28672k Mar 17 18:32:50.970318 kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K Mar 17 18:32:50.970333 kernel: Freeing unused kernel image (rodata/data gap) memory: 612K Mar 17 18:32:50.970340 kernel: Run /init as init process Mar 17 18:32:50.970350 kernel: with arguments: Mar 17 18:32:50.970357 kernel: /init Mar 17 18:32:50.970365 kernel: with environment: Mar 17 18:32:50.970379 kernel: HOME=/ Mar 17 18:32:50.970396 kernel: TERM=linux Mar 17 18:32:50.970411 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 18:32:50.970424 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Mar 17 18:32:50.970446 systemd[1]: Detected virtualization kvm. Mar 17 18:32:50.970469 systemd[1]: Detected architecture x86-64. Mar 17 18:32:50.970483 systemd[1]: Running in initrd. Mar 17 18:32:50.970491 systemd[1]: No hostname configured, using default hostname. Mar 17 18:32:50.970499 systemd[1]: Hostname set to <localhost>. Mar 17 18:32:50.970507 systemd[1]: Initializing machine ID from VM UUID. Mar 17 18:32:50.970514 systemd[1]: Queued start job for default target initrd.target. Mar 17 18:32:50.970522 systemd[1]: Started systemd-ask-password-console.path. Mar 17 18:32:50.970530 systemd[1]: Reached target cryptsetup.target. Mar 17 18:32:50.970539 systemd[1]: Reached target paths.target. Mar 17 18:32:50.970547 systemd[1]: Reached target slices.target. Mar 17 18:32:50.970555 systemd[1]: Reached target swap.target. Mar 17 18:32:50.970569 systemd[1]: Reached target timers.target. Mar 17 18:32:50.970583 systemd[1]: Listening on iscsid.socket. Mar 17 18:32:50.970598 systemd[1]: Listening on iscsiuio.socket. Mar 17 18:32:50.970613 systemd[1]: Listening on systemd-journald-audit.socket. Mar 17 18:32:50.970621 systemd[1]: Listening on systemd-journald-dev-log.socket. Mar 17 18:32:50.970631 systemd[1]: Listening on systemd-journald.socket. Mar 17 18:32:50.970638 systemd[1]: Listening on systemd-networkd.socket. Mar 17 18:32:50.970646 systemd[1]: Listening on systemd-udevd-control.socket. Mar 17 18:32:50.970654 systemd[1]: Listening on systemd-udevd-kernel.socket. Mar 17 18:32:50.970662 systemd[1]: Reached target sockets.target. Mar 17 18:32:50.970670 systemd[1]: Starting kmod-static-nodes.service... Mar 17 18:32:50.970678 systemd[1]: Finished network-cleanup.service. Mar 17 18:32:50.970685 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 18:32:50.970694 systemd[1]: Starting systemd-journald.service... Mar 17 18:32:50.970703 systemd[1]: Starting systemd-modules-load.service... Mar 17 18:32:50.970711 systemd[1]: Starting systemd-resolved.service... Mar 17 18:32:50.970719 systemd[1]: Starting systemd-vconsole-setup.service... Mar 17 18:32:50.970727 systemd[1]: Finished kmod-static-nodes.service. Mar 17 18:32:50.970735 systemd[1]: Finished systemd-fsck-usr.service.
Mar 17 18:32:50.970743 kernel: audit: type=1130 audit(1742236370.962:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:50.970751 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Mar 17 18:32:50.970761 systemd-journald[198]: Journal started Mar 17 18:32:50.970807 systemd-journald[198]: Runtime Journal (/run/log/journal/e128e9245a94407ba792ae40b92fda25) is 6.0M, max 48.4M, 42.4M free. Mar 17 18:32:50.962000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:50.968077 systemd-modules-load[199]: Inserted module 'overlay' Mar 17 18:32:50.974184 systemd[1]: Started systemd-journald.service. Mar 17 18:32:50.973000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:50.977892 kernel: audit: type=1130 audit(1742236370.973:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:50.977925 systemd[1]: Finished systemd-vconsole-setup.service. Mar 17 18:32:50.978000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:50.982902 kernel: audit: type=1130 audit(1742236370.978:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:50.983024 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Mar 17 18:32:50.982000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:50.984381 systemd[1]: Starting dracut-cmdline-ask.service... Mar 17 18:32:50.989529 kernel: audit: type=1130 audit(1742236370.982:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:50.998982 systemd[1]: Finished dracut-cmdline-ask.service. Mar 17 18:32:50.999432 systemd-resolved[200]: Positive Trust Anchors: Mar 17 18:32:50.999441 systemd-resolved[200]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 18:32:51.005030 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. 
Mar 17 18:32:50.999468 systemd-resolved[200]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Mar 17 18:32:51.014531 kernel: audit: type=1130 audit(1742236371.004:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.014548 kernel: Bridge firewalling registered Mar 17 18:32:51.004000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.014000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.002149 systemd-resolved[200]: Defaulting to hostname 'linux'. Mar 17 18:32:51.019780 kernel: audit: type=1130 audit(1742236371.014:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.005127 systemd[1]: Started systemd-resolved.service. Mar 17 18:32:51.014540 systemd-modules-load[199]: Inserted module 'br_netfilter' Mar 17 18:32:51.014586 systemd[1]: Reached target nss-lookup.target. Mar 17 18:32:51.018950 systemd[1]: Starting dracut-cmdline.service... Mar 17 18:32:51.027747 dracut-cmdline[217]: dracut-dracut-053 Mar 17 18:32:51.029319 dracut-cmdline[217]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=249ccd113f901380672c0d31e18f792e8e0344094c0e39eedc449f039418b31a Mar 17 18:32:51.037900 kernel: SCSI subsystem initialized Mar 17 18:32:51.048904 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 17 18:32:51.048929 kernel: device-mapper: uevent: version 1.0.3 Mar 17 18:32:51.050906 kernel: device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com Mar 17 18:32:51.053687 systemd-modules-load[199]: Inserted module 'dm_multipath' Mar 17 18:32:51.054455 systemd[1]: Finished systemd-modules-load.service. Mar 17 18:32:51.060026 kernel: audit: type=1130 audit(1742236371.054:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.059291 systemd[1]: Starting systemd-sysctl.service... Mar 17 18:32:51.066655 systemd[1]: Finished systemd-sysctl.service. 
Mar 17 18:32:51.070962 kernel: audit: type=1130 audit(1742236371.066:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.066000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.087902 kernel: Loading iSCSI transport class v2.0-870. Mar 17 18:32:51.104921 kernel: iscsi: registered transport (tcp) Mar 17 18:32:51.128915 kernel: iscsi: registered transport (qla4xxx) Mar 17 18:32:51.128947 kernel: QLogic iSCSI HBA Driver Mar 17 18:32:51.160100 systemd[1]: Finished dracut-cmdline.service. Mar 17 18:32:51.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.162631 systemd[1]: Starting dracut-pre-udev.service... Mar 17 18:32:51.166077 kernel: audit: type=1130 audit(1742236371.161:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.207911 kernel: raid6: avx2x4 gen() 29274 MB/s Mar 17 18:32:51.224901 kernel: raid6: avx2x4 xor() 6977 MB/s Mar 17 18:32:51.241905 kernel: raid6: avx2x2 gen() 29380 MB/s Mar 17 18:32:51.258902 kernel: raid6: avx2x2 xor() 18650 MB/s Mar 17 18:32:51.275924 kernel: raid6: avx2x1 gen() 25204 MB/s Mar 17 18:32:51.292905 kernel: raid6: avx2x1 xor() 14726 MB/s Mar 17 18:32:51.309907 kernel: raid6: sse2x4 gen() 13794 MB/s Mar 17 18:32:51.326902 kernel: raid6: sse2x4 xor() 7271 MB/s Mar 17 18:32:51.343900 kernel: raid6: sse2x2 gen() 16242 MB/s Mar 17 18:32:51.360899 kernel: raid6: sse2x2 xor() 9813 MB/s Mar 17 18:32:51.377915 kernel: raid6: sse2x1 gen() 11898 MB/s Mar 17 18:32:51.395433 kernel: raid6: sse2x1 xor() 7609 MB/s Mar 17 18:32:51.395500 kernel: raid6: using algorithm avx2x2 gen() 29380 MB/s Mar 17 18:32:51.395510 kernel: raid6: .... xor() 18650 MB/s, rmw enabled Mar 17 18:32:51.396142 kernel: raid6: using avx2x2 recovery algorithm Mar 17 18:32:51.408919 kernel: xor: automatically using best checksumming function avx Mar 17 18:32:51.506919 kernel: Btrfs loaded, crc32c=crc32c-intel, zoned=no, fsverity=no Mar 17 18:32:51.515617 systemd[1]: Finished dracut-pre-udev.service. Mar 17 18:32:51.516000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.516000 audit: BPF prog-id=7 op=LOAD Mar 17 18:32:51.516000 audit: BPF prog-id=8 op=LOAD Mar 17 18:32:51.518000 systemd[1]: Starting systemd-udevd.service... Mar 17 18:32:51.533572 systemd-udevd[401]: Using default interface naming scheme 'v252'. Mar 17 18:32:51.538183 systemd[1]: Started systemd-udevd.service. Mar 17 18:32:51.538000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.541223 systemd[1]: Starting dracut-pre-trigger.service... Mar 17 18:32:51.552321 dracut-pre-trigger[409]: rd.md=0: removing MD RAID activation Mar 17 18:32:51.576768 systemd[1]: Finished dracut-pre-trigger.service. 
Mar 17 18:32:51.577000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.579220 systemd[1]: Starting systemd-udev-trigger.service... Mar 17 18:32:51.614617 systemd[1]: Finished systemd-udev-trigger.service. Mar 17 18:32:51.618000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:51.649128 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) Mar 17 18:32:51.655072 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Mar 17 18:32:51.655087 kernel: GPT:9289727 != 19775487 Mar 17 18:32:51.655096 kernel: GPT:Alternate GPT header not at the end of the disk. Mar 17 18:32:51.655104 kernel: GPT:9289727 != 19775487 Mar 17 18:32:51.655113 kernel: GPT: Use GNU Parted to correct GPT errors. Mar 17 18:32:51.655121 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 18:32:51.657899 kernel: cryptd: max_cpu_qlen set to 1000 Mar 17 18:32:51.667902 kernel: libata version 3.00 loaded. Mar 17 18:32:51.671358 kernel: AVX2 version of gcm_enc/dec engaged. Mar 17 18:32:51.671384 kernel: AES CTR mode by8 optimization enabled Mar 17 18:32:51.677208 kernel: ahci 0000:00:1f.2: version 3.0 Mar 17 18:32:51.704292 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 Mar 17 18:32:51.704320 kernel: ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode Mar 17 18:32:51.704416 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only Mar 17 18:32:51.704492 kernel: scsi host0: ahci Mar 17 18:32:51.704592 kernel: scsi host1: ahci Mar 17 18:32:51.704683 kernel: scsi host2: ahci Mar 17 18:32:51.704780 kernel: scsi host3: ahci Mar 17 18:32:51.704864 kernel: scsi host4: ahci Mar 17 18:32:51.704999 kernel: scsi host5: ahci Mar 17 18:32:51.705084 kernel: ata1: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040100 irq 34 Mar 17 18:32:51.705094 kernel: ata2: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040180 irq 34 Mar 17 18:32:51.705106 kernel: ata3: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040200 irq 34 Mar 17 18:32:51.705116 kernel: ata4: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040280 irq 34 Mar 17 18:32:51.705125 kernel: ata5: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040300 irq 34 Mar 17 18:32:51.705134 kernel: ata6: SATA max UDMA/133 abar m4096@0xc1040000 port 0xc1040380 irq 34 Mar 17 18:32:51.694255 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device. Mar 17 18:32:51.706350 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device. Mar 17 18:32:51.711348 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/vda6 scanned by (udev-worker) (456) Mar 17 18:32:51.714376 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device. Mar 17 18:32:51.722538 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. Mar 17 18:32:51.725740 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Mar 17 18:32:51.726913 systemd[1]: Starting disk-uuid.service... Mar 17 18:32:51.737365 disk-uuid[526]: Primary Header is updated. Mar 17 18:32:51.737365 disk-uuid[526]: Secondary Entries is updated. Mar 17 18:32:51.737365 disk-uuid[526]: Secondary Header is updated. 
Mar 17 18:32:51.742930 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 18:32:51.746916 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 18:32:52.017866 kernel: ata5: SATA link down (SStatus 0 SControl 300) Mar 17 18:32:52.017946 kernel: ata2: SATA link down (SStatus 0 SControl 300) Mar 17 18:32:52.019912 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) Mar 17 18:32:52.019996 kernel: ata4: SATA link down (SStatus 0 SControl 300) Mar 17 18:32:52.020903 kernel: ata1: SATA link down (SStatus 0 SControl 300) Mar 17 18:32:52.021919 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 Mar 17 18:32:52.023089 kernel: ata3.00: applying bridge limits Mar 17 18:32:52.023908 kernel: ata6: SATA link down (SStatus 0 SControl 300) Mar 17 18:32:52.024908 kernel: ata3.00: configured for UDMA/100 Mar 17 18:32:52.025905 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 Mar 17 18:32:52.058446 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray Mar 17 18:32:52.075672 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 17 18:32:52.075693 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 Mar 17 18:32:52.755908 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Mar 17 18:32:52.755990 disk-uuid[527]: The operation has completed successfully. Mar 17 18:32:52.782563 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 17 18:32:52.783000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:52.783000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:52.782654 systemd[1]: Finished disk-uuid.service. Mar 17 18:32:52.786071 systemd[1]: Starting verity-setup.service... Mar 17 18:32:52.800909 kernel: device-mapper: verity: sha256 using implementation "sha256-ni" Mar 17 18:32:52.824060 systemd[1]: Found device dev-mapper-usr.device. Mar 17 18:32:52.825657 systemd[1]: Mounting sysusr-usr.mount... Mar 17 18:32:52.829000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:52.828049 systemd[1]: Finished verity-setup.service. Mar 17 18:32:52.927912 kernel: EXT4-fs (dm-0): mounted filesystem without journal. Opts: norecovery. Quota mode: none. Mar 17 18:32:52.928507 systemd[1]: Mounted sysusr-usr.mount. Mar 17 18:32:52.929555 systemd[1]: afterburn-network-kargs.service was skipped because no trigger condition checks were met. Mar 17 18:32:52.930327 systemd[1]: Starting ignition-setup.service... Mar 17 18:32:52.933068 systemd[1]: Starting parse-ip-for-networkd.service... Mar 17 18:32:52.943656 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 18:32:52.943684 kernel: BTRFS info (device vda6): using free space tree Mar 17 18:32:52.943698 kernel: BTRFS info (device vda6): has skinny extents Mar 17 18:32:52.953540 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 17 18:32:52.962113 systemd[1]: Finished ignition-setup.service. Mar 17 18:32:52.961000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:32:52.963197 systemd[1]: Starting ignition-fetch-offline.service... Mar 17 18:32:53.002827 systemd[1]: Finished parse-ip-for-networkd.service. Mar 17 18:32:53.003000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.003000 audit: BPF prog-id=9 op=LOAD Mar 17 18:32:53.005436 systemd[1]: Starting systemd-networkd.service... Mar 17 18:32:53.051142 systemd-networkd[718]: lo: Link UP Mar 17 18:32:53.051151 systemd-networkd[718]: lo: Gained carrier Mar 17 18:32:53.053000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.051755 systemd-networkd[718]: Enumeration completed Mar 17 18:32:53.051868 systemd[1]: Started systemd-networkd.service. Mar 17 18:32:53.052170 systemd-networkd[718]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 18:32:53.053528 systemd[1]: Reached target network.target. Mar 17 18:32:53.054299 systemd-networkd[718]: eth0: Link UP Mar 17 18:32:53.054302 systemd-networkd[718]: eth0: Gained carrier Mar 17 18:32:53.057023 systemd[1]: Starting iscsiuio.service... Mar 17 18:32:53.065564 ignition[649]: Ignition 2.14.0 Mar 17 18:32:53.065574 ignition[649]: Stage: fetch-offline Mar 17 18:32:53.065652 ignition[649]: no configs at "/usr/lib/ignition/base.d" Mar 17 18:32:53.065665 ignition[649]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 18:32:53.065806 ignition[649]: parsed url from cmdline: "" Mar 17 18:32:53.065811 ignition[649]: no config URL provided Mar 17 18:32:53.065817 ignition[649]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 18:32:53.065825 ignition[649]: no config at "/usr/lib/ignition/user.ign" Mar 17 18:32:53.065846 ignition[649]: op(1): [started] loading QEMU firmware config module Mar 17 18:32:53.065850 ignition[649]: op(1): executing: "modprobe" "qemu_fw_cfg" Mar 17 18:32:53.072141 ignition[649]: op(1): [finished] loading QEMU firmware config module Mar 17 18:32:53.080925 systemd[1]: Started iscsiuio.service. Mar 17 18:32:53.081000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.083520 systemd[1]: Starting iscsid.service... Mar 17 18:32:53.088756 iscsid[725]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Mar 17 18:32:53.088756 iscsid[725]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.<reversed domain name>[:identifier]. Mar 17 18:32:53.088756 iscsid[725]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Mar 17 18:32:53.088756 iscsid[725]: If using hardware iscsi like qla4xxx this message can be ignored.
Mar 17 18:32:53.088756 iscsid[725]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Mar 17 18:32:53.088756 iscsid[725]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Mar 17 18:32:53.097000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.095909 systemd[1]: Started iscsid.service. Mar 17 18:32:53.102921 systemd[1]: Starting dracut-initqueue.service... Mar 17 18:32:53.138368 ignition[649]: parsing config with SHA512: 1ba050266b2a76f0130be328cf1e72ad3d9c3081f39d4dd96df11458c6025d71fb0c3eadf76edb35469607e08dd3333e4ee61088bb5ec7d78e08649b4f6e73f3 Mar 17 18:32:53.139409 systemd[1]: Finished dracut-initqueue.service. Mar 17 18:32:53.140000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.141648 systemd[1]: Reached target remote-fs-pre.target. Mar 17 18:32:53.143482 systemd[1]: Reached target remote-cryptsetup.target. Mar 17 18:32:53.144978 systemd-networkd[718]: eth0: DHCPv4 address 10.0.0.12/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 17 18:32:53.147237 systemd[1]: Reached target remote-fs.target. Mar 17 18:32:53.151294 systemd[1]: Starting dracut-pre-mount.service... Mar 17 18:32:53.156278 unknown[649]: fetched base config from "system" Mar 17 18:32:53.156291 unknown[649]: fetched user config from "qemu" Mar 17 18:32:53.159269 ignition[649]: fetch-offline: fetch-offline passed Mar 17 18:32:53.159389 ignition[649]: Ignition finished successfully Mar 17 18:32:53.160000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.160000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.161030 systemd[1]: Finished ignition-fetch-offline.service. Mar 17 18:32:53.161577 systemd[1]: Finished dracut-pre-mount.service. Mar 17 18:32:53.161829 systemd[1]: ignition-fetch.service was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). Mar 17 18:32:53.162432 systemd[1]: Starting ignition-kargs.service... Mar 17 18:32:53.174364 ignition[739]: Ignition 2.14.0 Mar 17 18:32:53.174377 ignition[739]: Stage: kargs Mar 17 18:32:53.174522 ignition[739]: no configs at "/usr/lib/ignition/base.d" Mar 17 18:32:53.174536 ignition[739]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 18:32:53.177379 systemd[1]: Finished ignition-kargs.service. Mar 17 18:32:53.178000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.176338 ignition[739]: kargs: kargs passed Mar 17 18:32:53.179745 systemd[1]: Starting ignition-disks.service... 
Mar 17 18:32:53.176389 ignition[739]: Ignition finished successfully Mar 17 18:32:53.192984 ignition[745]: Ignition 2.14.0 Mar 17 18:32:53.192992 ignition[745]: Stage: disks Mar 17 18:32:53.193081 ignition[745]: no configs at "/usr/lib/ignition/base.d" Mar 17 18:32:53.194735 systemd[1]: Finished ignition-disks.service. Mar 17 18:32:53.195000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.193090 ignition[745]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 18:32:53.196313 systemd[1]: Reached target initrd-root-device.target. Mar 17 18:32:53.194104 ignition[745]: disks: disks passed Mar 17 18:32:53.197806 systemd[1]: Reached target local-fs-pre.target. Mar 17 18:32:53.194139 ignition[745]: Ignition finished successfully Mar 17 18:32:53.198612 systemd[1]: Reached target local-fs.target. Mar 17 18:32:53.199385 systemd[1]: Reached target sysinit.target. Mar 17 18:32:53.201095 systemd[1]: Reached target basic.target. Mar 17 18:32:53.202555 systemd[1]: Starting systemd-fsck-root.service... Mar 17 18:32:53.216076 systemd-fsck[753]: ROOT: clean, 623/553520 files, 56022/553472 blocks Mar 17 18:32:53.221583 systemd[1]: Finished systemd-fsck-root.service. Mar 17 18:32:53.221000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.223439 systemd[1]: Mounting sysroot.mount... Mar 17 18:32:53.229900 kernel: EXT4-fs (vda9): mounted filesystem with ordered data mode. Opts: (null). Quota mode: none. Mar 17 18:32:53.230171 systemd[1]: Mounted sysroot.mount. Mar 17 18:32:53.230929 systemd[1]: Reached target initrd-root-fs.target. Mar 17 18:32:53.233417 systemd[1]: Mounting sysroot-usr.mount... Mar 17 18:32:53.234261 systemd[1]: flatcar-metadata-hostname.service was skipped because no trigger condition checks were met. Mar 17 18:32:53.234303 systemd[1]: ignition-remount-sysroot.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 17 18:32:53.234331 systemd[1]: Reached target ignition-diskful.target. Mar 17 18:32:53.236326 systemd[1]: Mounted sysroot-usr.mount. Mar 17 18:32:53.238274 systemd[1]: Starting initrd-setup-root.service... Mar 17 18:32:53.243460 initrd-setup-root[763]: cut: /sysroot/etc/passwd: No such file or directory Mar 17 18:32:53.245471 initrd-setup-root[771]: cut: /sysroot/etc/group: No such file or directory Mar 17 18:32:53.250221 initrd-setup-root[779]: cut: /sysroot/etc/shadow: No such file or directory Mar 17 18:32:53.253330 initrd-setup-root[787]: cut: /sysroot/etc/gshadow: No such file or directory Mar 17 18:32:53.275832 systemd[1]: Finished initrd-setup-root.service. Mar 17 18:32:53.275000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.277664 systemd[1]: Starting ignition-mount.service... Mar 17 18:32:53.279112 systemd[1]: Starting sysroot-boot.service... Mar 17 18:32:53.282131 bash[804]: umount: /sysroot/usr/share/oem: not mounted. 
Mar 17 18:32:53.296304 ignition[805]: INFO : Ignition 2.14.0 Mar 17 18:32:53.296000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.296367 systemd[1]: Finished sysroot-boot.service. Mar 17 18:32:53.299898 ignition[805]: INFO : Stage: mount Mar 17 18:32:53.299898 ignition[805]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 18:32:53.299898 ignition[805]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 18:32:53.302619 ignition[805]: INFO : mount: mount passed Mar 17 18:32:53.302619 ignition[805]: INFO : Ignition finished successfully Mar 17 18:32:53.304667 systemd[1]: Finished ignition-mount.service. Mar 17 18:32:53.304000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:53.836679 systemd[1]: Mounting sysroot-usr-share-oem.mount... Mar 17 18:32:53.843733 kernel: BTRFS: device label OEM devid 1 transid 13 /dev/vda6 scanned by mount (815) Mar 17 18:32:53.846722 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 18:32:53.846746 kernel: BTRFS info (device vda6): using free space tree Mar 17 18:32:53.846758 kernel: BTRFS info (device vda6): has skinny extents Mar 17 18:32:53.850506 systemd[1]: Mounted sysroot-usr-share-oem.mount. Mar 17 18:32:53.852116 systemd[1]: Starting ignition-files.service... Mar 17 18:32:53.865920 ignition[835]: INFO : Ignition 2.14.0 Mar 17 18:32:53.865920 ignition[835]: INFO : Stage: files Mar 17 18:32:53.867855 ignition[835]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 18:32:53.867855 ignition[835]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 18:32:53.867855 ignition[835]: DEBUG : files: compiled without relabeling support, skipping Mar 17 18:32:53.871913 ignition[835]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 17 18:32:53.871913 ignition[835]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 17 18:32:53.871913 ignition[835]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 17 18:32:53.871913 ignition[835]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 17 18:32:53.871913 ignition[835]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 17 18:32:53.871913 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1" Mar 17 18:32:53.871913 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1" Mar 17 18:32:53.871913 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 17 18:32:53.871913 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Mar 17 18:32:53.871080 unknown[835]: wrote ssh authorized keys file for user: core Mar 17 18:32:54.025258 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK Mar 17 18:32:54.229438 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file 
"/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 17 18:32:54.229438 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh" Mar 17 18:32:54.233404 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh" Mar 17 18:32:54.233404 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 17 18:32:54.237109 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 17 18:32:54.262280 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 18:32:54.264201 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 18:32:54.265948 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 18:32:54.267626 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 18:32:54.269394 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 18:32:54.271105 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 18:32:54.272785 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 18:32:54.275151 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 18:32:54.277521 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 18:32:54.279575 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Mar 17 18:32:54.546092 systemd-networkd[718]: eth0: Gained IPv6LL Mar 17 18:32:54.764239 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(b): GET result: OK Mar 17 18:32:55.356483 ignition[835]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 18:32:55.356483 ignition[835]: INFO : files: op(c): [started] processing unit "containerd.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(c): op(d): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(c): op(d): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(c): [finished] processing unit "containerd.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(e): [started] processing unit "prepare-helm.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: 
op(e): op(f): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(e): op(f): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(e): [finished] processing unit "prepare-helm.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(10): [started] processing unit "coreos-metadata.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(10): op(11): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(10): op(11): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(10): [finished] processing unit "coreos-metadata.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(12): [started] setting preset to enabled for "prepare-helm.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(12): [finished] setting preset to enabled for "prepare-helm.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(13): [started] setting preset to disabled for "coreos-metadata.service" Mar 17 18:32:55.360895 ignition[835]: INFO : files: op(13): op(14): [started] removing enablement symlink(s) for "coreos-metadata.service" Mar 17 18:32:55.391494 ignition[835]: INFO : files: op(13): op(14): [finished] removing enablement symlink(s) for "coreos-metadata.service" Mar 17 18:32:55.393257 ignition[835]: INFO : files: op(13): [finished] setting preset to disabled for "coreos-metadata.service" Mar 17 18:32:55.393257 ignition[835]: INFO : files: createResultFile: createFiles: op(15): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 17 18:32:55.393257 ignition[835]: INFO : files: createResultFile: createFiles: op(15): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 17 18:32:55.393257 ignition[835]: INFO : files: files passed Mar 17 18:32:55.393257 ignition[835]: INFO : Ignition finished successfully Mar 17 18:32:55.417462 kernel: kauditd_printk_skb: 24 callbacks suppressed Mar 17 18:32:55.417486 kernel: audit: type=1130 audit(1742236375.393:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.417503 kernel: audit: type=1130 audit(1742236375.404:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.417513 kernel: audit: type=1130 audit(1742236375.409:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.417523 kernel: audit: type=1131 audit(1742236375.409:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.393000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:32:55.404000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.409000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.409000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.393031 systemd[1]: Finished ignition-files.service. Mar 17 18:32:55.395332 systemd[1]: Starting initrd-setup-root-after-ignition.service... Mar 17 18:32:55.400300 systemd[1]: torcx-profile-populate.service was skipped because of an unmet condition check (ConditionPathExists=/sysroot/etc/torcx/next-profile). Mar 17 18:32:55.422292 initrd-setup-root-after-ignition[858]: grep: /sysroot/usr/share/oem/oem-release: No such file or directory Mar 17 18:32:55.400888 systemd[1]: Starting ignition-quench.service... Mar 17 18:32:55.424662 initrd-setup-root-after-ignition[860]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 18:32:55.402919 systemd[1]: Finished initrd-setup-root-after-ignition.service. Mar 17 18:32:55.405573 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 17 18:32:55.405637 systemd[1]: Finished ignition-quench.service. Mar 17 18:32:55.410057 systemd[1]: Reached target ignition-complete.target. Mar 17 18:32:55.418190 systemd[1]: Starting initrd-parse-etc.service... Mar 17 18:32:55.432279 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 17 18:32:55.432365 systemd[1]: Finished initrd-parse-etc.service. Mar 17 18:32:55.441386 kernel: audit: type=1130 audit(1742236375.433:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.441401 kernel: audit: type=1131 audit(1742236375.433:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.433000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.433000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.434259 systemd[1]: Reached target initrd-fs.target. Mar 17 18:32:55.441384 systemd[1]: Reached target initrd.target. Mar 17 18:32:55.442211 systemd[1]: dracut-mount.service was skipped because no trigger condition checks were met. Mar 17 18:32:55.443092 systemd[1]: Starting dracut-pre-pivot.service... Mar 17 18:32:55.454184 systemd[1]: Finished dracut-pre-pivot.service. Mar 17 18:32:55.459392 kernel: audit: type=1130 audit(1742236375.454:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:32:55.454000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.455720 systemd[1]: Starting initrd-cleanup.service... Mar 17 18:32:55.465067 systemd[1]: Stopped target nss-lookup.target. Mar 17 18:32:55.500568 kernel: audit: type=1131 audit(1742236375.465:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.500601 kernel: audit: type=1131 audit(1742236375.470:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.500627 kernel: audit: type=1131 audit(1742236375.474:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.465000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.470000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.474000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.477000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.482000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.482000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.486000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.488000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.490000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.493000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.493000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:32:55.495000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.501119 iscsid[725]: iscsid shutting down. Mar 17 18:32:55.465593 systemd[1]: Stopped target remote-cryptsetup.target. Mar 17 18:32:55.465760 systemd[1]: Stopped target timers.target. Mar 17 18:32:55.465931 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 17 18:32:55.466021 systemd[1]: Stopped dracut-pre-pivot.service. Mar 17 18:32:55.466360 systemd[1]: Stopped target initrd.target. Mar 17 18:32:55.469586 systemd[1]: Stopped target basic.target. Mar 17 18:32:55.469746 systemd[1]: Stopped target ignition-complete.target. Mar 17 18:32:55.469925 systemd[1]: Stopped target ignition-diskful.target. Mar 17 18:32:55.470082 systemd[1]: Stopped target initrd-root-device.target. Mar 17 18:32:55.470263 systemd[1]: Stopped target remote-fs.target. Mar 17 18:32:55.470443 systemd[1]: Stopped target remote-fs-pre.target. Mar 17 18:32:55.470620 systemd[1]: Stopped target sysinit.target. Mar 17 18:32:55.470789 systemd[1]: Stopped target local-fs.target. Mar 17 18:32:55.471129 systemd[1]: Stopped target local-fs-pre.target. Mar 17 18:32:55.471336 systemd[1]: Stopped target swap.target. Mar 17 18:32:55.471450 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 17 18:32:55.471534 systemd[1]: Stopped dracut-pre-mount.service. Mar 17 18:32:55.471713 systemd[1]: Stopped target cryptsetup.target. Mar 17 18:32:55.475126 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 17 18:32:55.475216 systemd[1]: Stopped dracut-initqueue.service. Mar 17 18:32:55.475354 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 17 18:32:55.475437 systemd[1]: Stopped ignition-fetch-offline.service. Mar 17 18:32:55.478681 systemd[1]: Stopped target paths.target. Mar 17 18:32:55.478765 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 17 18:32:55.481926 systemd[1]: Stopped systemd-ask-password-console.path. Mar 17 18:32:55.482385 systemd[1]: Stopped target slices.target. Mar 17 18:32:55.482517 systemd[1]: Stopped target sockets.target. Mar 17 18:32:55.482696 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 17 18:32:55.482831 systemd[1]: Stopped initrd-setup-root-after-ignition.service. Mar 17 18:32:55.483235 systemd[1]: ignition-files.service: Deactivated successfully. Mar 17 18:32:55.483361 systemd[1]: Stopped ignition-files.service. Mar 17 18:32:55.484828 systemd[1]: Stopping ignition-mount.service... Mar 17 18:32:55.485306 systemd[1]: Stopping iscsid.service... Mar 17 18:32:55.486371 systemd[1]: Stopping sysroot-boot.service... Mar 17 18:32:55.486691 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 17 18:32:55.486843 systemd[1]: Stopped systemd-udev-trigger.service. Mar 17 18:32:55.487300 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 17 18:32:55.487418 systemd[1]: Stopped dracut-pre-trigger.service. Mar 17 18:32:55.490970 systemd[1]: iscsid.service: Deactivated successfully. Mar 17 18:32:55.491075 systemd[1]: Stopped iscsid.service. Mar 17 18:32:55.491753 systemd[1]: iscsid.socket: Deactivated successfully. Mar 17 18:32:55.491843 systemd[1]: Closed iscsid.socket. Mar 17 18:32:55.492355 systemd[1]: Stopping iscsiuio.service... Mar 17 18:32:55.494009 systemd[1]: initrd-cleanup.service: Deactivated successfully. 
Mar 17 18:32:55.494137 systemd[1]: Finished initrd-cleanup.service. Mar 17 18:32:55.495847 systemd[1]: iscsiuio.service: Deactivated successfully. Mar 17 18:32:55.496095 systemd[1]: Stopped iscsiuio.service. Mar 17 18:32:55.496666 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 17 18:32:55.496745 systemd[1]: Closed iscsiuio.socket. Mar 17 18:32:55.516157 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 17 18:32:55.588987 ignition[875]: INFO : Ignition 2.14.0 Mar 17 18:32:55.588987 ignition[875]: INFO : Stage: umount Mar 17 18:32:55.590629 ignition[875]: INFO : no configs at "/usr/lib/ignition/base.d" Mar 17 18:32:55.590629 ignition[875]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" Mar 17 18:32:55.590629 ignition[875]: INFO : umount: umount passed Mar 17 18:32:55.590629 ignition[875]: INFO : Ignition finished successfully Mar 17 18:32:55.591000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.595000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.596000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.591185 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 17 18:32:55.598000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.591303 systemd[1]: Stopped ignition-mount.service. Mar 17 18:32:55.592708 systemd[1]: Stopped target network.target. Mar 17 18:32:55.594470 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 17 18:32:55.594530 systemd[1]: Stopped ignition-disks.service. Mar 17 18:32:55.596118 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 17 18:32:55.596151 systemd[1]: Stopped ignition-kargs.service. Mar 17 18:32:55.605000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.597789 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 17 18:32:55.597822 systemd[1]: Stopped ignition-setup.service. Mar 17 18:32:55.598853 systemd[1]: Stopping systemd-networkd.service... Mar 17 18:32:55.611000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.613000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.600701 systemd[1]: Stopping systemd-resolved.service... Mar 17 18:32:55.615000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:32:55.603925 systemd-networkd[718]: eth0: DHCPv6 lease lost Mar 17 18:32:55.615000 audit: BPF prog-id=9 op=UNLOAD Mar 17 18:32:55.605364 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 18:32:55.605446 systemd[1]: Stopped systemd-networkd.service. Mar 17 18:32:55.607076 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 17 18:32:55.607109 systemd[1]: Closed systemd-networkd.socket. Mar 17 18:32:55.621000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.609461 systemd[1]: Stopping network-cleanup.service... Mar 17 18:32:55.610738 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 17 18:32:55.610793 systemd[1]: Stopped parse-ip-for-networkd.service. Mar 17 18:32:55.625000 audit: BPF prog-id=6 op=UNLOAD Mar 17 18:32:55.612543 systemd[1]: systemd-sysctl.service: Deactivated successfully. Mar 17 18:32:55.627000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.612588 systemd[1]: Stopped systemd-sysctl.service. Mar 17 18:32:55.614558 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 17 18:32:55.614623 systemd[1]: Stopped systemd-modules-load.service. Mar 17 18:32:55.630000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.616545 systemd[1]: Stopping systemd-udevd.service... Mar 17 18:32:55.620506 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 17 18:32:55.621058 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 17 18:32:55.636000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.639000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.621175 systemd[1]: Stopped systemd-resolved.service. Mar 17 18:32:55.626585 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 17 18:32:55.640000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.626715 systemd[1]: Stopped systemd-udevd.service. Mar 17 18:32:55.645000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.629645 systemd[1]: network-cleanup.service: Deactivated successfully. Mar 17 18:32:55.629727 systemd[1]: Stopped network-cleanup.service. Mar 17 18:32:55.647000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:32:55.649000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.631717 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 17 18:32:55.631756 systemd[1]: Closed systemd-udevd-control.socket. Mar 17 18:32:55.633496 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 17 18:32:55.652000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.652000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.633529 systemd[1]: Closed systemd-udevd-kernel.socket. Mar 17 18:32:55.635466 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 17 18:32:55.635511 systemd[1]: Stopped dracut-pre-udev.service. Mar 17 18:32:55.637918 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 17 18:32:55.637958 systemd[1]: Stopped dracut-cmdline.service. Mar 17 18:32:55.639815 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 18:32:55.639856 systemd[1]: Stopped dracut-cmdline-ask.service. Mar 17 18:32:55.642514 systemd[1]: Starting initrd-udevadm-cleanup-db.service... Mar 17 18:32:55.643463 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Mar 17 18:32:55.643545 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service. Mar 17 18:32:55.645527 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 17 18:32:55.645570 systemd[1]: Stopped kmod-static-nodes.service. Mar 17 18:32:55.647517 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 18:32:55.665000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.647559 systemd[1]: Stopped systemd-vconsole-setup.service. Mar 17 18:32:55.650463 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully. Mar 17 18:32:55.669000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:55.651216 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 17 18:32:55.651336 systemd[1]: Finished initrd-udevadm-cleanup-db.service. Mar 17 18:32:55.665288 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 17 18:32:55.665399 systemd[1]: Stopped sysroot-boot.service. Mar 17 18:32:55.667033 systemd[1]: Reached target initrd-switch-root.target. Mar 17 18:32:55.668971 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 17 18:32:55.669026 systemd[1]: Stopped initrd-setup-root.service. Mar 17 18:32:55.671657 systemd[1]: Starting initrd-switch-root.service... Mar 17 18:32:55.678038 systemd[1]: Switching root. 
Mar 17 18:32:55.680000 audit: BPF prog-id=5 op=UNLOAD Mar 17 18:32:55.680000 audit: BPF prog-id=4 op=UNLOAD Mar 17 18:32:55.680000 audit: BPF prog-id=3 op=UNLOAD Mar 17 18:32:55.680000 audit: BPF prog-id=8 op=UNLOAD Mar 17 18:32:55.680000 audit: BPF prog-id=7 op=UNLOAD Mar 17 18:32:55.699107 systemd-journald[198]: Journal stopped Mar 17 18:32:59.559039 systemd-journald[198]: Received SIGTERM from PID 1 (systemd). Mar 17 18:32:59.559111 kernel: SELinux: Class mctp_socket not defined in policy. Mar 17 18:32:59.559124 kernel: SELinux: Class anon_inode not defined in policy. Mar 17 18:32:59.559828 kernel: SELinux: the above unknown classes and permissions will be allowed Mar 17 18:32:59.559842 kernel: SELinux: policy capability network_peer_controls=1 Mar 17 18:32:59.559851 kernel: SELinux: policy capability open_perms=1 Mar 17 18:32:59.559861 kernel: SELinux: policy capability extended_socket_class=1 Mar 17 18:32:59.559871 kernel: SELinux: policy capability always_check_network=0 Mar 17 18:32:59.559894 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 17 18:32:59.559908 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 17 18:32:59.559918 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 17 18:32:59.559928 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 17 18:32:59.559939 systemd[1]: Successfully loaded SELinux policy in 45.015ms. Mar 17 18:32:59.559959 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 7.026ms. Mar 17 18:32:59.559972 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Mar 17 18:32:59.559985 systemd[1]: Detected virtualization kvm. Mar 17 18:32:59.559995 systemd[1]: Detected architecture x86-64. Mar 17 18:32:59.560006 systemd[1]: Detected first boot. Mar 17 18:32:59.560016 systemd[1]: Initializing machine ID from VM UUID. Mar 17 18:32:59.560042 kernel: SELinux: Context system_u:object_r:container_file_t:s0:c1022,c1023 is not valid (left unmapped). Mar 17 18:32:59.560052 systemd[1]: Populated /etc with preset unit settings. Mar 17 18:32:59.560063 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:32:59.560083 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:32:59.560096 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:32:59.560107 systemd[1]: Queued start job for default target multi-user.target. Mar 17 18:32:59.560118 systemd[1]: Unnecessary job was removed for dev-vda6.device. Mar 17 18:32:59.560128 systemd[1]: Created slice system-addon\x2dconfig.slice. Mar 17 18:32:59.560138 systemd[1]: Created slice system-addon\x2drun.slice. Mar 17 18:32:59.560150 systemd[1]: Created slice system-getty.slice. Mar 17 18:32:59.560160 systemd[1]: Created slice system-modprobe.slice. Mar 17 18:32:59.560172 systemd[1]: Created slice system-serial\x2dgetty.slice. Mar 17 18:32:59.560183 systemd[1]: Created slice system-system\x2dcloudinit.slice. 
Mar 17 18:32:59.560193 systemd[1]: Created slice system-systemd\x2dfsck.slice. Mar 17 18:32:59.560204 systemd[1]: Created slice user.slice. Mar 17 18:32:59.560214 systemd[1]: Started systemd-ask-password-console.path. Mar 17 18:32:59.560225 systemd[1]: Started systemd-ask-password-wall.path. Mar 17 18:32:59.560236 systemd[1]: Set up automount boot.automount. Mar 17 18:32:59.560245 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount. Mar 17 18:32:59.560255 systemd[1]: Reached target integritysetup.target. Mar 17 18:32:59.560267 systemd[1]: Reached target remote-cryptsetup.target. Mar 17 18:32:59.560278 systemd[1]: Reached target remote-fs.target. Mar 17 18:32:59.560288 systemd[1]: Reached target slices.target. Mar 17 18:32:59.560299 systemd[1]: Reached target swap.target. Mar 17 18:32:59.560310 systemd[1]: Reached target torcx.target. Mar 17 18:32:59.560320 systemd[1]: Reached target veritysetup.target. Mar 17 18:32:59.560330 systemd[1]: Listening on systemd-coredump.socket. Mar 17 18:32:59.560341 systemd[1]: Listening on systemd-initctl.socket. Mar 17 18:32:59.560352 systemd[1]: Listening on systemd-journald-audit.socket. Mar 17 18:32:59.560363 systemd[1]: Listening on systemd-journald-dev-log.socket. Mar 17 18:32:59.560374 systemd[1]: Listening on systemd-journald.socket. Mar 17 18:32:59.560385 systemd[1]: Listening on systemd-networkd.socket. Mar 17 18:32:59.560395 systemd[1]: Listening on systemd-udevd-control.socket. Mar 17 18:32:59.560406 systemd[1]: Listening on systemd-udevd-kernel.socket. Mar 17 18:32:59.560416 systemd[1]: Listening on systemd-userdbd.socket. Mar 17 18:32:59.560427 systemd[1]: Mounting dev-hugepages.mount... Mar 17 18:32:59.560438 systemd[1]: Mounting dev-mqueue.mount... Mar 17 18:32:59.560448 systemd[1]: Mounting media.mount... Mar 17 18:32:59.560461 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:32:59.560471 systemd[1]: Mounting sys-kernel-debug.mount... Mar 17 18:32:59.560483 systemd[1]: Mounting sys-kernel-tracing.mount... Mar 17 18:32:59.560497 systemd[1]: Mounting tmp.mount... Mar 17 18:32:59.560508 systemd[1]: Starting flatcar-tmpfiles.service... Mar 17 18:32:59.560519 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:32:59.560529 systemd[1]: Starting kmod-static-nodes.service... Mar 17 18:32:59.560540 systemd[1]: Starting modprobe@configfs.service... Mar 17 18:32:59.560550 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:32:59.560562 systemd[1]: Starting modprobe@drm.service... Mar 17 18:32:59.560573 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:32:59.560583 systemd[1]: Starting modprobe@fuse.service... Mar 17 18:32:59.560593 systemd[1]: Starting modprobe@loop.service... Mar 17 18:32:59.560604 systemd[1]: setup-nsswitch.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 17 18:32:59.560614 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling. Mar 17 18:32:59.560625 systemd[1]: (This warning is only shown for the first unit using IP firewalling.) Mar 17 18:32:59.560635 systemd[1]: Starting systemd-journald.service... Mar 17 18:32:59.560645 kernel: fuse: init (API version 7.34) Mar 17 18:32:59.560660 kernel: loop: module loaded Mar 17 18:32:59.560670 systemd[1]: Starting systemd-modules-load.service... 
Mar 17 18:32:59.560680 systemd[1]: Starting systemd-network-generator.service... Mar 17 18:32:59.560691 systemd[1]: Starting systemd-remount-fs.service... Mar 17 18:32:59.560701 systemd[1]: Starting systemd-udev-trigger.service... Mar 17 18:32:59.560712 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:32:59.560722 systemd[1]: Mounted dev-hugepages.mount. Mar 17 18:32:59.560733 systemd[1]: Mounted dev-mqueue.mount. Mar 17 18:32:59.560745 systemd[1]: Mounted media.mount. Mar 17 18:32:59.560759 systemd[1]: Mounted sys-kernel-debug.mount. Mar 17 18:32:59.560770 systemd[1]: Mounted sys-kernel-tracing.mount. Mar 17 18:32:59.560780 systemd[1]: Mounted tmp.mount. Mar 17 18:32:59.560790 systemd[1]: Finished kmod-static-nodes.service. Mar 17 18:32:59.560807 systemd-journald[1020]: Journal started Mar 17 18:32:59.560854 systemd-journald[1020]: Runtime Journal (/run/log/journal/e128e9245a94407ba792ae40b92fda25) is 6.0M, max 48.4M, 42.4M free. Mar 17 18:32:59.439000 audit[1]: AVC avc: denied { audit_read } for pid=1 comm="systemd" capability=37 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Mar 17 18:32:59.439000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Mar 17 18:32:59.519000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Mar 17 18:32:59.519000 audit[1020]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=6 a1=7ffe41475710 a2=4000 a3=7ffe414757ac items=0 ppid=1 pid=1020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:32:59.519000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Mar 17 18:32:59.561000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.565780 systemd[1]: Started systemd-journald.service. Mar 17 18:32:59.562000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.564000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.564000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.564532 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 17 18:32:59.564741 systemd[1]: Finished modprobe@configfs.service. Mar 17 18:32:59.565898 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:32:59.566133 systemd[1]: Finished modprobe@dm_mod.service. 
Mar 17 18:32:59.566000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.566000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.567352 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 18:32:59.567566 systemd[1]: Finished modprobe@drm.service. Mar 17 18:32:59.567000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.567000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.568545 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:32:59.568810 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:32:59.569000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.569000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.569872 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 17 18:32:59.570060 systemd[1]: Finished modprobe@fuse.service. Mar 17 18:32:59.569000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.569000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.571100 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:32:59.571332 systemd[1]: Finished modprobe@loop.service. Mar 17 18:32:59.573000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.573000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.574000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.575000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.574294 systemd[1]: Finished flatcar-tmpfiles.service. 
Mar 17 18:32:59.575527 systemd[1]: Finished systemd-modules-load.service. Mar 17 18:32:59.577000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.577033 systemd[1]: Finished systemd-network-generator.service. Mar 17 18:32:59.578253 systemd[1]: Finished systemd-remount-fs.service. Mar 17 18:32:59.578000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.579442 systemd[1]: Reached target network-pre.target. Mar 17 18:32:59.581396 systemd[1]: Mounting sys-fs-fuse-connections.mount... Mar 17 18:32:59.583257 systemd[1]: Mounting sys-kernel-config.mount... Mar 17 18:32:59.584209 systemd[1]: remount-root.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 17 18:32:59.585799 systemd[1]: Starting systemd-hwdb-update.service... Mar 17 18:32:59.587664 systemd[1]: Starting systemd-journal-flush.service... Mar 17 18:32:59.588759 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:32:59.589797 systemd[1]: Starting systemd-random-seed.service... Mar 17 18:32:59.590844 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:32:59.591783 systemd[1]: Starting systemd-sysctl.service... Mar 17 18:32:59.592999 systemd-journald[1020]: Time spent on flushing to /var/log/journal/e128e9245a94407ba792ae40b92fda25 is 13.426ms for 1105 entries. Mar 17 18:32:59.592999 systemd-journald[1020]: System Journal (/var/log/journal/e128e9245a94407ba792ae40b92fda25) is 8.0M, max 195.6M, 187.6M free. Mar 17 18:32:59.830505 systemd-journald[1020]: Received client request to flush runtime journal. Mar 17 18:32:59.625000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.638000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.661000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.717000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.737000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:32:59.595041 systemd[1]: Starting systemd-sysusers.service... Mar 17 18:32:59.597730 systemd[1]: Mounted sys-fs-fuse-connections.mount. Mar 17 18:32:59.834256 udevadm[1056]: systemd-udev-settle.service is deprecated. 
Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Mar 17 18:32:59.600410 systemd[1]: Mounted sys-kernel-config.mount. Mar 17 18:32:59.620494 systemd[1]: Finished systemd-udev-trigger.service. Mar 17 18:32:59.628219 systemd[1]: Starting systemd-udev-settle.service... Mar 17 18:32:59.638779 systemd[1]: Finished systemd-sysctl.service. Mar 17 18:32:59.643647 systemd[1]: Finished systemd-sysusers.service. Mar 17 18:32:59.663111 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Mar 17 18:32:59.717690 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Mar 17 18:32:59.737632 systemd[1]: Finished systemd-random-seed.service. Mar 17 18:32:59.738567 systemd[1]: Reached target first-boot-complete.target. Mar 17 18:32:59.836202 systemd[1]: Finished systemd-journal-flush.service. Mar 17 18:32:59.837000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.253072 systemd[1]: Finished systemd-hwdb-update.service. Mar 17 18:33:00.253000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.256039 systemd[1]: Starting systemd-udevd.service... Mar 17 18:33:00.274538 systemd-udevd[1068]: Using default interface naming scheme 'v252'. Mar 17 18:33:00.289715 systemd[1]: Started systemd-udevd.service. Mar 17 18:33:00.289000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.295803 systemd[1]: Starting systemd-networkd.service... Mar 17 18:33:00.301349 systemd[1]: Starting systemd-userdbd.service... Mar 17 18:33:00.343800 systemd[1]: Found device dev-ttyS0.device. Mar 17 18:33:00.351480 systemd[1]: Started systemd-userdbd.service. Mar 17 18:33:00.351000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.363295 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Mar 17 18:33:00.391929 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 Mar 17 18:33:00.399165 kernel: ACPI: button: Power Button [PWRF] Mar 17 18:33:00.409027 (udev-worker)[1097]: could not read from '/sys/module/pcc_cpufreq/initstate': No such device Mar 17 18:33:00.409558 systemd-networkd[1078]: lo: Link UP Mar 17 18:33:00.409968 systemd-networkd[1078]: lo: Gained carrier Mar 17 18:33:00.410599 systemd-networkd[1078]: Enumeration completed Mar 17 18:33:00.411000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.410816 systemd[1]: Started systemd-networkd.service. Mar 17 18:33:00.412570 systemd-networkd[1078]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. 
Mar 17 18:33:00.413404 kernel: kauditd_printk_skb: 80 callbacks suppressed Mar 17 18:33:00.413462 kernel: audit: type=1130 audit(1742236380.411:116): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.413947 systemd-networkd[1078]: eth0: Link UP Mar 17 18:33:00.414043 systemd-networkd[1078]: eth0: Gained carrier Mar 17 18:33:00.414000 audit[1071]: AVC avc: denied { confidentiality } for pid=1071 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Mar 17 18:33:00.431905 kernel: audit: type=1400 audit(1742236380.414:117): avc: denied { confidentiality } for pid=1071 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Mar 17 18:33:00.454328 kernel: audit: type=1300 audit(1742236380.414:117): arch=c000003e syscall=175 success=yes exit=0 a0=5564a8904c80 a1=338ac a2=7fc2cff71bc5 a3=5 items=110 ppid=1068 pid=1071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:00.454409 kernel: audit: type=1307 audit(1742236380.414:117): cwd="/" Mar 17 18:33:00.454444 kernel: audit: type=1302 audit(1742236380.414:117): item=0 name=(null) inode=50 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.454461 kernel: audit: type=1302 audit(1742236380.414:117): item=1 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.454487 kernel: audit: type=1302 audit(1742236380.414:117): item=2 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.454504 kernel: audit: type=1302 audit(1742236380.414:117): item=3 name=(null) inode=1888 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit[1071]: SYSCALL arch=c000003e syscall=175 success=yes exit=0 a0=5564a8904c80 a1=338ac a2=7fc2cff71bc5 a3=5 items=110 ppid=1068 pid=1071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:00.414000 audit: CWD cwd="/" Mar 17 18:33:00.414000 audit: PATH item=0 name=(null) inode=50 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=1 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=2 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: 
PATH item=3 name=(null) inode=1888 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.433010 systemd-networkd[1078]: eth0: DHCPv4 address 10.0.0.12/16, gateway 10.0.0.1 acquired from 10.0.0.1 Mar 17 18:33:00.467219 kernel: audit: type=1302 audit(1742236380.414:117): item=4 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.467264 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 Mar 17 18:33:00.467280 kernel: audit: type=1302 audit(1742236380.414:117): item=5 name=(null) inode=1889 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=4 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=5 name=(null) inode=1889 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=6 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=7 name=(null) inode=1890 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=8 name=(null) inode=1890 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=9 name=(null) inode=1891 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=10 name=(null) inode=1890 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=11 name=(null) inode=1892 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=12 name=(null) inode=1890 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=13 name=(null) inode=1893 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=14 name=(null) inode=1890 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=15 name=(null) inode=1894 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 
audit: PATH item=16 name=(null) inode=1890 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=17 name=(null) inode=1895 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=18 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=19 name=(null) inode=1896 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=20 name=(null) inode=1896 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=21 name=(null) inode=1897 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=22 name=(null) inode=1896 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=23 name=(null) inode=1898 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=24 name=(null) inode=1896 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=25 name=(null) inode=1899 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=26 name=(null) inode=1896 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=27 name=(null) inode=1900 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=28 name=(null) inode=1896 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=29 name=(null) inode=1901 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=30 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=31 name=(null) inode=1902 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=32 name=(null) inode=1902 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=33 name=(null) inode=1903 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=34 name=(null) inode=1902 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=35 name=(null) inode=1904 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=36 name=(null) inode=1902 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=37 name=(null) inode=1905 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=38 name=(null) inode=1902 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=39 name=(null) inode=1906 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=40 name=(null) inode=1902 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=41 name=(null) inode=1907 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=42 name=(null) inode=1887 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=43 name=(null) inode=1908 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=44 name=(null) inode=1908 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=45 name=(null) inode=1909 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=46 name=(null) inode=1908 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=47 name=(null) inode=1910 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=48 name=(null) inode=1908 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 
cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=49 name=(null) inode=1911 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=50 name=(null) inode=1908 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=51 name=(null) inode=1912 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=52 name=(null) inode=1908 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=53 name=(null) inode=1913 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=54 name=(null) inode=50 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=55 name=(null) inode=1914 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=56 name=(null) inode=1914 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=57 name=(null) inode=1915 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=58 name=(null) inode=1914 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=59 name=(null) inode=1916 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=60 name=(null) inode=1914 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=61 name=(null) inode=1917 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=62 name=(null) inode=1917 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=63 name=(null) inode=1918 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=64 name=(null) inode=1917 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=65 name=(null) inode=1919 dev=00:0b 
mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=66 name=(null) inode=1917 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=67 name=(null) inode=1920 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=68 name=(null) inode=1917 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=69 name=(null) inode=1921 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=70 name=(null) inode=1917 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=71 name=(null) inode=1922 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=72 name=(null) inode=1914 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=73 name=(null) inode=1923 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=74 name=(null) inode=1923 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=75 name=(null) inode=1924 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=76 name=(null) inode=1923 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=77 name=(null) inode=1925 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=78 name=(null) inode=1923 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=79 name=(null) inode=1926 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=80 name=(null) inode=1923 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=81 name=(null) inode=1927 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE 
cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=82 name=(null) inode=1923 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=83 name=(null) inode=1928 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=84 name=(null) inode=1914 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=85 name=(null) inode=1929 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=86 name=(null) inode=1929 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=87 name=(null) inode=1930 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=88 name=(null) inode=1929 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=89 name=(null) inode=1931 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=90 name=(null) inode=1929 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=91 name=(null) inode=1932 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=92 name=(null) inode=1929 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=93 name=(null) inode=1933 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=94 name=(null) inode=1929 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=95 name=(null) inode=1934 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=96 name=(null) inode=1914 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=97 name=(null) inode=1935 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH 
item=98 name=(null) inode=1935 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=99 name=(null) inode=1936 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=100 name=(null) inode=1935 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=101 name=(null) inode=1937 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=102 name=(null) inode=1935 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=103 name=(null) inode=1938 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=104 name=(null) inode=1935 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=105 name=(null) inode=1939 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=106 name=(null) inode=1935 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=107 name=(null) inode=1940 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=108 name=(null) inode=1 dev=00:07 mode=040700 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:debugfs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PATH item=109 name=(null) inode=1941 dev=00:07 mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:debugfs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:33:00.414000 audit: PROCTITLE proctitle="(udev-worker)" Mar 17 18:33:00.485347 kernel: i801_smbus 0000:00:1f.3: Enabling SMBus device Mar 17 18:33:00.493906 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt Mar 17 18:33:00.494040 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI) Mar 17 18:33:00.494157 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD Mar 17 18:33:00.494249 kernel: mousedev: PS/2 mouse device common for all mice Mar 17 18:33:00.502425 kernel: kvm: Nested Virtualization enabled Mar 17 18:33:00.502521 kernel: SVM: kvm: Nested Paging enabled Mar 17 18:33:00.502565 kernel: SVM: Virtual VMLOAD VMSAVE supported Mar 17 18:33:00.503109 kernel: SVM: Virtual GIF supported Mar 17 18:33:00.520907 kernel: EDAC MC: Ver: 3.0.0 Mar 17 18:33:00.546517 systemd[1]: Finished systemd-udev-settle.service. 
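The udevadm message earlier in this boot warned that systemd-udev-settle.service is deprecated and asked that lvm2-activation.service and lvm2-activation-early.service stop pulling it in; here the settle job has just finished regardless. A quick way to confirm which units still drag it in, using only standard systemctl tooling, is to ask for its reverse dependencies:

  # Which units pull in the deprecated settle job?
  systemctl list-dependencies --reverse systemd-udev-settle.service
  # Inspect the two units named in the warning above
  systemctl cat lvm2-activation-early.service lvm2-activation.service | grep -in 'settle'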
Mar 17 18:33:00.547000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-settle comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.549208 systemd[1]: Starting lvm2-activation-early.service... Mar 17 18:33:00.559469 lvm[1106]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 18:33:00.591115 systemd[1]: Finished lvm2-activation-early.service. Mar 17 18:33:00.591000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.592393 systemd[1]: Reached target cryptsetup.target. Mar 17 18:33:00.594864 systemd[1]: Starting lvm2-activation.service... Mar 17 18:33:00.598467 lvm[1108]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 18:33:00.628515 systemd[1]: Finished lvm2-activation.service. Mar 17 18:33:00.629000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.631025 systemd[1]: Reached target local-fs-pre.target. Mar 17 18:33:00.632261 systemd[1]: var-lib-machines.mount was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 17 18:33:00.632296 systemd[1]: Reached target local-fs.target. Mar 17 18:33:00.633243 systemd[1]: Reached target machines.target. Mar 17 18:33:00.635652 systemd[1]: Starting ldconfig.service... Mar 17 18:33:00.637936 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:33:00.638000 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:33:00.639155 systemd[1]: Starting systemd-boot-update.service... Mar 17 18:33:00.641065 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service... Mar 17 18:33:00.643513 systemd[1]: Starting systemd-machine-id-commit.service... Mar 17 18:33:00.645694 systemd[1]: Starting systemd-sysext.service... Mar 17 18:33:00.647768 systemd[1]: boot.automount: Got automount request for /boot, triggered by 1111 (bootctl) Mar 17 18:33:00.648797 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service... Mar 17 18:33:00.655704 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service. Mar 17 18:33:00.657000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.666335 systemd[1]: Unmounting usr-share-oem.mount... Mar 17 18:33:00.672061 systemd[1]: usr-share-oem.mount: Deactivated successfully. Mar 17 18:33:00.672301 systemd[1]: Unmounted usr-share-oem.mount. Mar 17 18:33:00.684902 kernel: loop0: detected capacity change from 0 to 210664 Mar 17 18:33:00.720272 systemd-fsck[1120]: fsck.fat 4.2 (2021-01-31) Mar 17 18:33:00.720272 systemd-fsck[1120]: /dev/vda1: 790 files, 119319/258078 clusters Mar 17 18:33:00.721840 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service. 
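The two LVM warnings above ("Failed to connect to lvmetad. Falling back to device scanning.") are benign when no lvmetad daemon is running: the tools simply scan block devices directly. If the message is unwanted, one option (a sketch that assumes the stock /etc/lvm/lvm.conf layout and an LVM version that still ships the lvmetad cache) is to disable the cache outright:

  # Show the current setting; 1 with no daemon running produces the warning above.
  lvmconfig global/use_lvmetad
  # Sketch: turn the cache off so the activation services scan devices without complaining.
  sudo sed -i 's/use_lvmetad = 1/use_lvmetad = 0/' /etc/lvm/lvm.conf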
Mar 17 18:33:00.722000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.724547 systemd[1]: Mounting boot.mount... Mar 17 18:33:00.847116 systemd[1]: Mounted boot.mount. Mar 17 18:33:00.854906 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 17 18:33:00.858930 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 17 18:33:00.859623 systemd[1]: Finished systemd-boot-update.service. Mar 17 18:33:00.860000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-boot-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.861117 systemd[1]: Finished systemd-machine-id-commit.service. Mar 17 18:33:00.862000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.870894 kernel: loop1: detected capacity change from 0 to 210664 Mar 17 18:33:00.874482 (sd-sysext)[1132]: Using extensions 'kubernetes'. Mar 17 18:33:00.874804 (sd-sysext)[1132]: Merged extensions into '/usr'. Mar 17 18:33:00.889761 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:33:00.891249 systemd[1]: Mounting usr-share-oem.mount... Mar 17 18:33:00.892173 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:33:00.893249 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:33:00.895436 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:33:00.897296 systemd[1]: Starting modprobe@loop.service... Mar 17 18:33:00.898466 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:33:00.898601 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:33:00.898712 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:33:00.901506 systemd[1]: Mounted usr-share-oem.mount. Mar 17 18:33:00.902727 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:33:00.902892 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:33:00.903000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.903000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.904552 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:33:00.904676 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:33:00.905000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:33:00.905000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.906435 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:33:00.906578 systemd[1]: Finished modprobe@loop.service. Mar 17 18:33:00.907000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.907000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.907856 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:33:00.907966 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:33:00.909061 systemd[1]: Finished systemd-sysext.service. Mar 17 18:33:00.909000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:00.911903 systemd[1]: Starting ensure-sysext.service... Mar 17 18:33:00.914044 systemd[1]: Starting systemd-tmpfiles-setup.service... Mar 17 18:33:00.917249 systemd[1]: Reloading. Mar 17 18:33:00.923806 systemd-tmpfiles[1146]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Mar 17 18:33:00.924671 systemd-tmpfiles[1146]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Mar 17 18:33:00.926219 systemd-tmpfiles[1146]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 17 18:33:00.931290 ldconfig[1110]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 17 18:33:00.972041 /usr/lib/systemd/system-generators/torcx-generator[1167]: time="2025-03-17T18:33:00Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 18:33:00.972411 /usr/lib/systemd/system-generators/torcx-generator[1167]: time="2025-03-17T18:33:00Z" level=info msg="torcx already run" Mar 17 18:33:01.044177 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:33:01.044198 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:33:01.063503 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:33:01.112694 systemd[1]: Finished ldconfig.service. Mar 17 18:33:01.112000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ldconfig comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Mar 17 18:33:01.114730 systemd[1]: Finished systemd-tmpfiles-setup.service. Mar 17 18:33:01.114000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.117697 systemd[1]: Starting audit-rules.service... Mar 17 18:33:01.119609 systemd[1]: Starting clean-ca-certificates.service... Mar 17 18:33:01.121647 systemd[1]: Starting systemd-journal-catalog-update.service... Mar 17 18:33:01.124021 systemd[1]: Starting systemd-resolved.service... Mar 17 18:33:01.126534 systemd[1]: Starting systemd-timesyncd.service... Mar 17 18:33:01.128444 systemd[1]: Starting systemd-update-utmp.service... Mar 17 18:33:01.129941 systemd[1]: Finished clean-ca-certificates.service. Mar 17 18:33:01.130000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.133088 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 18:33:01.133000 audit[1227]: SYSTEM_BOOT pid=1227 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.140696 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.141804 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:33:01.146890 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:33:01.149136 systemd[1]: Starting modprobe@loop.service... Mar 17 18:33:01.150138 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.150266 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:33:01.150353 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 18:33:01.151260 systemd[1]: Finished systemd-journal-catalog-update.service. Mar 17 18:33:01.152000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.153090 systemd[1]: Finished systemd-update-utmp.service. Mar 17 18:33:01.153000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.154570 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:33:01.154689 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:33:01.154000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:33:01.154000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.156168 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:33:01.156299 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:33:01.157000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.157000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.157715 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:33:01.157905 systemd[1]: Finished modprobe@loop.service. Mar 17 18:33:01.157000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.157000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:01.159588 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:33:01.159680 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.160978 systemd[1]: Starting systemd-update-done.service... Mar 17 18:33:01.160000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Mar 17 18:33:01.160000 audit[1245]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fffa5dae7e0 a2=420 a3=0 items=0 ppid=1215 pid=1245 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:01.160000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Mar 17 18:33:01.161708 augenrules[1245]: No rules Mar 17 18:33:01.163399 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.164793 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:33:01.166699 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:33:01.168614 systemd[1]: Starting modprobe@loop.service... Mar 17 18:33:01.169647 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.169749 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:33:01.169827 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 18:33:01.170598 systemd[1]: Finished audit-rules.service. Mar 17 18:33:01.172168 systemd[1]: Finished systemd-update-done.service. 
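The PROCTITLE hex recorded just above decodes to "/sbin/auditctl -R /etc/audit/audit.rules", and augenrules reported "No rules", so audit-rules.service finished with an empty ruleset. A minimal sketch of adding one watch rule through the same augenrules/auditctl path follows; the rule and file name are illustrative:

  # Illustrative rule: audit writes and attribute changes to /etc/passwd.
  cat <<'EOF' | sudo tee /etc/audit/rules.d/99-example.rules
  -w /etc/passwd -p wa -k passwd-changes
  EOF
  sudo augenrules --load   # regenerate /etc/audit/audit.rules and load it
  sudo auditctl -l         # list the rules now active in the kernel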
Mar 17 18:33:01.173495 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:33:01.173625 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:33:01.175094 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:33:01.175337 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:33:01.176762 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:33:01.176911 systemd[1]: Finished modprobe@loop.service. Mar 17 18:33:01.178069 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:33:01.178151 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.180613 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.181713 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:33:01.183745 systemd[1]: Starting modprobe@drm.service... Mar 17 18:33:01.185785 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:33:01.187765 systemd[1]: Starting modprobe@loop.service... Mar 17 18:33:01.188780 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.189048 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:33:01.190555 systemd[1]: Starting systemd-networkd-wait-online.service... Mar 17 18:33:01.191736 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 18:33:01.192679 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:33:01.192810 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:33:01.194277 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 18:33:01.194408 systemd[1]: Finished modprobe@drm.service. Mar 17 18:33:01.195702 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:33:01.195831 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:33:01.197330 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:33:01.197705 systemd[1]: Finished modprobe@loop.service. Mar 17 18:33:01.199315 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:33:01.199438 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.201182 systemd[1]: Finished ensure-sysext.service. Mar 17 18:33:01.210500 systemd[1]: Started systemd-timesyncd.service. Mar 17 18:33:01.626841 systemd-timesyncd[1226]: Contacted time server 10.0.0.1:123 (10.0.0.1). Mar 17 18:33:01.626854 systemd-resolved[1221]: Positive Trust Anchors: Mar 17 18:33:01.626865 systemd-resolved[1221]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 18:33:01.626891 systemd-resolved[1221]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Mar 17 18:33:01.626952 systemd[1]: Reached target time-set.target. Mar 17 18:33:01.627562 systemd-timesyncd[1226]: Initial clock synchronization to Mon 2025-03-17 18:33:01.626764 UTC. Mar 17 18:33:01.633635 systemd-resolved[1221]: Defaulting to hostname 'linux'. Mar 17 18:33:01.635037 systemd[1]: Started systemd-resolved.service. Mar 17 18:33:01.635950 systemd[1]: Reached target network.target. Mar 17 18:33:01.636752 systemd[1]: Reached target nss-lookup.target. Mar 17 18:33:01.637613 systemd[1]: Reached target sysinit.target. Mar 17 18:33:01.638501 systemd[1]: Started motdgen.path. Mar 17 18:33:01.639264 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path. Mar 17 18:33:01.640529 systemd[1]: Started logrotate.timer. Mar 17 18:33:01.641392 systemd[1]: Started mdadm.timer. Mar 17 18:33:01.642125 systemd[1]: Started systemd-tmpfiles-clean.timer. Mar 17 18:33:01.643027 systemd[1]: update-engine-stub.timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 17 18:33:01.643051 systemd[1]: Reached target paths.target. Mar 17 18:33:01.643815 systemd[1]: Reached target timers.target. Mar 17 18:33:01.644959 systemd[1]: Listening on dbus.socket. Mar 17 18:33:01.646926 systemd[1]: Starting docker.socket... Mar 17 18:33:01.648612 systemd[1]: Listening on sshd.socket. Mar 17 18:33:01.649500 systemd[1]: systemd-pcrphase-sysinit.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:33:01.649751 systemd[1]: Listening on docker.socket. Mar 17 18:33:01.650566 systemd[1]: Reached target sockets.target. Mar 17 18:33:01.651375 systemd[1]: Reached target basic.target. Mar 17 18:33:01.652283 systemd[1]: System is tainted: cgroupsv1 Mar 17 18:33:01.652323 systemd[1]: addon-config@usr-share-oem.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.652342 systemd[1]: addon-run@usr-share-oem.service was skipped because no trigger condition checks were met. Mar 17 18:33:01.653248 systemd[1]: Starting containerd.service... Mar 17 18:33:01.654969 systemd[1]: Starting dbus.service... Mar 17 18:33:01.656606 systemd[1]: Starting enable-oem-cloudinit.service... Mar 17 18:33:01.658625 systemd[1]: Starting extend-filesystems.service... Mar 17 18:33:01.659670 systemd[1]: flatcar-setup-environment.service was skipped because of an unmet condition check (ConditionPathExists=/usr/share/oem/bin/flatcar-setup-environment). Mar 17 18:33:01.661031 jq[1277]: false Mar 17 18:33:01.660738 systemd[1]: Starting motdgen.service... Mar 17 18:33:01.662679 systemd[1]: Starting prepare-helm.service... Mar 17 18:33:01.664620 systemd[1]: Starting ssh-key-proc-cmdline.service... Mar 17 18:33:01.666630 systemd[1]: Starting sshd-keygen.service... Mar 17 18:33:01.669197 systemd[1]: Starting systemd-logind.service... 
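systemd-timesyncd above synchronized against 10.0.0.1 (the DHCP-provided server) and systemd-resolved, after loading the root DNSSEC trust anchor, defaulted to the hostname 'linux'. To pin an explicit NTP server rather than relying on the DHCP lease, a drop-in such as the following should work on this systemd version; the address is illustrative, reusing the one from the log:

  sudo mkdir -p /etc/systemd/timesyncd.conf.d
  cat <<'EOF' | sudo tee /etc/systemd/timesyncd.conf.d/10-ntp.conf
  [Time]
  NTP=10.0.0.1
  EOF
  sudo systemctl restart systemd-timesyncd
  timedatectl timesync-status   # confirm which server is actually in use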
Mar 17 18:33:01.670030 systemd[1]: systemd-pcrphase.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:33:01.670100 systemd[1]: tcsd.service was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 17 18:33:01.671168 systemd[1]: Starting update-engine.service... Mar 17 18:33:01.672952 systemd[1]: Starting update-ssh-keys-after-ignition.service... Mar 17 18:33:01.675815 jq[1295]: true Mar 17 18:33:01.675798 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 17 18:33:01.676086 systemd[1]: Condition check resulted in enable-oem-cloudinit.service being skipped. Mar 17 18:33:01.677018 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 17 18:33:01.679215 systemd[1]: Finished ssh-key-proc-cmdline.service. Mar 17 18:33:01.683205 dbus-daemon[1276]: [system] SELinux support is enabled Mar 17 18:33:01.685154 systemd[1]: Started dbus.service. Mar 17 18:33:01.688731 systemd[1]: motdgen.service: Deactivated successfully. Mar 17 18:33:01.689003 systemd[1]: Finished motdgen.service. Mar 17 18:33:01.690146 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 17 18:33:01.690169 systemd[1]: Reached target system-config.target. Mar 17 18:33:01.691131 systemd[1]: user-cloudinit-proc-cmdline.service was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Mar 17 18:33:01.691148 systemd[1]: Reached target user-config.target. Mar 17 18:33:01.694007 jq[1304]: true Mar 17 18:33:01.696375 extend-filesystems[1278]: Found loop1 Mar 17 18:33:01.703189 extend-filesystems[1278]: Found sr0 Mar 17 18:33:01.703189 extend-filesystems[1278]: Found vda Mar 17 18:33:01.703189 extend-filesystems[1278]: Found vda1 Mar 17 18:33:01.703189 extend-filesystems[1278]: Found vda2 Mar 17 18:33:01.703189 extend-filesystems[1278]: Found vda3 Mar 17 18:33:01.703189 extend-filesystems[1278]: Found usr Mar 17 18:33:01.703189 extend-filesystems[1278]: Found vda4 Mar 17 18:33:01.703189 extend-filesystems[1278]: Found vda6 Mar 17 18:33:01.703189 extend-filesystems[1278]: Found vda7 Mar 17 18:33:01.703189 extend-filesystems[1278]: Found vda9 Mar 17 18:33:01.703189 extend-filesystems[1278]: Checking size of /dev/vda9 Mar 17 18:33:01.747201 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks Mar 17 18:33:01.747241 update_engine[1292]: I0317 18:33:01.745821 1292 main.cc:92] Flatcar Update Engine starting Mar 17 18:33:01.753479 env[1305]: time="2025-03-17T18:33:01.733746572Z" level=info msg="starting containerd" revision=92b3a9d6f1b3bcc6dc74875cfdea653fe39f09c2 version=1.6.16 Mar 17 18:33:01.753657 extend-filesystems[1278]: Resized partition /dev/vda9 Mar 17 18:33:01.725358 systemd-logind[1287]: Watching system buttons on /dev/input/event1 (Power Button) Mar 17 18:33:01.755219 tar[1302]: linux-amd64/helm Mar 17 18:33:01.755447 update_engine[1292]: I0317 18:33:01.747429 1292 update_check_scheduler.cc:74] Next update check in 7m58s Mar 17 18:33:01.755499 extend-filesystems[1320]: resize2fs 1.46.5 (30-Dec-2021) Mar 17 18:33:01.779948 kernel: EXT4-fs (vda9): resized filesystem to 1864699 Mar 17 18:33:01.725374 systemd-logind[1287]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 17 18:33:01.780403 env[1305]: time="2025-03-17T18:33:01.768817412Z" 
level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 17 18:33:01.726502 systemd-logind[1287]: New seat seat0. Mar 17 18:33:01.736245 systemd[1]: Started systemd-logind.service. Mar 17 18:33:01.747358 systemd[1]: Started update-engine.service. Mar 17 18:33:01.750614 systemd[1]: Started locksmithd.service. Mar 17 18:33:01.781133 extend-filesystems[1320]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Mar 17 18:33:01.781133 extend-filesystems[1320]: old_desc_blocks = 1, new_desc_blocks = 1 Mar 17 18:33:01.781133 extend-filesystems[1320]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long. Mar 17 18:33:01.786520 extend-filesystems[1278]: Resized filesystem in /dev/vda9 Mar 17 18:33:01.781849 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 17 18:33:01.782132 systemd[1]: Finished extend-filesystems.service. Mar 17 18:33:01.789247 env[1305]: time="2025-03-17T18:33:01.789062038Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:33:01.791624 bash[1338]: Updated "/home/core/.ssh/authorized_keys" Mar 17 18:33:01.792412 systemd[1]: Finished update-ssh-keys-after-ignition.service. Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.794247964Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.15.179-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.794281306Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.794509905Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.794525084Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.794537236Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.794546414Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.794609041Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.794839213Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.795003771Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:33:01.795186 env[1305]: time="2025-03-17T18:33:01.795017808Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." 
type=io.containerd.metadata.v1 Mar 17 18:33:01.795421 env[1305]: time="2025-03-17T18:33:01.795061660Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Mar 17 18:33:01.795421 env[1305]: time="2025-03-17T18:33:01.795072931Z" level=info msg="metadata content store policy set" policy=shared Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800709471Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800742533Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800757171Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800795924Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800814639Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800830208Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800843583Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800858361Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800872417Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800886333Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800900159Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 17 18:33:01.800961 env[1305]: time="2025-03-17T18:33:01.800928993Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 17 18:33:01.801392 env[1305]: time="2025-03-17T18:33:01.801042506Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 17 18:33:01.801392 env[1305]: time="2025-03-17T18:33:01.801126564Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801481710Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801519120Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801536843Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801606904Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." 
type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801621582Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801637382Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801651218Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801664513Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801683899Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801697775Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801709367Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801723433Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801841114Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801857094Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.802761 env[1305]: time="2025-03-17T18:33:01.801869497Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 17 18:33:01.803213 env[1305]: time="2025-03-17T18:33:01.801881560Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 17 18:33:01.803213 env[1305]: time="2025-03-17T18:33:01.801896437Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1 Mar 17 18:33:01.803213 env[1305]: time="2025-03-17T18:33:01.801907188Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 17 18:33:01.803213 env[1305]: time="2025-03-17T18:33:01.801940911Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin" Mar 17 18:33:01.803213 env[1305]: time="2025-03-17T18:33:01.801977489Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Mar 17 18:33:01.803312 env[1305]: time="2025-03-17T18:33:01.802219924Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.6 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 18:33:01.803312 env[1305]: time="2025-03-17T18:33:01.802279706Z" level=info msg="Connect containerd service" Mar 17 18:33:01.803312 env[1305]: time="2025-03-17T18:33:01.802319240Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 18:33:01.804538 env[1305]: time="2025-03-17T18:33:01.804089230Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 18:33:01.804538 env[1305]: time="2025-03-17T18:33:01.804334009Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 18:33:01.804538 env[1305]: time="2025-03-17T18:33:01.804421052Z" level=info msg="Start subscribing containerd event" Mar 17 18:33:01.804538 env[1305]: time="2025-03-17T18:33:01.804512243Z" level=info msg="Start recovering state" Mar 17 18:33:01.806774 env[1305]: time="2025-03-17T18:33:01.804770437Z" level=info msg="Start event monitor" Mar 17 18:33:01.806774 env[1305]: time="2025-03-17T18:33:01.804757483Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Mar 17 18:33:01.806774 env[1305]: time="2025-03-17T18:33:01.804863692Z" level=info msg="containerd successfully booted in 0.071789s" Mar 17 18:33:01.804974 systemd[1]: Started containerd.service. Mar 17 18:33:01.816623 env[1305]: time="2025-03-17T18:33:01.816309718Z" level=info msg="Start snapshots syncer" Mar 17 18:33:01.816623 env[1305]: time="2025-03-17T18:33:01.816379278Z" level=info msg="Start cni network conf syncer for default" Mar 17 18:33:01.816623 env[1305]: time="2025-03-17T18:33:01.816388736Z" level=info msg="Start streaming server" Mar 17 18:33:01.822654 locksmithd[1337]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 17 18:33:02.033795 sshd_keygen[1299]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 18:33:02.097957 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:33:02.098030 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:33:02.114087 systemd[1]: Finished sshd-keygen.service. Mar 17 18:33:02.117189 systemd[1]: Starting issuegen.service... Mar 17 18:33:02.123000 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 18:33:02.123249 systemd[1]: Finished issuegen.service. Mar 17 18:33:02.133235 systemd[1]: Starting systemd-user-sessions.service... Mar 17 18:33:02.134962 systemd[1]: Finished systemd-user-sessions.service. Mar 17 18:33:02.137645 systemd[1]: Started getty@tty1.service. Mar 17 18:33:02.140101 systemd[1]: Started serial-getty@ttyS0.service. Mar 17 18:33:02.141405 systemd[1]: Reached target getty.target. Mar 17 18:33:02.227057 tar[1302]: linux-amd64/LICENSE Mar 17 18:33:02.227336 tar[1302]: linux-amd64/README.md Mar 17 18:33:02.231532 systemd[1]: Finished prepare-helm.service. Mar 17 18:33:02.641168 systemd-networkd[1078]: eth0: Gained IPv6LL Mar 17 18:33:02.643025 systemd[1]: Finished systemd-networkd-wait-online.service. Mar 17 18:33:02.644557 systemd[1]: Reached target network-online.target. Mar 17 18:33:02.647154 systemd[1]: Starting kubelet.service... Mar 17 18:33:03.555812 systemd[1]: Started kubelet.service. Mar 17 18:33:03.557496 systemd[1]: Reached target multi-user.target. Mar 17 18:33:03.560155 systemd[1]: Starting systemd-update-utmp-runlevel.service... Mar 17 18:33:03.566784 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Mar 17 18:33:03.566998 systemd[1]: Finished systemd-update-utmp-runlevel.service. Mar 17 18:33:03.569091 systemd[1]: Startup finished in 5.695s (kernel) + 7.408s (userspace) = 13.104s. Mar 17 18:33:04.240452 kubelet[1378]: E0317 18:33:04.240379 1378 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:33:04.242055 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:33:04.242194 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:33:05.523251 systemd[1]: Created slice system-sshd.slice. Mar 17 18:33:05.524591 systemd[1]: Started sshd@0-10.0.0.12:22-10.0.0.1:58422.service. 
Mar 17 18:33:05.559895 sshd[1389]: Accepted publickey for core from 10.0.0.1 port 58422 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:33:05.561090 sshd[1389]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:33:05.569324 systemd-logind[1287]: New session 1 of user core. Mar 17 18:33:05.570075 systemd[1]: Created slice user-500.slice. Mar 17 18:33:05.571057 systemd[1]: Starting user-runtime-dir@500.service... Mar 17 18:33:05.582819 systemd[1]: Finished user-runtime-dir@500.service. Mar 17 18:33:05.584420 systemd[1]: Starting user@500.service... Mar 17 18:33:05.587081 (systemd)[1394]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:33:05.656177 systemd[1394]: Queued start job for default target default.target. Mar 17 18:33:05.656386 systemd[1394]: Reached target paths.target. Mar 17 18:33:05.656402 systemd[1394]: Reached target sockets.target. Mar 17 18:33:05.656413 systemd[1394]: Reached target timers.target. Mar 17 18:33:05.656424 systemd[1394]: Reached target basic.target. Mar 17 18:33:05.656461 systemd[1394]: Reached target default.target. Mar 17 18:33:05.656483 systemd[1394]: Startup finished in 64ms. Mar 17 18:33:05.656592 systemd[1]: Started user@500.service. Mar 17 18:33:05.657616 systemd[1]: Started session-1.scope. Mar 17 18:33:05.706957 systemd[1]: Started sshd@1-10.0.0.12:22-10.0.0.1:58434.service. Mar 17 18:33:05.739223 sshd[1404]: Accepted publickey for core from 10.0.0.1 port 58434 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:33:05.740492 sshd[1404]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:33:05.785261 systemd-logind[1287]: New session 2 of user core. Mar 17 18:33:05.788046 systemd[1]: Started session-2.scope. Mar 17 18:33:05.840780 sshd[1404]: pam_unix(sshd:session): session closed for user core Mar 17 18:33:05.842991 systemd[1]: Started sshd@2-10.0.0.12:22-10.0.0.1:58446.service. Mar 17 18:33:05.843423 systemd[1]: sshd@1-10.0.0.12:22-10.0.0.1:58434.service: Deactivated successfully. Mar 17 18:33:05.844637 systemd[1]: session-2.scope: Deactivated successfully. Mar 17 18:33:05.844658 systemd-logind[1287]: Session 2 logged out. Waiting for processes to exit. Mar 17 18:33:05.845480 systemd-logind[1287]: Removed session 2. Mar 17 18:33:05.874467 sshd[1409]: Accepted publickey for core from 10.0.0.1 port 58446 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:33:05.875453 sshd[1409]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:33:05.878310 systemd-logind[1287]: New session 3 of user core. Mar 17 18:33:05.878956 systemd[1]: Started session-3.scope. Mar 17 18:33:05.927989 sshd[1409]: pam_unix(sshd:session): session closed for user core Mar 17 18:33:05.930215 systemd[1]: Started sshd@3-10.0.0.12:22-10.0.0.1:58462.service. Mar 17 18:33:05.930660 systemd[1]: sshd@2-10.0.0.12:22-10.0.0.1:58446.service: Deactivated successfully. Mar 17 18:33:05.931694 systemd[1]: session-3.scope: Deactivated successfully. Mar 17 18:33:05.931710 systemd-logind[1287]: Session 3 logged out. Waiting for processes to exit. Mar 17 18:33:05.932526 systemd-logind[1287]: Removed session 3. 
Mar 17 18:33:05.961452 sshd[1416]: Accepted publickey for core from 10.0.0.1 port 58462 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:33:05.962696 sshd[1416]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:33:05.965696 systemd-logind[1287]: New session 4 of user core. Mar 17 18:33:05.966372 systemd[1]: Started session-4.scope. Mar 17 18:33:06.018702 sshd[1416]: pam_unix(sshd:session): session closed for user core Mar 17 18:33:06.020741 systemd[1]: Started sshd@4-10.0.0.12:22-10.0.0.1:58472.service. Mar 17 18:33:06.021508 systemd[1]: sshd@3-10.0.0.12:22-10.0.0.1:58462.service: Deactivated successfully. Mar 17 18:33:06.022240 systemd-logind[1287]: Session 4 logged out. Waiting for processes to exit. Mar 17 18:33:06.022272 systemd[1]: session-4.scope: Deactivated successfully. Mar 17 18:33:06.023138 systemd-logind[1287]: Removed session 4. Mar 17 18:33:06.051686 sshd[1423]: Accepted publickey for core from 10.0.0.1 port 58472 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:33:06.052719 sshd[1423]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:33:06.055670 systemd-logind[1287]: New session 5 of user core. Mar 17 18:33:06.056337 systemd[1]: Started session-5.scope. Mar 17 18:33:06.110779 sudo[1429]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 17 18:33:06.111009 sudo[1429]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:33:06.119144 dbus-daemon[1276]: \xd0\xed\xdb\u000b\xa3U: received setenforce notice (enforcing=-232778656) Mar 17 18:33:06.121194 sudo[1429]: pam_unix(sudo:session): session closed for user root Mar 17 18:33:06.122739 sshd[1423]: pam_unix(sshd:session): session closed for user core Mar 17 18:33:06.125181 systemd[1]: Started sshd@5-10.0.0.12:22-10.0.0.1:58476.service. Mar 17 18:33:06.125628 systemd[1]: sshd@4-10.0.0.12:22-10.0.0.1:58472.service: Deactivated successfully. Mar 17 18:33:06.126500 systemd[1]: session-5.scope: Deactivated successfully. Mar 17 18:33:06.126526 systemd-logind[1287]: Session 5 logged out. Waiting for processes to exit. Mar 17 18:33:06.127429 systemd-logind[1287]: Removed session 5. Mar 17 18:33:06.156543 sshd[1432]: Accepted publickey for core from 10.0.0.1 port 58476 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:33:06.157475 sshd[1432]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:33:06.160410 systemd-logind[1287]: New session 6 of user core. Mar 17 18:33:06.161065 systemd[1]: Started session-6.scope. Mar 17 18:33:06.213020 sudo[1438]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 17 18:33:06.213193 sudo[1438]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:33:06.215690 sudo[1438]: pam_unix(sudo:session): session closed for user root Mar 17 18:33:06.219647 sudo[1437]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Mar 17 18:33:06.219817 sudo[1437]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:33:06.228237 systemd[1]: Stopping audit-rules.service... 
Mar 17 18:33:06.228000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Mar 17 18:33:06.229656 auditctl[1441]: No rules Mar 17 18:33:06.229972 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 18:33:06.230227 systemd[1]: Stopped audit-rules.service. Mar 17 18:33:06.235536 kernel: kauditd_printk_skb: 134 callbacks suppressed Mar 17 18:33:06.235610 kernel: audit: type=1305 audit(1742236386.228:145): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Mar 17 18:33:06.235633 kernel: audit: type=1300 audit(1742236386.228:145): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff00700920 a2=420 a3=0 items=0 ppid=1 pid=1441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:06.228000 audit[1441]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff00700920 a2=420 a3=0 items=0 ppid=1 pid=1441 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:06.231801 systemd[1]: Starting audit-rules.service... Mar 17 18:33:06.239407 kernel: audit: type=1327 audit(1742236386.228:145): proctitle=2F7362696E2F617564697463746C002D44 Mar 17 18:33:06.239475 kernel: audit: type=1131 audit(1742236386.229:146): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.228000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D44 Mar 17 18:33:06.229000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.247808 augenrules[1459]: No rules Mar 17 18:33:06.248400 systemd[1]: Finished audit-rules.service. Mar 17 18:33:06.252385 kernel: audit: type=1130 audit(1742236386.247:147): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.247000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.251777 sudo[1437]: pam_unix(sudo:session): session closed for user root Mar 17 18:33:06.250000 audit[1437]: USER_END pid=1437 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.253087 sshd[1432]: pam_unix(sshd:session): session closed for user core Mar 17 18:33:06.255006 systemd[1]: Started sshd@6-10.0.0.12:22-10.0.0.1:58486.service. Mar 17 18:33:06.255980 systemd[1]: sshd@5-10.0.0.12:22-10.0.0.1:58476.service: Deactivated successfully. Mar 17 18:33:06.256518 systemd[1]: session-6.scope: Deactivated successfully. Mar 17 18:33:06.257331 systemd-logind[1287]: Session 6 logged out. Waiting for processes to exit. 
Mar 17 18:33:06.258254 systemd-logind[1287]: Removed session 6. Mar 17 18:33:06.250000 audit[1437]: CRED_DISP pid=1437 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.266218 kernel: audit: type=1106 audit(1742236386.250:148): pid=1437 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.266270 kernel: audit: type=1104 audit(1742236386.250:149): pid=1437 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.266287 kernel: audit: type=1106 audit(1742236386.253:150): pid=1432 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:06.253000 audit[1432]: USER_END pid=1432 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:06.253000 audit[1432]: CRED_DISP pid=1432 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:06.274043 kernel: audit: type=1104 audit(1742236386.253:151): pid=1432 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:06.274077 kernel: audit: type=1130 audit(1742236386.254:152): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.0.12:22-10.0.0.1:58486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.254000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.0.12:22-10.0.0.1:58486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.255000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.0.12:22-10.0.0.1:58476 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:33:06.286000 audit[1464]: USER_ACCT pid=1464 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:06.287398 sshd[1464]: Accepted publickey for core from 10.0.0.1 port 58486 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:33:06.287000 audit[1464]: CRED_ACQ pid=1464 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:06.287000 audit[1464]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe831381a0 a2=3 a3=0 items=0 ppid=1 pid=1464 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=7 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:06.287000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:33:06.288878 sshd[1464]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:33:06.294555 systemd-logind[1287]: New session 7 of user core. Mar 17 18:33:06.295416 systemd[1]: Started session-7.scope. Mar 17 18:33:06.300000 audit[1464]: USER_START pid=1464 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:06.301000 audit[1469]: CRED_ACQ pid=1469 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:06.348000 audit[1470]: USER_ACCT pid=1470 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.349000 audit[1470]: CRED_REFR pid=1470 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.349963 sudo[1470]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 17 18:33:06.350161 sudo[1470]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:33:06.350000 audit[1470]: USER_START pid=1470 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:06.376035 systemd[1]: Starting docker.service... 
Mar 17 18:33:06.495863 env[1482]: time="2025-03-17T18:33:06.495787148Z" level=info msg="Starting up" Mar 17 18:33:06.497307 env[1482]: time="2025-03-17T18:33:06.497275960Z" level=info msg="parsed scheme: \"unix\"" module=grpc Mar 17 18:33:06.497307 env[1482]: time="2025-03-17T18:33:06.497299685Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Mar 17 18:33:06.497381 env[1482]: time="2025-03-17T18:33:06.497318750Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Mar 17 18:33:06.497381 env[1482]: time="2025-03-17T18:33:06.497330853Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Mar 17 18:33:06.499018 env[1482]: time="2025-03-17T18:33:06.498990145Z" level=info msg="parsed scheme: \"unix\"" module=grpc Mar 17 18:33:06.499018 env[1482]: time="2025-03-17T18:33:06.499008159Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Mar 17 18:33:06.499119 env[1482]: time="2025-03-17T18:33:06.499020792Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Mar 17 18:33:06.499119 env[1482]: time="2025-03-17T18:33:06.499031212Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Mar 17 18:33:08.194993 env[1482]: time="2025-03-17T18:33:08.194911703Z" level=warning msg="Your kernel does not support cgroup blkio weight" Mar 17 18:33:08.194993 env[1482]: time="2025-03-17T18:33:08.194958821Z" level=warning msg="Your kernel does not support cgroup blkio weight_device" Mar 17 18:33:08.195517 env[1482]: time="2025-03-17T18:33:08.195202378Z" level=info msg="Loading containers: start." 
Mar 17 18:33:08.246000 audit[1516]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1516 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.246000 audit[1516]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7fffed082690 a2=0 a3=7fffed08267c items=0 ppid=1482 pid=1516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.246000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Mar 17 18:33:08.248000 audit[1518]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1518 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.248000 audit[1518]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7fff7df89bb0 a2=0 a3=7fff7df89b9c items=0 ppid=1482 pid=1518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.248000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Mar 17 18:33:08.250000 audit[1520]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1520 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.250000 audit[1520]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7ffc7402cdf0 a2=0 a3=7ffc7402cddc items=0 ppid=1482 pid=1520 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.250000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Mar 17 18:33:08.251000 audit[1522]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1522 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.251000 audit[1522]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff44923290 a2=0 a3=7fff4492327c items=0 ppid=1482 pid=1522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.251000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Mar 17 18:33:08.253000 audit[1524]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_rule pid=1524 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.253000 audit[1524]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffce72b5320 a2=0 a3=7ffce72b530c items=0 ppid=1482 pid=1524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.253000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6A0052455455524E Mar 17 18:33:08.278000 audit[1529]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_rule pid=1529 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Mar 17 18:33:08.278000 audit[1529]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fffea5ec100 a2=0 a3=7fffea5ec0ec items=0 ppid=1482 pid=1529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.278000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D32002D6A0052455455524E Mar 17 18:33:08.330000 audit[1531]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1531 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.330000 audit[1531]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff61c81c90 a2=0 a3=7fff61c81c7c items=0 ppid=1482 pid=1531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.330000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Mar 17 18:33:08.332000 audit[1533]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_rule pid=1533 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.332000 audit[1533]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffd5ffdae70 a2=0 a3=7ffd5ffdae5c items=0 ppid=1482 pid=1533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.332000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Mar 17 18:33:08.333000 audit[1535]: NETFILTER_CFG table=filter:10 family=2 entries=2 op=nft_register_chain pid=1535 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.333000 audit[1535]: SYSCALL arch=c000003e syscall=46 success=yes exit=308 a0=3 a1=7fff190cd850 a2=0 a3=7fff190cd83c items=0 ppid=1482 pid=1535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.333000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:33:08.521000 audit[1539]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_unregister_rule pid=1539 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.521000 audit[1539]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffe2e40be20 a2=0 a3=7ffe2e40be0c items=0 ppid=1482 pid=1539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.521000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:33:08.527000 audit[1540]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1540 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.527000 audit[1540]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffc2612a650 a2=0 a3=7ffc2612a63c items=0 ppid=1482 
pid=1540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.527000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:33:08.536957 kernel: Initializing XFRM netlink socket Mar 17 18:33:08.562766 env[1482]: time="2025-03-17T18:33:08.562701671Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address" Mar 17 18:33:08.577000 audit[1548]: NETFILTER_CFG table=nat:13 family=2 entries=2 op=nft_register_chain pid=1548 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.577000 audit[1548]: SYSCALL arch=c000003e syscall=46 success=yes exit=492 a0=3 a1=7fff5ed383d0 a2=0 a3=7fff5ed383bc items=0 ppid=1482 pid=1548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.577000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Mar 17 18:33:08.587000 audit[1551]: NETFILTER_CFG table=nat:14 family=2 entries=1 op=nft_register_rule pid=1551 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.587000 audit[1551]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffc2484e790 a2=0 a3=7ffc2484e77c items=0 ppid=1482 pid=1551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.587000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Mar 17 18:33:08.589000 audit[1554]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=1554 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.589000 audit[1554]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fff1d58d7e0 a2=0 a3=7fff1d58d7cc items=0 ppid=1482 pid=1554 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.589000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B657230002D6F00646F636B657230002D6A00414343455054 Mar 17 18:33:08.591000 audit[1556]: NETFILTER_CFG table=filter:16 family=2 entries=1 op=nft_register_rule pid=1556 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.591000 audit[1556]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7fffac20b210 a2=0 a3=7fffac20b1fc items=0 ppid=1482 pid=1556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.591000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B6572300000002D6F00646F636B657230002D6A00414343455054 Mar 17 18:33:08.593000 audit[1558]: NETFILTER_CFG 
table=nat:17 family=2 entries=2 op=nft_register_chain pid=1558 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.593000 audit[1558]: SYSCALL arch=c000003e syscall=46 success=yes exit=356 a0=3 a1=7ffc89dad960 a2=0 a3=7ffc89dad94c items=0 ppid=1482 pid=1558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.593000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Mar 17 18:33:08.595000 audit[1560]: NETFILTER_CFG table=nat:18 family=2 entries=2 op=nft_register_chain pid=1560 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.595000 audit[1560]: SYSCALL arch=c000003e syscall=46 success=yes exit=444 a0=3 a1=7ffdca998cf0 a2=0 a3=7ffdca998cdc items=0 ppid=1482 pid=1560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.595000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Mar 17 18:33:08.597000 audit[1562]: NETFILTER_CFG table=filter:19 family=2 entries=1 op=nft_register_rule pid=1562 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.597000 audit[1562]: SYSCALL arch=c000003e syscall=46 success=yes exit=304 a0=3 a1=7ffdb6410500 a2=0 a3=7ffdb64104ec items=0 ppid=1482 pid=1562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.597000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6A00444F434B4552 Mar 17 18:33:08.603000 audit[1565]: NETFILTER_CFG table=filter:20 family=2 entries=1 op=nft_register_rule pid=1565 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.603000 audit[1565]: SYSCALL arch=c000003e syscall=46 success=yes exit=508 a0=3 a1=7ffd12c07400 a2=0 a3=7ffd12c073ec items=0 ppid=1482 pid=1565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.603000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Mar 17 18:33:08.605000 audit[1567]: NETFILTER_CFG table=filter:21 family=2 entries=1 op=nft_register_rule pid=1567 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.605000 audit[1567]: SYSCALL arch=c000003e syscall=46 success=yes exit=240 a0=3 a1=7fff8c28c1c0 a2=0 a3=7fff8c28c1ac items=0 ppid=1482 pid=1567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.605000 audit: PROCTITLE 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Mar 17 18:33:08.606000 audit[1569]: NETFILTER_CFG table=filter:22 family=2 entries=1 op=nft_register_rule pid=1569 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.606000 audit[1569]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7ffe530b1740 a2=0 a3=7ffe530b172c items=0 ppid=1482 pid=1569 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.606000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Mar 17 18:33:08.608000 audit[1571]: NETFILTER_CFG table=filter:23 family=2 entries=1 op=nft_register_rule pid=1571 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.608000 audit[1571]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffe13920f50 a2=0 a3=7ffe13920f3c items=0 ppid=1482 pid=1571 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.608000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Mar 17 18:33:08.609941 systemd-networkd[1078]: docker0: Link UP Mar 17 18:33:08.617000 audit[1575]: NETFILTER_CFG table=filter:24 family=2 entries=1 op=nft_unregister_rule pid=1575 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.617000 audit[1575]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fff1100b370 a2=0 a3=7fff1100b35c items=0 ppid=1482 pid=1575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.617000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:33:08.623000 audit[1576]: NETFILTER_CFG table=filter:25 family=2 entries=1 op=nft_register_rule pid=1576 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:08.623000 audit[1576]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffee5423e10 a2=0 a3=7ffee5423dfc items=0 ppid=1482 pid=1576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:08.623000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:33:08.625200 env[1482]: time="2025-03-17T18:33:08.625163779Z" level=info msg="Loading containers: done." 
Mar 17 18:33:08.638723 env[1482]: time="2025-03-17T18:33:08.638684946Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 17 18:33:08.638859 env[1482]: time="2025-03-17T18:33:08.638847742Z" level=info msg="Docker daemon" commit=112bdf3343 graphdriver(s)=overlay2 version=20.10.23 Mar 17 18:33:08.638976 env[1482]: time="2025-03-17T18:33:08.638953360Z" level=info msg="Daemon has completed initialization" Mar 17 18:33:08.656622 systemd[1]: Started docker.service. Mar 17 18:33:08.655000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:08.663387 env[1482]: time="2025-03-17T18:33:08.663333601Z" level=info msg="API listen on /run/docker.sock" Mar 17 18:33:09.355697 env[1305]: time="2025-03-17T18:33:09.355653108Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\"" Mar 17 18:33:09.963261 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3559869455.mount: Deactivated successfully. Mar 17 18:33:11.714712 env[1305]: time="2025-03-17T18:33:11.714636944Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:11.716674 env[1305]: time="2025-03-17T18:33:11.716612439Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:11.718359 env[1305]: time="2025-03-17T18:33:11.718320853Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-apiserver:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:11.720149 env[1305]: time="2025-03-17T18:33:11.720113334Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:11.720784 env[1305]: time="2025-03-17T18:33:11.720752363Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\"" Mar 17 18:33:11.820812 env[1305]: time="2025-03-17T18:33:11.820769901Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\"" Mar 17 18:33:14.220380 env[1305]: time="2025-03-17T18:33:14.220307110Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:14.222180 env[1305]: time="2025-03-17T18:33:14.222149706Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:14.223965 env[1305]: time="2025-03-17T18:33:14.223928902Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-controller-manager:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:14.225941 env[1305]: 
time="2025-03-17T18:33:14.225902814Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:14.226647 env[1305]: time="2025-03-17T18:33:14.226613848Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference \"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\"" Mar 17 18:33:14.243354 env[1305]: time="2025-03-17T18:33:14.243312654Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\"" Mar 17 18:33:14.493095 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 17 18:33:14.492000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:14.493344 systemd[1]: Stopped kubelet.service. Mar 17 18:33:14.494645 kernel: kauditd_printk_skb: 84 callbacks suppressed Mar 17 18:33:14.494716 kernel: audit: type=1130 audit(1742236394.492:187): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:14.495481 systemd[1]: Starting kubelet.service... Mar 17 18:33:14.492000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:14.501302 kernel: audit: type=1131 audit(1742236394.492:188): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:14.608745 systemd[1]: Started kubelet.service. Mar 17 18:33:14.608000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:14.612944 kernel: audit: type=1130 audit(1742236394.608:189): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:14.971416 kubelet[1640]: E0317 18:33:14.971284 1640 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:33:14.974178 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:33:14.974327 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:33:14.973000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 18:33:14.977935 kernel: audit: type=1131 audit(1742236394.973:190): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Mar 17 18:33:16.816888 env[1305]: time="2025-03-17T18:33:16.816810431Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:16.818944 env[1305]: time="2025-03-17T18:33:16.818866427Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:16.821243 env[1305]: time="2025-03-17T18:33:16.821197358Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-scheduler:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:16.823036 env[1305]: time="2025-03-17T18:33:16.823011520Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:16.823907 env[1305]: time="2025-03-17T18:33:16.823870361Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\"" Mar 17 18:33:16.880133 env[1305]: time="2025-03-17T18:33:16.880092838Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\"" Mar 17 18:33:18.082812 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2216740536.mount: Deactivated successfully. Mar 17 18:33:19.995761 env[1305]: time="2025-03-17T18:33:19.995701987Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:20.020596 env[1305]: time="2025-03-17T18:33:20.020546469Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:20.032750 env[1305]: time="2025-03-17T18:33:20.032702035Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-proxy:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:20.049565 env[1305]: time="2025-03-17T18:33:20.049476343Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:20.049840 env[1305]: time="2025-03-17T18:33:20.049797165Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\"" Mar 17 18:33:20.060836 env[1305]: time="2025-03-17T18:33:20.060792305Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 17 18:33:20.682784 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount723470499.mount: Deactivated successfully. 
Mar 17 18:33:22.244255 env[1305]: time="2025-03-17T18:33:22.244136975Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns:v1.11.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:22.247567 env[1305]: time="2025-03-17T18:33:22.247476198Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:22.255711 env[1305]: time="2025-03-17T18:33:22.255553155Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/coredns/coredns:v1.11.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:22.257691 env[1305]: time="2025-03-17T18:33:22.257625201Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:22.258871 env[1305]: time="2025-03-17T18:33:22.258801597Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Mar 17 18:33:22.272116 env[1305]: time="2025-03-17T18:33:22.272066964Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 17 18:33:22.807463 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3540948686.mount: Deactivated successfully. Mar 17 18:33:22.812936 env[1305]: time="2025-03-17T18:33:22.812885933Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:22.814733 env[1305]: time="2025-03-17T18:33:22.814708471Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:22.816310 env[1305]: time="2025-03-17T18:33:22.816266883Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:22.817614 env[1305]: time="2025-03-17T18:33:22.817585897Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:22.818079 env[1305]: time="2025-03-17T18:33:22.818056610Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Mar 17 18:33:22.830706 env[1305]: time="2025-03-17T18:33:22.830663201Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 17 18:33:24.346394 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1499888059.mount: Deactivated successfully. Mar 17 18:33:25.225205 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 17 18:33:25.225425 systemd[1]: Stopped kubelet.service. Mar 17 18:33:25.224000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:33:25.227235 systemd[1]: Starting kubelet.service... Mar 17 18:33:25.224000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:25.231524 kernel: audit: type=1130 audit(1742236405.224:191): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:25.231601 kernel: audit: type=1131 audit(1742236405.224:192): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:25.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:25.302188 systemd[1]: Started kubelet.service. Mar 17 18:33:25.306947 kernel: audit: type=1130 audit(1742236405.301:193): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:25.616356 kubelet[1683]: E0317 18:33:25.616215 1683 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:33:25.618085 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:33:25.618292 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:33:25.617000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 18:33:25.624943 kernel: audit: type=1131 audit(1742236405.617:194): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Mar 17 18:33:28.811425 env[1305]: time="2025-03-17T18:33:28.811325528Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd:3.5.12-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:28.813392 env[1305]: time="2025-03-17T18:33:28.813362578Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:28.815478 env[1305]: time="2025-03-17T18:33:28.815445124Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/etcd:3.5.12-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:28.817194 env[1305]: time="2025-03-17T18:33:28.817144801Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:28.818023 env[1305]: time="2025-03-17T18:33:28.817985748Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Mar 17 18:33:31.141546 systemd[1]: Stopped kubelet.service. Mar 17 18:33:31.140000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.144493 systemd[1]: Starting kubelet.service... Mar 17 18:33:31.140000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.154883 kernel: audit: type=1130 audit(1742236411.140:195): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.155011 kernel: audit: type=1131 audit(1742236411.140:196): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.161189 systemd[1]: Reloading. Mar 17 18:33:31.223337 /usr/lib/systemd/system-generators/torcx-generator[1793]: time="2025-03-17T18:33:31Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 18:33:31.223710 /usr/lib/systemd/system-generators/torcx-generator[1793]: time="2025-03-17T18:33:31Z" level=info msg="torcx already run" Mar 17 18:33:31.475355 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:33:31.475374 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:33:31.496095 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
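The "Reloading." entry and the torcx-generator lines mark a systemd daemon-reload, during which systemd re-lints every loaded unit; the three warnings that follow are about legacy cgroup-v1 directives (CPUShares=, MemoryLimit=) and a /var/run path in docker.socket, not about anything failing. On Flatcar the shipped units under /usr are read-only, so the usual way to address the cgroup warnings is a drop-in override; a minimal sketch with placeholder values (the real values on lines 8-9 of locksmithd.service are not shown in this log):

    # /etc/systemd/system/locksmithd.service.d/10-cgroup-overrides.conf  (hypothetical drop-in)
    [Service]
    CPUWeight=100      # placeholder; replaces the deprecated CPUShares= on line 8
    MemoryMax=128M     # placeholder; replaces the deprecated MemoryLimit= on line 9

followed by another systemctl daemon-reload. The docker.socket notice needs no immediate action; systemd already rewrites /var/run/docker.sock to /run/docker.sock on the fly and only asks that the unit file be updated eventually.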
Mar 17 18:33:31.570972 systemd[1]: Started kubelet.service. Mar 17 18:33:31.570000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.572503 systemd[1]: Stopping kubelet.service... Mar 17 18:33:31.572805 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 18:33:31.573063 systemd[1]: Stopped kubelet.service. Mar 17 18:33:31.574552 systemd[1]: Starting kubelet.service... Mar 17 18:33:31.574949 kernel: audit: type=1130 audit(1742236411.570:197): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.572000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.578949 kernel: audit: type=1131 audit(1742236411.572:198): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.654015 systemd[1]: Started kubelet.service. Mar 17 18:33:31.655000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.660948 kernel: audit: type=1130 audit(1742236411.655:199): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:31.718901 kubelet[1853]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:33:31.718901 kubelet[1853]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 18:33:31.718901 kubelet[1853]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
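Two of these warnings point at the file passed via --config, which on a kubeadm-managed node is /var/lib/kubelet/config.yaml; the third (--pod-infra-container-image) just notes that the sandbox image will eventually come from CRI. The two earlier kubelet crashes (restart counters 1 and 2) happened simply because nothing had written that config file yet; on a kubeadm-managed node it appears during the kubelet-start phase. A rough sketch of such a file, using only fields whose values are visible elsewhere in this log, except the socket path, which is assumed:

    # /var/lib/kubelet/config.yaml -- illustrative shape, not the file actually written on this host
    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    staticPodPath: /etc/kubernetes/manifests          # matches "Adding static pod path" below
    cgroupDriver: cgroupfs                            # matches the container-manager node config below
    containerRuntimeEndpoint: unix:///run/containerd/containerd.sock   # assumed containerd socket
    authentication:
      x509:
        clientCAFile: /etc/kubernetes/pki/ca.crt      # the client-ca-bundle the kubelet starts watching

Once this file exists, the flags that duplicate it become redundant on the unit's command line.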
Mar 17 18:33:31.721756 kubelet[1853]: I0317 18:33:31.721713 1853 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 18:33:32.124539 kubelet[1853]: I0317 18:33:32.124497 1853 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 18:33:32.124539 kubelet[1853]: I0317 18:33:32.124526 1853 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 18:33:32.124809 kubelet[1853]: I0317 18:33:32.124787 1853 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 18:33:32.172690 kubelet[1853]: E0317 18:33:32.172639 1853 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.0.0.12:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:32.172977 kubelet[1853]: I0317 18:33:32.172958 1853 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 18:33:32.216955 kubelet[1853]: I0317 18:33:32.216901 1853 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 17 18:33:32.217291 kubelet[1853]: I0317 18:33:32.217256 1853 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 18:33:32.217461 kubelet[1853]: I0317 18:33:32.217284 1853 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 18:33:32.218156 kubelet[1853]: I0317 18:33:32.218132 1853 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 18:33:32.218156 kubelet[1853]: I0317 18:33:32.218151 1853 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 18:33:32.218276 kubelet[1853]: I0317 18:33:32.218254 1853 state_mem.go:36] "Initialized new in-memory state store" Mar 17 
18:33:32.221685 kubelet[1853]: I0317 18:33:32.221662 1853 kubelet.go:400] "Attempting to sync node with API server" Mar 17 18:33:32.221685 kubelet[1853]: I0317 18:33:32.221681 1853 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 18:33:32.221750 kubelet[1853]: I0317 18:33:32.221708 1853 kubelet.go:312] "Adding apiserver pod source" Mar 17 18:33:32.221750 kubelet[1853]: I0317 18:33:32.221732 1853 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 18:33:32.222245 kubelet[1853]: W0317 18:33:32.222187 1853 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.12:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:32.222300 kubelet[1853]: E0317 18:33:32.222264 1853 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.12:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:32.244749 kubelet[1853]: W0317 18:33:32.244709 1853 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.12:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:32.244749 kubelet[1853]: E0317 18:33:32.244747 1853 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.12:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:32.253620 kubelet[1853]: I0317 18:33:32.253597 1853 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Mar 17 18:33:32.257477 kubelet[1853]: I0317 18:33:32.257438 1853 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 18:33:32.257630 kubelet[1853]: W0317 18:33:32.257510 1853 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
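The picture in this block is consistent: client rotation is on, the certificate manager's CSR POST to https://10.0.0.12:6443 is refused, and every informer list (Node, Service, CSIDriver) fails the same way, because this kubelet is bootstrapping the very control plane it is trying to talk to; the API server only comes up once the static pods below are running. The hex proctitle in the audit records that follow decodes to the kubelet command line, roughly /usr/bin/kubelet --bootstrap-kubeconfig=/etc/kubernetes/bootstrap-kubelet.conf --kubeconfig=/etc/kubernetes/kubelet.conf --confi (truncated), which is the standard TLS-bootstrap arrangement. For reference, the bootstrap kubeconfig has roughly this shape; the cluster and user names, the token, and the choice of embedded-vs-file CA are placeholders:

    # /etc/kubernetes/bootstrap-kubelet.conf -- shape only
    apiVersion: v1
    kind: Config
    clusters:
    - name: default-cluster
      cluster:
        certificate-authority: /etc/kubernetes/pki/ca.crt
        server: https://10.0.0.12:6443
    users:
    - name: tls-bootstrap-token-user
      user:
        token: abcdef.0123456789abcdef     # placeholder bootstrap token
    contexts:
    - name: tls-bootstrap-token-user@default-cluster
      context:
        cluster: default-cluster
        user: tls-bootstrap-token-user
    current-context: tls-bootstrap-token-user@default-cluster

Once a serving API server accepts the CSR, the kubelet stores the rotated client certificate and switches to /etc/kubernetes/kubelet.conf for normal operation.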
Mar 17 18:33:32.258189 kubelet[1853]: I0317 18:33:32.258172 1853 server.go:1264] "Started kubelet" Mar 17 18:33:32.258270 kubelet[1853]: I0317 18:33:32.258248 1853 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 18:33:32.258490 kubelet[1853]: I0317 18:33:32.258427 1853 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 18:33:32.258789 kubelet[1853]: I0317 18:33:32.258764 1853 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 18:33:32.259323 kubelet[1853]: I0317 18:33:32.259300 1853 server.go:455] "Adding debug handlers to kubelet server" Mar 17 18:33:32.259000 audit[1853]: AVC avc: denied { mac_admin } for pid=1853 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:32.260958 kubelet[1853]: I0317 18:33:32.260788 1853 kubelet.go:1419] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Mar 17 18:33:32.260958 kubelet[1853]: I0317 18:33:32.260847 1853 kubelet.go:1423] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Mar 17 18:33:32.260958 kubelet[1853]: I0317 18:33:32.260941 1853 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 18:33:32.273964 kernel: audit: type=1400 audit(1742236412.259:200): avc: denied { mac_admin } for pid=1853 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:32.274070 kernel: audit: type=1401 audit(1742236412.259:200): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:33:32.274100 kernel: audit: type=1300 audit(1742236412.259:200): arch=c000003e syscall=188 success=no exit=-22 a0=c000033e60 a1=c000c44078 a2=c000033e30 a3=25 items=0 ppid=1 pid=1853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.274142 kernel: audit: type=1327 audit(1742236412.259:200): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:33:32.274170 kernel: audit: type=1400 audit(1742236412.259:201): avc: denied { mac_admin } for pid=1853 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:32.259000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:33:32.259000 audit[1853]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000033e60 a1=c000c44078 a2=c000033e30 a3=25 items=0 ppid=1 pid=1853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.259000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:33:32.259000 audit[1853]: AVC avc: denied { mac_admin } for pid=1853 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:32.274484 kubelet[1853]: I0317 18:33:32.265668 1853 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 18:33:32.274484 kubelet[1853]: I0317 18:33:32.267434 1853 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 18:33:32.274484 kubelet[1853]: I0317 18:33:32.267497 1853 reconciler.go:26] "Reconciler: start to sync state" Mar 17 18:33:32.274484 kubelet[1853]: W0317 18:33:32.267790 1853 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.12:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:32.274484 kubelet[1853]: E0317 18:33:32.267837 1853 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.12:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:32.274484 kubelet[1853]: E0317 18:33:32.267899 1853 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.12:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.12:6443: connect: connection refused" interval="200ms" Mar 17 18:33:32.274484 kubelet[1853]: I0317 18:33:32.268104 1853 factory.go:221] Registration of the systemd container factory successfully Mar 17 18:33:32.274484 kubelet[1853]: I0317 18:33:32.268182 1853 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 18:33:32.274484 kubelet[1853]: E0317 18:33:32.272555 1853 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 18:33:32.274484 kubelet[1853]: I0317 18:33:32.273179 1853 factory.go:221] Registration of the containerd container factory successfully Mar 17 18:33:32.259000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:33:32.259000 audit[1853]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000c403e0 a1=c000c44090 a2=c000033ef0 a3=25 items=0 ppid=1 pid=1853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.259000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:33:32.262000 audit[1865]: NETFILTER_CFG table=mangle:26 family=2 entries=2 op=nft_register_chain pid=1865 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:32.262000 audit[1865]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd7a69e290 a2=0 a3=7ffd7a69e27c items=0 ppid=1853 pid=1865 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.262000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Mar 17 18:33:32.263000 audit[1866]: NETFILTER_CFG table=filter:27 family=2 entries=1 op=nft_register_chain pid=1866 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:32.263000 audit[1866]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffcf566ec50 a2=0 a3=7ffcf566ec3c items=0 ppid=1853 pid=1866 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.263000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Mar 17 18:33:32.266000 audit[1868]: NETFILTER_CFG table=filter:28 family=2 entries=2 op=nft_register_chain pid=1868 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:32.266000 audit[1868]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffc69204e30 a2=0 a3=7ffc69204e1c items=0 ppid=1853 pid=1868 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.266000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:33:32.268000 audit[1870]: NETFILTER_CFG table=filter:29 family=2 entries=2 op=nft_register_chain pid=1870 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:32.268000 audit[1870]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffdda8fadd0 a2=0 a3=7ffdda8fadbc items=0 ppid=1853 pid=1870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Mar 17 18:33:32.268000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:33:32.283000 audit[1875]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=1875 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:32.283000 audit[1875]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7fff7dcf5100 a2=0 a3=7fff7dcf50ec items=0 ppid=1853 pid=1875 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.283000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Mar 17 18:33:32.287626 kubelet[1853]: I0317 18:33:32.287579 1853 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 18:33:32.290000 audit[1878]: NETFILTER_CFG table=mangle:31 family=10 entries=2 op=nft_register_chain pid=1878 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:32.290000 audit[1878]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7fff1bfdc750 a2=0 a3=7fff1bfdc73c items=0 ppid=1853 pid=1878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.290000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Mar 17 18:33:32.293049 kubelet[1853]: I0317 18:33:32.293021 1853 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 18:33:32.293099 kubelet[1853]: I0317 18:33:32.293067 1853 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 18:33:32.293099 kubelet[1853]: I0317 18:33:32.293089 1853 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 18:33:32.293186 kubelet[1853]: E0317 18:33:32.293140 1853 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 18:33:32.293000 audit[1879]: NETFILTER_CFG table=mangle:32 family=10 entries=1 op=nft_register_chain pid=1879 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:32.293000 audit[1879]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc78e18b70 a2=0 a3=7ffc78e18b5c items=0 ppid=1853 pid=1879 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.293000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Mar 17 18:33:32.295132 kubelet[1853]: W0317 18:33:32.294985 1853 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.12:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:32.295132 kubelet[1853]: E0317 18:33:32.295059 1853 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.12:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:32.294000 audit[1877]: NETFILTER_CFG table=mangle:33 family=2 entries=1 op=nft_register_chain pid=1877 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:32.294000 audit[1877]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffebeedda80 a2=0 a3=7ffebeedda6c items=0 ppid=1853 pid=1877 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.294000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Mar 17 18:33:32.295000 audit[1882]: NETFILTER_CFG table=nat:34 family=10 entries=2 op=nft_register_chain pid=1882 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:32.295000 audit[1882]: SYSCALL arch=c000003e syscall=46 success=yes exit=128 a0=3 a1=7fff35d9b300 a2=0 a3=7fff35d9b2ec items=0 ppid=1853 pid=1882 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.295000 audit[1883]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_chain pid=1883 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:32.295000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Mar 17 18:33:32.295000 audit[1883]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc99497b00 a2=0 a3=7ffc99497aec items=0 ppid=1853 pid=1883 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.295000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Mar 17 18:33:32.297680 kubelet[1853]: E0317 18:33:32.297554 1853 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.12:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.12:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.182daac94365d184 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-03-17 18:33:32.258144644 +0000 UTC m=+0.600838006,LastTimestamp:2025-03-17 18:33:32.258144644 +0000 UTC m=+0.600838006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}" Mar 17 18:33:32.297000 audit[1885]: NETFILTER_CFG table=filter:36 family=10 entries=2 op=nft_register_chain pid=1885 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:32.297000 audit[1885]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffe947c8510 a2=0 a3=7ffe947c84fc items=0 ppid=1853 pid=1885 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.297000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Mar 17 18:33:32.297000 audit[1886]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_chain pid=1886 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:32.297000 audit[1886]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff61f0bb50 a2=0 a3=7fff61f0bb3c items=0 ppid=1853 pid=1886 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.297000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Mar 17 18:33:32.300353 kubelet[1853]: I0317 18:33:32.300338 1853 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 18:33:32.300353 kubelet[1853]: I0317 18:33:32.300350 1853 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 18:33:32.300420 kubelet[1853]: I0317 18:33:32.300364 1853 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:33:32.367362 kubelet[1853]: I0317 18:33:32.367326 1853 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 18:33:32.367596 kubelet[1853]: E0317 18:33:32.367553 1853 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.12:6443/api/v1/nodes\": dial tcp 10.0.0.12:6443: connect: connection refused" node="localhost" Mar 17 18:33:32.393939 kubelet[1853]: E0317 18:33:32.393833 1853 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 17 18:33:32.468520 kubelet[1853]: E0317 18:33:32.468454 
1853 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.12:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.12:6443: connect: connection refused" interval="400ms" Mar 17 18:33:32.568952 kubelet[1853]: I0317 18:33:32.568895 1853 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 18:33:32.569479 kubelet[1853]: E0317 18:33:32.569414 1853 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.12:6443/api/v1/nodes\": dial tcp 10.0.0.12:6443: connect: connection refused" node="localhost" Mar 17 18:33:32.594561 kubelet[1853]: E0317 18:33:32.594511 1853 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 17 18:33:32.830132 kubelet[1853]: I0317 18:33:32.829736 1853 policy_none.go:49] "None policy: Start" Mar 17 18:33:32.830794 kubelet[1853]: I0317 18:33:32.830768 1853 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 18:33:32.830850 kubelet[1853]: I0317 18:33:32.830808 1853 state_mem.go:35] "Initializing new in-memory state store" Mar 17 18:33:32.837487 kubelet[1853]: I0317 18:33:32.837450 1853 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 18:33:32.836000 audit[1853]: AVC avc: denied { mac_admin } for pid=1853 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:32.836000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:33:32.836000 audit[1853]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000c244e0 a1=c0008b8108 a2=c000c244b0 a3=25 items=0 ppid=1 pid=1853 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:32.836000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:33:32.837747 kubelet[1853]: I0317 18:33:32.837550 1853 server.go:88] "Unprivileged containerized plugins might not work. 
Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Mar 17 18:33:32.837747 kubelet[1853]: I0317 18:33:32.837672 1853 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 18:33:32.837813 kubelet[1853]: I0317 18:33:32.837796 1853 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 18:33:32.840644 kubelet[1853]: E0317 18:33:32.840616 1853 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found" Mar 17 18:33:32.869397 kubelet[1853]: E0317 18:33:32.869338 1853 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.12:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.12:6443: connect: connection refused" interval="800ms" Mar 17 18:33:32.971566 kubelet[1853]: I0317 18:33:32.971528 1853 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 18:33:32.971983 kubelet[1853]: E0317 18:33:32.971940 1853 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.12:6443/api/v1/nodes\": dial tcp 10.0.0.12:6443: connect: connection refused" node="localhost" Mar 17 18:33:32.995055 kubelet[1853]: I0317 18:33:32.994980 1853 topology_manager.go:215] "Topology Admit Handler" podUID="aa5aaa3c674e8fafcd7a3edee9afeb2e" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 17 18:33:32.996270 kubelet[1853]: I0317 18:33:32.996246 1853 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 17 18:33:32.996900 kubelet[1853]: I0317 18:33:32.996871 1853 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 17 18:33:33.049095 kubelet[1853]: W0317 18:33:33.049010 1853 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.0.12:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:33.049154 kubelet[1853]: E0317 18:33:33.049104 1853 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.0.0.12:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:33.071402 kubelet[1853]: I0317 18:33:33.071369 1853 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aa5aaa3c674e8fafcd7a3edee9afeb2e-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"aa5aaa3c674e8fafcd7a3edee9afeb2e\") " pod="kube-system/kube-apiserver-localhost" Mar 17 18:33:33.071480 kubelet[1853]: I0317 18:33:33.071408 1853 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:33.071480 kubelet[1853]: I0317 18:33:33.071442 1853 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:33.071533 kubelet[1853]: I0317 18:33:33.071478 1853 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:33.071533 kubelet[1853]: I0317 18:33:33.071512 1853 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:33.071580 kubelet[1853]: I0317 18:33:33.071553 1853 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 17 18:33:33.071610 kubelet[1853]: I0317 18:33:33.071598 1853 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aa5aaa3c674e8fafcd7a3edee9afeb2e-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"aa5aaa3c674e8fafcd7a3edee9afeb2e\") " pod="kube-system/kube-apiserver-localhost" Mar 17 18:33:33.071691 kubelet[1853]: I0317 18:33:33.071653 1853 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aa5aaa3c674e8fafcd7a3edee9afeb2e-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"aa5aaa3c674e8fafcd7a3edee9afeb2e\") " pod="kube-system/kube-apiserver-localhost" Mar 17 18:33:33.071723 kubelet[1853]: I0317 18:33:33.071707 1853 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:33.182594 kubelet[1853]: W0317 18:33:33.182429 1853 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.0.12:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:33.182594 kubelet[1853]: E0317 18:33:33.182477 1853 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.0.0.12:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:33.300490 kubelet[1853]: E0317 18:33:33.300445 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 
1.0.0.1 8.8.8.8" Mar 17 18:33:33.301133 env[1305]: time="2025-03-17T18:33:33.301092804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:aa5aaa3c674e8fafcd7a3edee9afeb2e,Namespace:kube-system,Attempt:0,}" Mar 17 18:33:33.302203 kubelet[1853]: E0317 18:33:33.302173 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:33.302595 env[1305]: time="2025-03-17T18:33:33.302565737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,}" Mar 17 18:33:33.303759 kubelet[1853]: E0317 18:33:33.303729 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:33.304099 env[1305]: time="2025-03-17T18:33:33.304065068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,}" Mar 17 18:33:33.480561 kubelet[1853]: W0317 18:33:33.480410 1853 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.0.12:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:33.480561 kubelet[1853]: E0317 18:33:33.480476 1853 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.0.0.12:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:33.670742 kubelet[1853]: E0317 18:33:33.670683 1853 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.12:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.12:6443: connect: connection refused" interval="1.6s" Mar 17 18:33:33.745483 kubelet[1853]: W0317 18:33:33.745316 1853 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.0.0.12:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:33.745483 kubelet[1853]: E0317 18:33:33.745401 1853 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.0.0.12:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:33.773682 kubelet[1853]: I0317 18:33:33.773645 1853 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 18:33:33.774033 kubelet[1853]: E0317 18:33:33.773983 1853 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.0.0.12:6443/api/v1/nodes\": dial tcp 10.0.0.12:6443: connect: connection refused" node="localhost" Mar 17 18:33:33.835053 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4236237313.mount: Deactivated successfully. 
Mar 17 18:33:33.839934 env[1305]: time="2025-03-17T18:33:33.839844867Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.843562 env[1305]: time="2025-03-17T18:33:33.843518697Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.844949 env[1305]: time="2025-03-17T18:33:33.844901209Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.845541 env[1305]: time="2025-03-17T18:33:33.845519980Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.847988 env[1305]: time="2025-03-17T18:33:33.847932484Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.849096 env[1305]: time="2025-03-17T18:33:33.849068695Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.850514 env[1305]: time="2025-03-17T18:33:33.850480873Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.851827 env[1305]: time="2025-03-17T18:33:33.851784508Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.853041 env[1305]: time="2025-03-17T18:33:33.853017170Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.854823 env[1305]: time="2025-03-17T18:33:33.854775116Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.855438 env[1305]: time="2025-03-17T18:33:33.855409877Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.856936 env[1305]: time="2025-03-17T18:33:33.856894130Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:33.902065 env[1305]: time="2025-03-17T18:33:33.901959374Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:33:33.902340 env[1305]: time="2025-03-17T18:33:33.902033352Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:33:33.902340 env[1305]: time="2025-03-17T18:33:33.902046798Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:33:33.902340 env[1305]: time="2025-03-17T18:33:33.902279233Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/89074e9749e4bdacc6656d94b513fd531832aa627c49f4fb57d6148655e89aa0 pid=1905 runtime=io.containerd.runc.v2 Mar 17 18:33:33.909990 env[1305]: time="2025-03-17T18:33:33.909763088Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:33:33.909990 env[1305]: time="2025-03-17T18:33:33.909818512Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:33:33.909990 env[1305]: time="2025-03-17T18:33:33.909832779Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:33:33.910266 env[1305]: time="2025-03-17T18:33:33.910192904Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/38145f2273c66aebfc2c73f253dd5d3da89c8faa69fe1fdc159afce0b32c45e4 pid=1904 runtime=io.containerd.runc.v2 Mar 17 18:33:33.922040 env[1305]: time="2025-03-17T18:33:33.920894293Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:33:33.922040 env[1305]: time="2025-03-17T18:33:33.920966308Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:33:33.922040 env[1305]: time="2025-03-17T18:33:33.921011203Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:33:33.922040 env[1305]: time="2025-03-17T18:33:33.921206419Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/239e1b6a28554fb7c2eb7420bd0c1c411f0efe4c83d7e7af38983d118b9fd261 pid=1928 runtime=io.containerd.runc.v2 Mar 17 18:33:34.117629 env[1305]: time="2025-03-17T18:33:34.117507860Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:aa5aaa3c674e8fafcd7a3edee9afeb2e,Namespace:kube-system,Attempt:0,} returns sandbox id \"38145f2273c66aebfc2c73f253dd5d3da89c8faa69fe1fdc159afce0b32c45e4\"" Mar 17 18:33:34.119338 kubelet[1853]: E0317 18:33:34.119311 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:34.122014 env[1305]: time="2025-03-17T18:33:34.121984924Z" level=info msg="CreateContainer within sandbox \"38145f2273c66aebfc2c73f253dd5d3da89c8faa69fe1fdc159afce0b32c45e4\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 17 18:33:34.126162 env[1305]: time="2025-03-17T18:33:34.126123559Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:d79ab404294384d4bcc36fb5b5509bbb,Namespace:kube-system,Attempt:0,} returns sandbox id \"239e1b6a28554fb7c2eb7420bd0c1c411f0efe4c83d7e7af38983d118b9fd261\"" Mar 17 18:33:34.126840 kubelet[1853]: E0317 18:33:34.126820 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:34.128996 env[1305]: time="2025-03-17T18:33:34.128972667Z" level=info msg="CreateContainer within sandbox \"239e1b6a28554fb7c2eb7420bd0c1c411f0efe4c83d7e7af38983d118b9fd261\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 17 18:33:34.129345 env[1305]: time="2025-03-17T18:33:34.129322988Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:23a18e2dc14f395c5f1bea711a5a9344,Namespace:kube-system,Attempt:0,} returns sandbox id \"89074e9749e4bdacc6656d94b513fd531832aa627c49f4fb57d6148655e89aa0\"" Mar 17 18:33:34.130075 kubelet[1853]: E0317 18:33:34.130055 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:34.131854 env[1305]: time="2025-03-17T18:33:34.131816215Z" level=info msg="CreateContainer within sandbox \"89074e9749e4bdacc6656d94b513fd531832aa627c49f4fb57d6148655e89aa0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 17 18:33:34.149934 env[1305]: time="2025-03-17T18:33:34.149885871Z" level=info msg="CreateContainer within sandbox \"38145f2273c66aebfc2c73f253dd5d3da89c8faa69fe1fdc159afce0b32c45e4\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"25ce7f44e6983705516914896c83b2fdb75cbea9f267d600b288c6f9aadcc167\"" Mar 17 18:33:34.150985 env[1305]: time="2025-03-17T18:33:34.150871718Z" level=info msg="StartContainer for \"25ce7f44e6983705516914896c83b2fdb75cbea9f267d600b288c6f9aadcc167\"" Mar 17 18:33:34.158244 env[1305]: time="2025-03-17T18:33:34.158184354Z" level=info msg="CreateContainer within sandbox \"239e1b6a28554fb7c2eb7420bd0c1c411f0efe4c83d7e7af38983d118b9fd261\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns 
container id \"c6468fa5a4913ed8c456b82115f908e44074eb030798e7d7d0afc90a30571b37\"" Mar 17 18:33:34.158809 env[1305]: time="2025-03-17T18:33:34.158761235Z" level=info msg="StartContainer for \"c6468fa5a4913ed8c456b82115f908e44074eb030798e7d7d0afc90a30571b37\"" Mar 17 18:33:34.162723 env[1305]: time="2025-03-17T18:33:34.162653331Z" level=info msg="CreateContainer within sandbox \"89074e9749e4bdacc6656d94b513fd531832aa627c49f4fb57d6148655e89aa0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"ee527d1b172cdca0d8335787ad83274a784d17ba931c4500503735ab77694534\"" Mar 17 18:33:34.163539 env[1305]: time="2025-03-17T18:33:34.163496240Z" level=info msg="StartContainer for \"ee527d1b172cdca0d8335787ad83274a784d17ba931c4500503735ab77694534\"" Mar 17 18:33:34.212640 kubelet[1853]: E0317 18:33:34.212596 1853 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.0.0.12:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.0.0.12:6443: connect: connection refused Mar 17 18:33:34.262941 env[1305]: time="2025-03-17T18:33:34.254131912Z" level=info msg="StartContainer for \"c6468fa5a4913ed8c456b82115f908e44074eb030798e7d7d0afc90a30571b37\" returns successfully" Mar 17 18:33:34.262941 env[1305]: time="2025-03-17T18:33:34.259324736Z" level=info msg="StartContainer for \"25ce7f44e6983705516914896c83b2fdb75cbea9f267d600b288c6f9aadcc167\" returns successfully" Mar 17 18:33:34.291938 env[1305]: time="2025-03-17T18:33:34.291841899Z" level=info msg="StartContainer for \"ee527d1b172cdca0d8335787ad83274a784d17ba931c4500503735ab77694534\" returns successfully" Mar 17 18:33:34.302354 kubelet[1853]: E0317 18:33:34.302271 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:34.304545 kubelet[1853]: E0317 18:33:34.304493 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:34.312816 kubelet[1853]: E0317 18:33:34.312799 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:35.323383 kubelet[1853]: E0317 18:33:35.323340 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:35.323888 kubelet[1853]: E0317 18:33:35.323862 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:35.375520 kubelet[1853]: I0317 18:33:35.375472 1853 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 18:33:35.480800 kubelet[1853]: E0317 18:33:35.480759 1853 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost" Mar 17 18:33:35.583262 kubelet[1853]: I0317 18:33:35.583127 1853 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 17 18:33:35.592824 kubelet[1853]: E0317 18:33:35.592795 1853 kubelet_node_status.go:462] "Error getting the current node 
from lister" err="node \"localhost\" not found" Mar 17 18:33:35.694193 kubelet[1853]: E0317 18:33:35.694133 1853 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 18:33:35.794905 kubelet[1853]: E0317 18:33:35.794845 1853 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 18:33:35.896017 kubelet[1853]: E0317 18:33:35.895882 1853 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 18:33:35.996392 kubelet[1853]: E0317 18:33:35.996363 1853 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 18:33:36.096934 kubelet[1853]: E0317 18:33:36.096882 1853 kubelet_node_status.go:462] "Error getting the current node from lister" err="node \"localhost\" not found" Mar 17 18:33:36.223993 kubelet[1853]: I0317 18:33:36.223865 1853 apiserver.go:52] "Watching apiserver" Mar 17 18:33:36.268000 kubelet[1853]: I0317 18:33:36.267969 1853 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 18:33:36.329431 kubelet[1853]: E0317 18:33:36.329383 1853 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost" Mar 17 18:33:36.329848 kubelet[1853]: E0317 18:33:36.329822 1853 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:37.593375 systemd[1]: Reloading. Mar 17 18:33:37.659838 /usr/lib/systemd/system-generators/torcx-generator[2151]: time="2025-03-17T18:33:37Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 18:33:37.659882 /usr/lib/systemd/system-generators/torcx-generator[2151]: time="2025-03-17T18:33:37Z" level=info msg="torcx already run" Mar 17 18:33:37.742718 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:33:37.742735 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:33:37.763567 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:33:37.840200 systemd[1]: Stopping kubelet.service... Mar 17 18:33:37.863417 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 18:33:37.863784 systemd[1]: Stopped kubelet.service. Mar 17 18:33:37.862000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:33:37.864781 kernel: kauditd_printk_skb: 43 callbacks suppressed Mar 17 18:33:37.864832 kernel: audit: type=1131 audit(1742236417.862:215): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:37.865868 systemd[1]: Starting kubelet.service... Mar 17 18:33:37.956344 systemd[1]: Started kubelet.service. Mar 17 18:33:37.956000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:37.962998 kernel: audit: type=1130 audit(1742236417.956:216): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:38.013003 kubelet[2208]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:33:38.013003 kubelet[2208]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 18:33:38.013003 kubelet[2208]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:33:38.013423 kubelet[2208]: I0317 18:33:38.013041 2208 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 18:33:38.017357 kubelet[2208]: I0317 18:33:38.017327 2208 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 18:33:38.017357 kubelet[2208]: I0317 18:33:38.017348 2208 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 18:33:38.017539 kubelet[2208]: I0317 18:33:38.017515 2208 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 18:33:38.018872 kubelet[2208]: I0317 18:33:38.018844 2208 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 17 18:33:38.020395 kubelet[2208]: I0317 18:33:38.020287 2208 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 18:33:38.028502 kubelet[2208]: I0317 18:33:38.028473 2208 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 17 18:33:38.028982 kubelet[2208]: I0317 18:33:38.028941 2208 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 18:33:38.029292 kubelet[2208]: I0317 18:33:38.028979 2208 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 18:33:38.029425 kubelet[2208]: I0317 18:33:38.029299 2208 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 18:33:38.029425 kubelet[2208]: I0317 18:33:38.029309 2208 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 18:33:38.029425 kubelet[2208]: I0317 18:33:38.029359 2208 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:33:38.029534 kubelet[2208]: I0317 18:33:38.029470 2208 kubelet.go:400] "Attempting to sync node with API server" Mar 17 18:33:38.029534 kubelet[2208]: I0317 18:33:38.029498 2208 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 18:33:38.029534 kubelet[2208]: I0317 18:33:38.029526 2208 kubelet.go:312] "Adding apiserver pod source" Mar 17 18:33:38.029614 kubelet[2208]: I0317 18:33:38.029542 2208 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 18:33:38.030320 kubelet[2208]: I0317 18:33:38.030292 2208 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Mar 17 18:33:38.030570 kubelet[2208]: I0317 18:33:38.030549 2208 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 18:33:38.047185 kernel: audit: type=1400 audit(1742236418.032:217): avc: denied { mac_admin } for pid=2208 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:38.047289 kernel: audit: type=1401 audit(1742236418.032:217): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:33:38.047306 kernel: audit: type=1300 
audit(1742236418.032:217): arch=c000003e syscall=188 success=no exit=-22 a0=c000d3ab40 a1=c000da0210 a2=c000d3ab10 a3=25 items=0 ppid=1 pid=2208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:38.047323 kernel: audit: type=1327 audit(1742236418.032:217): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:33:38.032000 audit[2208]: AVC avc: denied { mac_admin } for pid=2208 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:38.032000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:33:38.032000 audit[2208]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000d3ab40 a1=c000da0210 a2=c000d3ab10 a3=25 items=0 ppid=1 pid=2208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:38.032000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.031305 2208 server.go:1264] "Started kubelet" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.033077 2208 kubelet.go:1419] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.033127 2208 kubelet.go:1423] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.033167 2208 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.038958 2208 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.040108 2208 server.go:455] "Adding debug handlers to kubelet server" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.041421 2208 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.043496 2208 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.045304 2208 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.046677 2208 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv4" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.047329 2208 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.047466 2208 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.047484 2208 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 18:33:38.047546 kubelet[2208]: I0317 18:33:38.047504 2208 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 18:33:38.047889 kubelet[2208]: E0317 18:33:38.047575 2208 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 18:33:38.032000 audit[2208]: AVC avc: denied { mac_admin } for pid=2208 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:38.051938 kernel: audit: type=1400 audit(1742236418.032:218): avc: denied { mac_admin } for pid=2208 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:38.052028 kernel: audit: type=1401 audit(1742236418.032:218): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:33:38.032000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:33:38.052243 kubelet[2208]: I0317 18:33:38.047469 2208 reconciler.go:26] "Reconciler: start to sync state" Mar 17 18:33:38.053521 kubelet[2208]: I0317 18:33:38.053502 2208 factory.go:221] Registration of the systemd container factory successfully Mar 17 18:33:38.032000 audit[2208]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000da2340 a1=c000da0228 a2=c000d3abd0 a3=25 items=0 ppid=1 pid=2208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:38.057069 kubelet[2208]: I0317 18:33:38.053679 2208 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 18:33:38.057069 kubelet[2208]: I0317 18:33:38.055219 2208 factory.go:221] Registration of the containerd container factory successfully Mar 17 18:33:38.057069 kubelet[2208]: E0317 18:33:38.056900 2208 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 18:33:38.058054 kernel: audit: type=1300 audit(1742236418.032:218): arch=c000003e syscall=188 success=no exit=-22 a0=c000da2340 a1=c000da0228 a2=c000d3abd0 a3=25 items=0 ppid=1 pid=2208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:38.058114 kernel: audit: type=1327 audit(1742236418.032:218): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:33:38.032000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:33:38.104808 kubelet[2208]: I0317 18:33:38.104754 2208 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 18:33:38.104808 kubelet[2208]: I0317 18:33:38.104787 2208 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 18:33:38.104808 kubelet[2208]: I0317 18:33:38.104809 2208 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:33:38.105356 kubelet[2208]: I0317 18:33:38.105325 2208 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 17 18:33:38.105399 kubelet[2208]: I0317 18:33:38.105348 2208 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 17 18:33:38.105399 kubelet[2208]: I0317 18:33:38.105370 2208 policy_none.go:49] "None policy: Start" Mar 17 18:33:38.108498 kubelet[2208]: I0317 18:33:38.108478 2208 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 18:33:38.108658 kubelet[2208]: I0317 18:33:38.108648 2208 state_mem.go:35] "Initializing new in-memory state store" Mar 17 18:33:38.109001 kubelet[2208]: I0317 18:33:38.108944 2208 state_mem.go:75] "Updated machine memory state" Mar 17 18:33:38.110200 kubelet[2208]: I0317 18:33:38.110166 2208 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 18:33:38.109000 audit[2208]: AVC avc: denied { mac_admin } for pid=2208 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:33:38.109000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:33:38.109000 audit[2208]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000cbce10 a1=c000633200 a2=c000cbcde0 a3=25 items=0 ppid=1 pid=2208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:38.109000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:33:38.110536 kubelet[2208]: I0317 18:33:38.110233 2208 server.go:88] "Unprivileged containerized plugins might not work. 
Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Mar 17 18:33:38.110536 kubelet[2208]: I0317 18:33:38.110360 2208 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 18:33:38.110536 kubelet[2208]: I0317 18:33:38.110454 2208 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 18:33:38.148817 kubelet[2208]: I0317 18:33:38.148675 2208 topology_manager.go:215] "Topology Admit Handler" podUID="d79ab404294384d4bcc36fb5b5509bbb" podNamespace="kube-system" podName="kube-scheduler-localhost" Mar 17 18:33:38.148975 kubelet[2208]: I0317 18:33:38.148825 2208 topology_manager.go:215] "Topology Admit Handler" podUID="aa5aaa3c674e8fafcd7a3edee9afeb2e" podNamespace="kube-system" podName="kube-apiserver-localhost" Mar 17 18:33:38.148975 kubelet[2208]: I0317 18:33:38.148882 2208 topology_manager.go:215] "Topology Admit Handler" podUID="23a18e2dc14f395c5f1bea711a5a9344" podNamespace="kube-system" podName="kube-controller-manager-localhost" Mar 17 18:33:38.152064 kubelet[2208]: I0317 18:33:38.152029 2208 kubelet_node_status.go:73] "Attempting to register node" node="localhost" Mar 17 18:33:38.156908 kubelet[2208]: I0317 18:33:38.156862 2208 kubelet_node_status.go:112] "Node was previously registered" node="localhost" Mar 17 18:33:38.157032 kubelet[2208]: I0317 18:33:38.156950 2208 kubelet_node_status.go:76] "Successfully registered node" node="localhost" Mar 17 18:33:38.254491 kubelet[2208]: I0317 18:33:38.254426 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:38.254491 kubelet[2208]: I0317 18:33:38.254469 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/aa5aaa3c674e8fafcd7a3edee9afeb2e-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"aa5aaa3c674e8fafcd7a3edee9afeb2e\") " pod="kube-system/kube-apiserver-localhost" Mar 17 18:33:38.254491 kubelet[2208]: I0317 18:33:38.254496 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:38.254744 kubelet[2208]: I0317 18:33:38.254518 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/aa5aaa3c674e8fafcd7a3edee9afeb2e-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"aa5aaa3c674e8fafcd7a3edee9afeb2e\") " pod="kube-system/kube-apiserver-localhost" Mar 17 18:33:38.254744 kubelet[2208]: I0317 18:33:38.254552 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:38.254744 
kubelet[2208]: I0317 18:33:38.254670 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:38.254744 kubelet[2208]: I0317 18:33:38.254714 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/23a18e2dc14f395c5f1bea711a5a9344-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"23a18e2dc14f395c5f1bea711a5a9344\") " pod="kube-system/kube-controller-manager-localhost" Mar 17 18:33:38.254744 kubelet[2208]: I0317 18:33:38.254738 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d79ab404294384d4bcc36fb5b5509bbb-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"d79ab404294384d4bcc36fb5b5509bbb\") " pod="kube-system/kube-scheduler-localhost" Mar 17 18:33:38.254956 kubelet[2208]: I0317 18:33:38.254753 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/aa5aaa3c674e8fafcd7a3edee9afeb2e-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"aa5aaa3c674e8fafcd7a3edee9afeb2e\") " pod="kube-system/kube-apiserver-localhost" Mar 17 18:33:38.455299 kubelet[2208]: E0317 18:33:38.455159 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:38.455474 kubelet[2208]: E0317 18:33:38.455434 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:38.456007 kubelet[2208]: E0317 18:33:38.455982 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:39.030433 kubelet[2208]: I0317 18:33:39.030368 2208 apiserver.go:52] "Watching apiserver" Mar 17 18:33:39.047476 kubelet[2208]: I0317 18:33:39.047444 2208 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 18:33:39.094079 kubelet[2208]: E0317 18:33:39.094050 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:39.095040 kubelet[2208]: E0317 18:33:39.095010 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:39.118261 kubelet[2208]: E0317 18:33:39.118212 2208 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost" Mar 17 18:33:39.118652 kubelet[2208]: E0317 18:33:39.118630 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:39.150551 kubelet[2208]: I0317 18:33:39.150482 2208 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.150462427 podStartE2EDuration="1.150462427s" podCreationTimestamp="2025-03-17 18:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:33:39.150252713 +0000 UTC m=+1.187842324" watchObservedRunningTime="2025-03-17 18:33:39.150462427 +0000 UTC m=+1.188052037" Mar 17 18:33:39.202810 kubelet[2208]: I0317 18:33:39.202746 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.202726077 podStartE2EDuration="1.202726077s" podCreationTimestamp="2025-03-17 18:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:33:39.171660565 +0000 UTC m=+1.209250175" watchObservedRunningTime="2025-03-17 18:33:39.202726077 +0000 UTC m=+1.240315687" Mar 17 18:33:40.095074 kubelet[2208]: E0317 18:33:40.095004 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:41.095810 kubelet[2208]: E0317 18:33:41.095766 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:43.460692 sudo[1470]: pam_unix(sudo:session): session closed for user root Mar 17 18:33:43.459000 audit[1470]: USER_END pid=1470 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:43.461771 kernel: kauditd_printk_skb: 4 callbacks suppressed Mar 17 18:33:43.461830 kernel: audit: type=1106 audit(1742236423.459:220): pid=1470 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:43.462805 sshd[1464]: pam_unix(sshd:session): session closed for user core Mar 17 18:33:43.465750 systemd[1]: sshd@6-10.0.0.12:22-10.0.0.1:58486.service: Deactivated successfully. Mar 17 18:33:43.460000 audit[1470]: CRED_DISP pid=1470 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:33:43.466746 systemd[1]: session-7.scope: Deactivated successfully. Mar 17 18:33:43.467224 systemd-logind[1287]: Session 7 logged out. Waiting for processes to exit. Mar 17 18:33:43.468056 systemd-logind[1287]: Removed session 7. Mar 17 18:33:43.469769 kernel: audit: type=1104 audit(1742236423.460:221): pid=1470 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Mar 17 18:33:43.469853 kernel: audit: type=1106 audit(1742236423.463:222): pid=1464 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:43.463000 audit[1464]: USER_END pid=1464 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:43.463000 audit[1464]: CRED_DISP pid=1464 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:43.477984 kernel: audit: type=1104 audit(1742236423.463:223): pid=1464 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:33:43.478045 kernel: audit: type=1131 audit(1742236423.465:224): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.0.12:22-10.0.0.1:58486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:43.465000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.0.12:22-10.0.0.1:58486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:33:46.558000 kubelet[2208]: E0317 18:33:46.557904 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:46.568875 kubelet[2208]: I0317 18:33:46.568788 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=8.568771102 podStartE2EDuration="8.568771102s" podCreationTimestamp="2025-03-17 18:33:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:33:39.202875765 +0000 UTC m=+1.240465365" watchObservedRunningTime="2025-03-17 18:33:46.568771102 +0000 UTC m=+8.606360712" Mar 17 18:33:47.103729 kubelet[2208]: E0317 18:33:47.103699 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:47.469143 update_engine[1292]: I0317 18:33:47.468986 1292 update_attempter.cc:509] Updating boot flags... 
Mar 17 18:33:48.189493 kubelet[2208]: E0317 18:33:48.189455 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:49.106956 kubelet[2208]: E0317 18:33:49.106735 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:49.579676 kubelet[2208]: E0317 18:33:49.579301 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:52.762159 kubelet[2208]: I0317 18:33:52.762090 2208 topology_manager.go:215] "Topology Admit Handler" podUID="c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6" podNamespace="kube-system" podName="kube-proxy-cgqn7" Mar 17 18:33:52.832517 kubelet[2208]: I0317 18:33:52.832457 2208 topology_manager.go:215] "Topology Admit Handler" podUID="421cc094-8aa3-4cf2-a1d0-04707057618e" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-r2jlm" Mar 17 18:33:52.847844 kubelet[2208]: I0317 18:33:52.847777 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/421cc094-8aa3-4cf2-a1d0-04707057618e-var-lib-calico\") pod \"tigera-operator-7bc55997bb-r2jlm\" (UID: \"421cc094-8aa3-4cf2-a1d0-04707057618e\") " pod="tigera-operator/tigera-operator-7bc55997bb-r2jlm" Mar 17 18:33:52.847844 kubelet[2208]: I0317 18:33:52.847830 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6-lib-modules\") pod \"kube-proxy-cgqn7\" (UID: \"c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6\") " pod="kube-system/kube-proxy-cgqn7" Mar 17 18:33:52.848102 kubelet[2208]: I0317 18:33:52.847860 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmnc5\" (UniqueName: \"kubernetes.io/projected/421cc094-8aa3-4cf2-a1d0-04707057618e-kube-api-access-pmnc5\") pod \"tigera-operator-7bc55997bb-r2jlm\" (UID: \"421cc094-8aa3-4cf2-a1d0-04707057618e\") " pod="tigera-operator/tigera-operator-7bc55997bb-r2jlm" Mar 17 18:33:52.848102 kubelet[2208]: I0317 18:33:52.847880 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6-kube-proxy\") pod \"kube-proxy-cgqn7\" (UID: \"c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6\") " pod="kube-system/kube-proxy-cgqn7" Mar 17 18:33:52.848102 kubelet[2208]: I0317 18:33:52.847897 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xpkf\" (UniqueName: \"kubernetes.io/projected/c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6-kube-api-access-8xpkf\") pod \"kube-proxy-cgqn7\" (UID: \"c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6\") " pod="kube-system/kube-proxy-cgqn7" Mar 17 18:33:52.848102 kubelet[2208]: I0317 18:33:52.847913 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6-xtables-lock\") pod \"kube-proxy-cgqn7\" (UID: \"c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6\") " pod="kube-system/kube-proxy-cgqn7" Mar 17 
18:33:52.877771 kubelet[2208]: I0317 18:33:52.877735 2208 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 17 18:33:52.878259 env[1305]: time="2025-03-17T18:33:52.878224048Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Mar 17 18:33:52.878557 kubelet[2208]: I0317 18:33:52.878424 2208 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 17 18:33:53.065811 kubelet[2208]: E0317 18:33:53.065661 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:53.066320 env[1305]: time="2025-03-17T18:33:53.066262748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cgqn7,Uid:c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6,Namespace:kube-system,Attempt:0,}" Mar 17 18:33:53.083941 env[1305]: time="2025-03-17T18:33:53.083844524Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:33:53.084073 env[1305]: time="2025-03-17T18:33:53.083893066Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:33:53.084073 env[1305]: time="2025-03-17T18:33:53.083903866Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:33:53.084309 env[1305]: time="2025-03-17T18:33:53.084254741Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/e0079af828fea43cfa33fd00b859d50be0ccc18a83750bf6ba0da854cb4901f6 pid=2319 runtime=io.containerd.runc.v2 Mar 17 18:33:53.121973 env[1305]: time="2025-03-17T18:33:53.121216462Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-cgqn7,Uid:c4a1cd80-3d9a-49a7-91b3-ccc7b3e302a6,Namespace:kube-system,Attempt:0,} returns sandbox id \"e0079af828fea43cfa33fd00b859d50be0ccc18a83750bf6ba0da854cb4901f6\"" Mar 17 18:33:53.122230 kubelet[2208]: E0317 18:33:53.121803 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:53.124546 env[1305]: time="2025-03-17T18:33:53.124497862Z" level=info msg="CreateContainer within sandbox \"e0079af828fea43cfa33fd00b859d50be0ccc18a83750bf6ba0da854cb4901f6\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 18:33:53.136533 env[1305]: time="2025-03-17T18:33:53.136475606Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-r2jlm,Uid:421cc094-8aa3-4cf2-a1d0-04707057618e,Namespace:tigera-operator,Attempt:0,}" Mar 17 18:33:53.147682 env[1305]: time="2025-03-17T18:33:53.147630108Z" level=info msg="CreateContainer within sandbox \"e0079af828fea43cfa33fd00b859d50be0ccc18a83750bf6ba0da854cb4901f6\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"9409d2dad1a66512981fc39b5ad224be753e779feefb51b70ea92ad8e90b1caf\"" Mar 17 18:33:53.148369 env[1305]: time="2025-03-17T18:33:53.148324024Z" level=info msg="StartContainer for \"9409d2dad1a66512981fc39b5ad224be753e779feefb51b70ea92ad8e90b1caf\"" Mar 17 18:33:53.158088 env[1305]: time="2025-03-17T18:33:53.158019231Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:33:53.158088 env[1305]: time="2025-03-17T18:33:53.158059247Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:33:53.158088 env[1305]: time="2025-03-17T18:33:53.158068825Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:33:53.158270 env[1305]: time="2025-03-17T18:33:53.158222176Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/327b41b3d5c0609429052ca27adc7fe967d45c4569e9dfb869855ac6206a0ad9 pid=2369 runtime=io.containerd.runc.v2 Mar 17 18:33:53.196234 env[1305]: time="2025-03-17T18:33:53.196174163Z" level=info msg="StartContainer for \"9409d2dad1a66512981fc39b5ad224be753e779feefb51b70ea92ad8e90b1caf\" returns successfully" Mar 17 18:33:53.210006 env[1305]: time="2025-03-17T18:33:53.209948352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-r2jlm,Uid:421cc094-8aa3-4cf2-a1d0-04707057618e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"327b41b3d5c0609429052ca27adc7fe967d45c4569e9dfb869855ac6206a0ad9\"" Mar 17 18:33:53.211849 env[1305]: time="2025-03-17T18:33:53.211798490Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Mar 17 18:33:53.261000 audit[2454]: NETFILTER_CFG table=mangle:38 family=2 entries=1 op=nft_register_chain pid=2454 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.261000 audit[2455]: NETFILTER_CFG table=mangle:39 family=10 entries=1 op=nft_register_chain pid=2455 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.267087 kernel: audit: type=1325 audit(1742236433.261:225): table=mangle:38 family=2 entries=1 op=nft_register_chain pid=2454 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.267150 kernel: audit: type=1325 audit(1742236433.261:226): table=mangle:39 family=10 entries=1 op=nft_register_chain pid=2455 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.267167 kernel: audit: type=1300 audit(1742236433.261:226): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffda55527a0 a2=0 a3=7ffda555278c items=0 ppid=2398 pid=2455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.261000 audit[2455]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffda55527a0 a2=0 a3=7ffda555278c items=0 ppid=2398 pid=2455 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.261000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:33:53.274650 kernel: audit: type=1327 audit(1742236433.261:226): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:33:53.274676 kernel: audit: type=1325 audit(1742236433.262:227): table=nat:40 family=10 entries=1 op=nft_register_chain pid=2456 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.262000 audit[2456]: NETFILTER_CFG table=nat:40 family=10 entries=1 op=nft_register_chain pid=2456 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.262000 audit[2456]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe20f865d0 a2=0 a3=7ffe20f865bc items=0 ppid=2398 pid=2456 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.281234 kernel: audit: type=1300 audit(1742236433.262:227): arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe20f865d0 a2=0 a3=7ffe20f865bc items=0 ppid=2398 pid=2456 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.281281 kernel: audit: type=1327 audit(1742236433.262:227): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 18:33:53.262000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 18:33:53.264000 audit[2457]: NETFILTER_CFG table=filter:41 family=10 entries=1 op=nft_register_chain pid=2457 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.285718 kernel: audit: type=1325 audit(1742236433.264:228): table=filter:41 family=10 entries=1 op=nft_register_chain pid=2457 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.285767 kernel: audit: type=1300 audit(1742236433.264:228): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff332fddf0 a2=0 a3=7fff332fdddc items=0 ppid=2398 pid=2457 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.264000 audit[2457]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff332fddf0 a2=0 a3=7fff332fdddc items=0 ppid=2398 pid=2457 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.290768 kernel: audit: type=1327 audit(1742236433.264:228): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 18:33:53.264000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 18:33:53.261000 audit[2454]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff6e0792c0 a2=0 a3=7fff6e0792ac items=0 ppid=2398 pid=2454 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.261000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:33:53.270000 audit[2458]: NETFILTER_CFG table=nat:42 family=2 entries=1 op=nft_register_chain pid=2458 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.270000 audit[2458]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff3e6630f0 a2=0 a3=7fff3e6630dc items=0 ppid=2398 pid=2458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.270000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 18:33:53.271000 audit[2459]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2459 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.271000 audit[2459]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe05296570 a2=0 a3=7ffe0529655c items=0 ppid=2398 pid=2459 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.271000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 18:33:53.362000 audit[2460]: NETFILTER_CFG table=filter:44 family=2 entries=1 op=nft_register_chain pid=2460 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.362000 audit[2460]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffd54b98060 a2=0 a3=7ffd54b9804c items=0 ppid=2398 pid=2460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.362000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Mar 17 18:33:53.364000 audit[2462]: NETFILTER_CFG table=filter:45 family=2 entries=1 op=nft_register_rule pid=2462 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.364000 audit[2462]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffc31190e50 a2=0 a3=7ffc31190e3c items=0 ppid=2398 pid=2462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.364000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Mar 17 18:33:53.367000 audit[2465]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2465 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.367000 audit[2465]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7fff785ff240 a2=0 a3=7fff785ff22c items=0 ppid=2398 pid=2465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.367000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Mar 17 18:33:53.368000 audit[2466]: NETFILTER_CFG table=filter:47 family=2 entries=1 op=nft_register_chain pid=2466 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.368000 audit[2466]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff562da270 a2=0 a3=7fff562da25c 
items=0 ppid=2398 pid=2466 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.368000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Mar 17 18:33:53.370000 audit[2468]: NETFILTER_CFG table=filter:48 family=2 entries=1 op=nft_register_rule pid=2468 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.370000 audit[2468]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd18378400 a2=0 a3=7ffd183783ec items=0 ppid=2398 pid=2468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.370000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Mar 17 18:33:53.371000 audit[2469]: NETFILTER_CFG table=filter:49 family=2 entries=1 op=nft_register_chain pid=2469 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.371000 audit[2469]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffa32cb890 a2=0 a3=7fffa32cb87c items=0 ppid=2398 pid=2469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.371000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Mar 17 18:33:53.373000 audit[2471]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_rule pid=2471 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.373000 audit[2471]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc431f8b90 a2=0 a3=7ffc431f8b7c items=0 ppid=2398 pid=2471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.373000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Mar 17 18:33:53.376000 audit[2474]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_rule pid=2474 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.376000 audit[2474]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc5a04ad50 a2=0 a3=7ffc5a04ad3c items=0 ppid=2398 pid=2474 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.376000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Mar 17 18:33:53.377000 audit[2475]: NETFILTER_CFG 
table=filter:52 family=2 entries=1 op=nft_register_chain pid=2475 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.377000 audit[2475]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffda0190a80 a2=0 a3=7ffda0190a6c items=0 ppid=2398 pid=2475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.377000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Mar 17 18:33:53.379000 audit[2477]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_rule pid=2477 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.379000 audit[2477]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffdaf4d9d00 a2=0 a3=7ffdaf4d9cec items=0 ppid=2398 pid=2477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.379000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Mar 17 18:33:53.380000 audit[2478]: NETFILTER_CFG table=filter:54 family=2 entries=1 op=nft_register_chain pid=2478 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.380000 audit[2478]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffc4330a700 a2=0 a3=7ffc4330a6ec items=0 ppid=2398 pid=2478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.380000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Mar 17 18:33:53.382000 audit[2480]: NETFILTER_CFG table=filter:55 family=2 entries=1 op=nft_register_rule pid=2480 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.382000 audit[2480]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffda471a910 a2=0 a3=7ffda471a8fc items=0 ppid=2398 pid=2480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.382000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 18:33:53.385000 audit[2483]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_rule pid=2483 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.385000 audit[2483]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffeca6a93e0 a2=0 a3=7ffeca6a93cc items=0 ppid=2398 pid=2483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.385000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 18:33:53.389000 audit[2486]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_rule pid=2486 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.389000 audit[2486]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc584db8c0 a2=0 a3=7ffc584db8ac items=0 ppid=2398 pid=2486 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.389000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Mar 17 18:33:53.390000 audit[2487]: NETFILTER_CFG table=nat:58 family=2 entries=1 op=nft_register_chain pid=2487 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.390000 audit[2487]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcca72de90 a2=0 a3=7ffcca72de7c items=0 ppid=2398 pid=2487 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.390000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Mar 17 18:33:53.392000 audit[2489]: NETFILTER_CFG table=nat:59 family=2 entries=1 op=nft_register_rule pid=2489 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.392000 audit[2489]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc3adc4280 a2=0 a3=7ffc3adc426c items=0 ppid=2398 pid=2489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.392000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:33:53.395000 audit[2492]: NETFILTER_CFG table=nat:60 family=2 entries=1 op=nft_register_rule pid=2492 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.395000 audit[2492]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff9d90d0d0 a2=0 a3=7fff9d90d0bc items=0 ppid=2398 pid=2492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.395000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:33:53.396000 audit[2493]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=2493 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.396000 audit[2493]: SYSCALL arch=c000003e syscall=46 
success=yes exit=100 a0=3 a1=7ffd9bfbef70 a2=0 a3=7ffd9bfbef5c items=0 ppid=2398 pid=2493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.396000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Mar 17 18:33:53.398000 audit[2495]: NETFILTER_CFG table=nat:62 family=2 entries=1 op=nft_register_rule pid=2495 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:33:53.398000 audit[2495]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffc7362cc00 a2=0 a3=7ffc7362cbec items=0 ppid=2398 pid=2495 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.398000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Mar 17 18:33:53.415000 audit[2501]: NETFILTER_CFG table=filter:63 family=2 entries=8 op=nft_register_rule pid=2501 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:53.415000 audit[2501]: SYSCALL arch=c000003e syscall=46 success=yes exit=5164 a0=3 a1=7fff6b7e8c90 a2=0 a3=7fff6b7e8c7c items=0 ppid=2398 pid=2501 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.415000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:53.423000 audit[2501]: NETFILTER_CFG table=nat:64 family=2 entries=14 op=nft_register_chain pid=2501 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:53.423000 audit[2501]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7fff6b7e8c90 a2=0 a3=7fff6b7e8c7c items=0 ppid=2398 pid=2501 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.423000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:53.424000 audit[2507]: NETFILTER_CFG table=filter:65 family=10 entries=1 op=nft_register_chain pid=2507 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.424000 audit[2507]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffed97ea120 a2=0 a3=7ffed97ea10c items=0 ppid=2398 pid=2507 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.424000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Mar 17 18:33:53.426000 audit[2509]: NETFILTER_CFG table=filter:66 family=10 entries=2 op=nft_register_chain pid=2509 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.426000 audit[2509]: SYSCALL arch=c000003e syscall=46 success=yes 
exit=836 a0=3 a1=7ffd8312b280 a2=0 a3=7ffd8312b26c items=0 ppid=2398 pid=2509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.426000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Mar 17 18:33:53.429000 audit[2512]: NETFILTER_CFG table=filter:67 family=10 entries=2 op=nft_register_chain pid=2512 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.429000 audit[2512]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7fff30e478f0 a2=0 a3=7fff30e478dc items=0 ppid=2398 pid=2512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.429000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Mar 17 18:33:53.430000 audit[2513]: NETFILTER_CFG table=filter:68 family=10 entries=1 op=nft_register_chain pid=2513 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.430000 audit[2513]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffeae9cdfc0 a2=0 a3=7ffeae9cdfac items=0 ppid=2398 pid=2513 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.430000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Mar 17 18:33:53.432000 audit[2515]: NETFILTER_CFG table=filter:69 family=10 entries=1 op=nft_register_rule pid=2515 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.432000 audit[2515]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffeb538d960 a2=0 a3=7ffeb538d94c items=0 ppid=2398 pid=2515 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.432000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Mar 17 18:33:53.433000 audit[2516]: NETFILTER_CFG table=filter:70 family=10 entries=1 op=nft_register_chain pid=2516 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.433000 audit[2516]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffde588a010 a2=0 a3=7ffde5889ffc items=0 ppid=2398 pid=2516 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.433000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Mar 17 18:33:53.435000 audit[2518]: NETFILTER_CFG table=filter:71 family=10 entries=1 op=nft_register_rule pid=2518 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.435000 audit[2518]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffca4cd3320 a2=0 a3=7ffca4cd330c items=0 ppid=2398 pid=2518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.435000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Mar 17 18:33:53.438000 audit[2521]: NETFILTER_CFG table=filter:72 family=10 entries=2 op=nft_register_chain pid=2521 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.438000 audit[2521]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7ffcaee4da70 a2=0 a3=7ffcaee4da5c items=0 ppid=2398 pid=2521 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.438000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Mar 17 18:33:53.439000 audit[2522]: NETFILTER_CFG table=filter:73 family=10 entries=1 op=nft_register_chain pid=2522 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.439000 audit[2522]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe10095d10 a2=0 a3=7ffe10095cfc items=0 ppid=2398 pid=2522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.439000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Mar 17 18:33:53.441000 audit[2524]: NETFILTER_CFG table=filter:74 family=10 entries=1 op=nft_register_rule pid=2524 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.441000 audit[2524]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff013b9970 a2=0 a3=7fff013b995c items=0 ppid=2398 pid=2524 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.441000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Mar 17 18:33:53.442000 audit[2525]: NETFILTER_CFG table=filter:75 family=10 entries=1 op=nft_register_chain pid=2525 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.442000 audit[2525]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe89a599d0 a2=0 a3=7ffe89a599bc items=0 
ppid=2398 pid=2525 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.442000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Mar 17 18:33:53.444000 audit[2527]: NETFILTER_CFG table=filter:76 family=10 entries=1 op=nft_register_rule pid=2527 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.444000 audit[2527]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffeecb513a0 a2=0 a3=7ffeecb5138c items=0 ppid=2398 pid=2527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.444000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 18:33:53.447000 audit[2530]: NETFILTER_CFG table=filter:77 family=10 entries=1 op=nft_register_rule pid=2530 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.447000 audit[2530]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffee49370d0 a2=0 a3=7ffee49370bc items=0 ppid=2398 pid=2530 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.447000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Mar 17 18:33:53.450000 audit[2533]: NETFILTER_CFG table=filter:78 family=10 entries=1 op=nft_register_rule pid=2533 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.450000 audit[2533]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffc00fc0d00 a2=0 a3=7ffc00fc0cec items=0 ppid=2398 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.450000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Mar 17 18:33:53.451000 audit[2534]: NETFILTER_CFG table=nat:79 family=10 entries=1 op=nft_register_chain pid=2534 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.451000 audit[2534]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff1a9d7230 a2=0 a3=7fff1a9d721c items=0 ppid=2398 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.451000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Mar 17 18:33:53.453000 
audit[2536]: NETFILTER_CFG table=nat:80 family=10 entries=2 op=nft_register_chain pid=2536 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.453000 audit[2536]: SYSCALL arch=c000003e syscall=46 success=yes exit=600 a0=3 a1=7ffc11fdb9e0 a2=0 a3=7ffc11fdb9cc items=0 ppid=2398 pid=2536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.453000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:33:53.456000 audit[2539]: NETFILTER_CFG table=nat:81 family=10 entries=2 op=nft_register_chain pid=2539 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.456000 audit[2539]: SYSCALL arch=c000003e syscall=46 success=yes exit=608 a0=3 a1=7ffc4e3829c0 a2=0 a3=7ffc4e3829ac items=0 ppid=2398 pid=2539 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.456000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:33:53.456000 audit[2540]: NETFILTER_CFG table=nat:82 family=10 entries=1 op=nft_register_chain pid=2540 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.456000 audit[2540]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff9a2bc3b0 a2=0 a3=7fff9a2bc39c items=0 ppid=2398 pid=2540 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.456000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Mar 17 18:33:53.458000 audit[2542]: NETFILTER_CFG table=nat:83 family=10 entries=2 op=nft_register_chain pid=2542 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.458000 audit[2542]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffcc04b72c0 a2=0 a3=7ffcc04b72ac items=0 ppid=2398 pid=2542 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.458000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Mar 17 18:33:53.459000 audit[2543]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=2543 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.459000 audit[2543]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffefa43e8b0 a2=0 a3=7ffefa43e89c items=0 ppid=2398 pid=2543 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 
17 18:33:53.459000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Mar 17 18:33:53.461000 audit[2545]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=2545 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.461000 audit[2545]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fffc0f70360 a2=0 a3=7fffc0f7034c items=0 ppid=2398 pid=2545 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.461000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:33:53.464000 audit[2548]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_rule pid=2548 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:33:53.464000 audit[2548]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd4edddda0 a2=0 a3=7ffd4edddd8c items=0 ppid=2398 pid=2548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.464000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:33:53.467000 audit[2550]: NETFILTER_CFG table=filter:87 family=10 entries=3 op=nft_register_rule pid=2550 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Mar 17 18:33:53.467000 audit[2550]: SYSCALL arch=c000003e syscall=46 success=yes exit=2004 a0=3 a1=7ffd71649c20 a2=0 a3=7ffd71649c0c items=0 ppid=2398 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.467000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:53.467000 audit[2550]: NETFILTER_CFG table=nat:88 family=10 entries=7 op=nft_register_chain pid=2550 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Mar 17 18:33:53.467000 audit[2550]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffd71649c20 a2=0 a3=7ffd71649c0c items=0 ppid=2398 pid=2550 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:53.467000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:54.116638 kubelet[2208]: E0317 18:33:54.116600 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:54.549108 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2136308167.mount: Deactivated successfully. 
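The audit: PROCTITLE fields in the entries above are hex-encoded command lines: auditd hex-encodes the proctitle because the raw /proc/<pid>/cmdline separates argv elements with NUL bytes. A minimal decoding sketch in Python follows; the sample value is copied from the KUBE-NODEPORTS entry above, and the helper name decode_proctitle is illustrative, not a tool that appears in this log.

    # decode_proctitle.py - render an audit PROCTITLE hex string as a readable command line
    def decode_proctitle(hex_value: str) -> str:
        raw = bytes.fromhex(hex_value)  # auditd logs the raw cmdline bytes as hex
        # argv entries are NUL-separated in /proc/<pid>/cmdline; join them with spaces
        return raw.replace(b"\x00", b" ").decode("utf-8", errors="replace")

    if __name__ == "__main__":
        sample = ("69707461626C6573002D770035002D5700313030303030"
                  "002D4E004B5542452D4E4F4445504F525453002D740066696C746572")
        print(decode_proctitle(sample))  # -> iptables -w 5 -W 100000 -N KUBE-NODEPORTS -t filter

Decoded this way, the burst of NETFILTER_CFG records above is kube-proxy creating its KUBE-* chains and jump rules in the filter and nat tables, once for IPv4 (family=2) and once for IPv6 (family=10).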
Mar 17 18:33:55.366427 env[1305]: time="2025-03-17T18:33:55.366355604Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator:v1.36.2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:55.368620 env[1305]: time="2025-03-17T18:33:55.368566759Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:55.369973 env[1305]: time="2025-03-17T18:33:55.369942855Z" level=info msg="ImageUpdate event &ImageUpdate{Name:quay.io/tigera/operator:v1.36.2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:55.371559 env[1305]: time="2025-03-17T18:33:55.371526473Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:33:55.372079 env[1305]: time="2025-03-17T18:33:55.372052278Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Mar 17 18:33:55.374643 env[1305]: time="2025-03-17T18:33:55.374590994Z" level=info msg="CreateContainer within sandbox \"327b41b3d5c0609429052ca27adc7fe967d45c4569e9dfb869855ac6206a0ad9\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 17 18:33:55.384635 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount975100315.mount: Deactivated successfully. Mar 17 18:33:55.385812 env[1305]: time="2025-03-17T18:33:55.385772451Z" level=info msg="CreateContainer within sandbox \"327b41b3d5c0609429052ca27adc7fe967d45c4569e9dfb869855ac6206a0ad9\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"ca388c12a5e5cdd5b7f633c299c15310370856cd94a735fb868fb7dfb912bcf1\"" Mar 17 18:33:55.386370 env[1305]: time="2025-03-17T18:33:55.386324957Z" level=info msg="StartContainer for \"ca388c12a5e5cdd5b7f633c299c15310370856cd94a735fb868fb7dfb912bcf1\"" Mar 17 18:33:55.428989 env[1305]: time="2025-03-17T18:33:55.428891858Z" level=info msg="StartContainer for \"ca388c12a5e5cdd5b7f633c299c15310370856cd94a735fb868fb7dfb912bcf1\" returns successfully" Mar 17 18:33:56.184561 kubelet[2208]: I0317 18:33:56.184496 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-cgqn7" podStartSLOduration=4.1844742010000004 podStartE2EDuration="4.184474201s" podCreationTimestamp="2025-03-17 18:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:33:54.126569377 +0000 UTC m=+16.164158977" watchObservedRunningTime="2025-03-17 18:33:56.184474201 +0000 UTC m=+18.222063831" Mar 17 18:33:56.185319 kubelet[2208]: I0317 18:33:56.185274 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-r2jlm" podStartSLOduration=2.023492362 podStartE2EDuration="4.185265248s" podCreationTimestamp="2025-03-17 18:33:52 +0000 UTC" firstStartedPulling="2025-03-17 18:33:53.211214793 +0000 UTC m=+15.248804403" lastFinishedPulling="2025-03-17 18:33:55.372987689 +0000 UTC m=+17.410577289" observedRunningTime="2025-03-17 18:33:56.185246963 +0000 UTC m=+18.222836574" watchObservedRunningTime="2025-03-17 18:33:56.185265248 +0000 UTC 
m=+18.222854878" Mar 17 18:33:58.242000 audit[2592]: NETFILTER_CFG table=filter:89 family=2 entries=15 op=nft_register_rule pid=2592 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:58.242000 audit[2592]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffe343f77d0 a2=0 a3=7ffe343f77bc items=0 ppid=2398 pid=2592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:58.242000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:58.247000 audit[2592]: NETFILTER_CFG table=nat:90 family=2 entries=12 op=nft_register_rule pid=2592 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:58.247000 audit[2592]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffe343f77d0 a2=0 a3=0 items=0 ppid=2398 pid=2592 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:58.247000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:58.259000 audit[2594]: NETFILTER_CFG table=filter:91 family=2 entries=16 op=nft_register_rule pid=2594 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:58.259000 audit[2594]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffd65962c30 a2=0 a3=7ffd65962c1c items=0 ppid=2398 pid=2594 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:58.259000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:58.264000 audit[2594]: NETFILTER_CFG table=nat:92 family=2 entries=12 op=nft_register_rule pid=2594 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:58.266418 kernel: kauditd_printk_skb: 152 callbacks suppressed Mar 17 18:33:58.266531 kernel: audit: type=1325 audit(1742236438.264:279): table=nat:92 family=2 entries=12 op=nft_register_rule pid=2594 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:58.264000 audit[2594]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd65962c30 a2=0 a3=0 items=0 ppid=2398 pid=2594 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:58.273625 kernel: audit: type=1300 audit(1742236438.264:279): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd65962c30 a2=0 a3=0 items=0 ppid=2398 pid=2594 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:58.273696 kernel: audit: type=1327 audit(1742236438.264:279): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:58.264000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:58.367433 kubelet[2208]: I0317 18:33:58.367373 2208 topology_manager.go:215] "Topology Admit Handler" podUID="94704619-936c-454e-b51a-0971929c1171" podNamespace="calico-system" podName="calico-typha-7fc7c848b4-blsfz" Mar 17 18:33:58.385380 kubelet[2208]: I0317 18:33:58.385336 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/94704619-936c-454e-b51a-0971929c1171-typha-certs\") pod \"calico-typha-7fc7c848b4-blsfz\" (UID: \"94704619-936c-454e-b51a-0971929c1171\") " pod="calico-system/calico-typha-7fc7c848b4-blsfz" Mar 17 18:33:58.385632 kubelet[2208]: I0317 18:33:58.385606 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94704619-936c-454e-b51a-0971929c1171-tigera-ca-bundle\") pod \"calico-typha-7fc7c848b4-blsfz\" (UID: \"94704619-936c-454e-b51a-0971929c1171\") " pod="calico-system/calico-typha-7fc7c848b4-blsfz" Mar 17 18:33:58.385739 kubelet[2208]: I0317 18:33:58.385719 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trbpg\" (UniqueName: \"kubernetes.io/projected/94704619-936c-454e-b51a-0971929c1171-kube-api-access-trbpg\") pod \"calico-typha-7fc7c848b4-blsfz\" (UID: \"94704619-936c-454e-b51a-0971929c1171\") " pod="calico-system/calico-typha-7fc7c848b4-blsfz" Mar 17 18:33:58.502443 kubelet[2208]: I0317 18:33:58.502298 2208 topology_manager.go:215] "Topology Admit Handler" podUID="eb977058-058c-48a0-a562-5b91756026fc" podNamespace="calico-system" podName="calico-node-gl667" Mar 17 18:33:58.587867 kubelet[2208]: I0317 18:33:58.587798 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-policysync\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.587867 kubelet[2208]: I0317 18:33:58.587843 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-bin-dir\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.587867 kubelet[2208]: I0317 18:33:58.587860 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/eb977058-058c-48a0-a562-5b91756026fc-node-certs\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.587867 kubelet[2208]: I0317 18:33:58.587886 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qlrh\" (UniqueName: \"kubernetes.io/projected/eb977058-058c-48a0-a562-5b91756026fc-kube-api-access-8qlrh\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.587867 kubelet[2208]: I0317 18:33:58.587902 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: 
\"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-var-lib-calico\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.588251 kubelet[2208]: I0317 18:33:58.587933 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-var-run-calico\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.588251 kubelet[2208]: I0317 18:33:58.587947 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-flexvol-driver-host\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.588251 kubelet[2208]: I0317 18:33:58.587963 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb977058-058c-48a0-a562-5b91756026fc-tigera-ca-bundle\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.588251 kubelet[2208]: I0317 18:33:58.587977 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-lib-modules\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.588251 kubelet[2208]: I0317 18:33:58.587990 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-xtables-lock\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.588393 kubelet[2208]: I0317 18:33:58.588004 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-net-dir\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.588393 kubelet[2208]: I0317 18:33:58.588018 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-log-dir\") pod \"calico-node-gl667\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " pod="calico-system/calico-node-gl667" Mar 17 18:33:58.589094 kubelet[2208]: I0317 18:33:58.589059 2208 topology_manager.go:215] "Topology Admit Handler" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" podNamespace="calico-system" podName="csi-node-driver-7dwlz" Mar 17 18:33:58.589422 kubelet[2208]: E0317 18:33:58.589382 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:33:58.671686 kubelet[2208]: E0317 
18:33:58.671642 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:58.672220 env[1305]: time="2025-03-17T18:33:58.672182439Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7fc7c848b4-blsfz,Uid:94704619-936c-454e-b51a-0971929c1171,Namespace:calico-system,Attempt:0,}" Mar 17 18:33:58.688296 kubelet[2208]: I0317 18:33:58.688251 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72499494-46cf-4998-b0e6-cf96b6f788d0-kubelet-dir\") pod \"csi-node-driver-7dwlz\" (UID: \"72499494-46cf-4998-b0e6-cf96b6f788d0\") " pod="calico-system/csi-node-driver-7dwlz" Mar 17 18:33:58.688375 kubelet[2208]: I0317 18:33:58.688303 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/72499494-46cf-4998-b0e6-cf96b6f788d0-socket-dir\") pod \"csi-node-driver-7dwlz\" (UID: \"72499494-46cf-4998-b0e6-cf96b6f788d0\") " pod="calico-system/csi-node-driver-7dwlz" Mar 17 18:33:58.688405 kubelet[2208]: I0317 18:33:58.688377 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcj42\" (UniqueName: \"kubernetes.io/projected/72499494-46cf-4998-b0e6-cf96b6f788d0-kube-api-access-dcj42\") pod \"csi-node-driver-7dwlz\" (UID: \"72499494-46cf-4998-b0e6-cf96b6f788d0\") " pod="calico-system/csi-node-driver-7dwlz" Mar 17 18:33:58.688437 kubelet[2208]: I0317 18:33:58.688412 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/72499494-46cf-4998-b0e6-cf96b6f788d0-varrun\") pod \"csi-node-driver-7dwlz\" (UID: \"72499494-46cf-4998-b0e6-cf96b6f788d0\") " pod="calico-system/csi-node-driver-7dwlz" Mar 17 18:33:58.688437 kubelet[2208]: I0317 18:33:58.688434 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/72499494-46cf-4998-b0e6-cf96b6f788d0-registration-dir\") pod \"csi-node-driver-7dwlz\" (UID: \"72499494-46cf-4998-b0e6-cf96b6f788d0\") " pod="calico-system/csi-node-driver-7dwlz" Mar 17 18:33:58.690635 kubelet[2208]: E0317 18:33:58.689651 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.690635 kubelet[2208]: W0317 18:33:58.689673 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.690635 kubelet[2208]: E0317 18:33:58.689702 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:33:58.694446 kubelet[2208]: E0317 18:33:58.692155 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.694446 kubelet[2208]: W0317 18:33:58.692166 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.694446 kubelet[2208]: E0317 18:33:58.692180 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.698435 kubelet[2208]: E0317 18:33:58.698273 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.698435 kubelet[2208]: W0317 18:33:58.698285 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.698435 kubelet[2208]: E0317 18:33:58.698357 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.698822 kubelet[2208]: E0317 18:33:58.698807 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.698822 kubelet[2208]: W0317 18:33:58.698821 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.698936 kubelet[2208]: E0317 18:33:58.698903 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.700398 env[1305]: time="2025-03-17T18:33:58.700275252Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:33:58.700398 env[1305]: time="2025-03-17T18:33:58.700346998Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:33:58.700398 env[1305]: time="2025-03-17T18:33:58.700357818Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:33:58.700770 env[1305]: time="2025-03-17T18:33:58.700726295Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7 pid=2606 runtime=io.containerd.runc.v2 Mar 17 18:33:58.705483 kubelet[2208]: E0317 18:33:58.704290 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.705483 kubelet[2208]: W0317 18:33:58.704315 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.705483 kubelet[2208]: E0317 18:33:58.705053 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.706203 kubelet[2208]: E0317 18:33:58.706182 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.706203 kubelet[2208]: W0317 18:33:58.706201 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.706305 kubelet[2208]: E0317 18:33:58.706291 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.707056 kubelet[2208]: E0317 18:33:58.707036 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.707056 kubelet[2208]: W0317 18:33:58.707053 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.707167 kubelet[2208]: E0317 18:33:58.707149 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.710064 kubelet[2208]: E0317 18:33:58.710032 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.710064 kubelet[2208]: W0317 18:33:58.710052 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.710064 kubelet[2208]: E0317 18:33:58.710063 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:33:58.766362 env[1305]: time="2025-03-17T18:33:58.766118883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7fc7c848b4-blsfz,Uid:94704619-936c-454e-b51a-0971929c1171,Namespace:calico-system,Attempt:0,} returns sandbox id \"4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7\"" Mar 17 18:33:58.766949 kubelet[2208]: E0317 18:33:58.766902 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:58.768224 env[1305]: time="2025-03-17T18:33:58.768113392Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Mar 17 18:33:58.790426 kubelet[2208]: E0317 18:33:58.789705 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.790426 kubelet[2208]: W0317 18:33:58.789725 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.790426 kubelet[2208]: E0317 18:33:58.789746 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.790426 kubelet[2208]: E0317 18:33:58.789967 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.790426 kubelet[2208]: W0317 18:33:58.789974 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.790426 kubelet[2208]: E0317 18:33:58.789982 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.790426 kubelet[2208]: E0317 18:33:58.790197 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.790426 kubelet[2208]: W0317 18:33:58.790206 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.790426 kubelet[2208]: E0317 18:33:58.790217 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.790426 kubelet[2208]: E0317 18:33:58.790427 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.790926 kubelet[2208]: W0317 18:33:58.790434 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.790926 kubelet[2208]: E0317 18:33:58.790441 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:33:58.790926 kubelet[2208]: E0317 18:33:58.790643 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.790926 kubelet[2208]: W0317 18:33:58.790652 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.790926 kubelet[2208]: E0317 18:33:58.790664 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.790926 kubelet[2208]: E0317 18:33:58.790879 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.790926 kubelet[2208]: W0317 18:33:58.790886 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.790926 kubelet[2208]: E0317 18:33:58.790896 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.791354 kubelet[2208]: E0317 18:33:58.791085 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.791354 kubelet[2208]: W0317 18:33:58.791093 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.791354 kubelet[2208]: E0317 18:33:58.791162 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.791354 kubelet[2208]: E0317 18:33:58.791245 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.791354 kubelet[2208]: W0317 18:33:58.791250 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.791354 kubelet[2208]: E0317 18:33:58.791302 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.791516 kubelet[2208]: E0317 18:33:58.791451 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.791516 kubelet[2208]: W0317 18:33:58.791460 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.791565 kubelet[2208]: E0317 18:33:58.791530 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:33:58.791656 kubelet[2208]: E0317 18:33:58.791640 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.791656 kubelet[2208]: W0317 18:33:58.791653 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.791740 kubelet[2208]: E0317 18:33:58.791720 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.791819 kubelet[2208]: E0317 18:33:58.791806 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.791819 kubelet[2208]: W0317 18:33:58.791814 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.791926 kubelet[2208]: E0317 18:33:58.791879 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.792009 kubelet[2208]: E0317 18:33:58.791995 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.792009 kubelet[2208]: W0317 18:33:58.792005 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.792089 kubelet[2208]: E0317 18:33:58.792026 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.792279 kubelet[2208]: E0317 18:33:58.792258 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.792279 kubelet[2208]: W0317 18:33:58.792271 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.792367 kubelet[2208]: E0317 18:33:58.792291 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.792513 kubelet[2208]: E0317 18:33:58.792501 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.792513 kubelet[2208]: W0317 18:33:58.792512 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.792605 kubelet[2208]: E0317 18:33:58.792522 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:33:58.792702 kubelet[2208]: E0317 18:33:58.792687 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.792702 kubelet[2208]: W0317 18:33:58.792699 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.792782 kubelet[2208]: E0317 18:33:58.792754 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.792854 kubelet[2208]: E0317 18:33:58.792840 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.792937 kubelet[2208]: W0317 18:33:58.792858 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.792971 kubelet[2208]: E0317 18:33:58.792944 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.793062 kubelet[2208]: E0317 18:33:58.793049 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.793062 kubelet[2208]: W0317 18:33:58.793057 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.793147 kubelet[2208]: E0317 18:33:58.793113 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.793210 kubelet[2208]: E0317 18:33:58.793197 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.793210 kubelet[2208]: W0317 18:33:58.793205 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.793297 kubelet[2208]: E0317 18:33:58.793272 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.793369 kubelet[2208]: E0317 18:33:58.793357 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.793369 kubelet[2208]: W0317 18:33:58.793365 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.793454 kubelet[2208]: E0317 18:33:58.793373 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:33:58.793534 kubelet[2208]: E0317 18:33:58.793521 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.793534 kubelet[2208]: W0317 18:33:58.793531 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.793619 kubelet[2208]: E0317 18:33:58.793539 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.793812 kubelet[2208]: E0317 18:33:58.793781 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.793812 kubelet[2208]: W0317 18:33:58.793793 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.793812 kubelet[2208]: E0317 18:33:58.793805 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.794059 kubelet[2208]: E0317 18:33:58.794045 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.794059 kubelet[2208]: W0317 18:33:58.794057 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.794148 kubelet[2208]: E0317 18:33:58.794067 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.794414 kubelet[2208]: E0317 18:33:58.794390 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.794414 kubelet[2208]: W0317 18:33:58.794400 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.794414 kubelet[2208]: E0317 18:33:58.794409 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.794640 kubelet[2208]: E0317 18:33:58.794627 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.794640 kubelet[2208]: W0317 18:33:58.794638 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.794743 kubelet[2208]: E0317 18:33:58.794648 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:33:58.794841 kubelet[2208]: E0317 18:33:58.794828 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.794841 kubelet[2208]: W0317 18:33:58.794837 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.794963 kubelet[2208]: E0317 18:33:58.794844 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.801971 kubelet[2208]: E0317 18:33:58.801933 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:33:58.801971 kubelet[2208]: W0317 18:33:58.801961 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:33:58.802096 kubelet[2208]: E0317 18:33:58.801988 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:33:58.811614 kubelet[2208]: E0317 18:33:58.811453 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:58.818042 env[1305]: time="2025-03-17T18:33:58.815703414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gl667,Uid:eb977058-058c-48a0-a562-5b91756026fc,Namespace:calico-system,Attempt:0,}" Mar 17 18:33:58.849046 env[1305]: time="2025-03-17T18:33:58.848987248Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:33:58.849269 env[1305]: time="2025-03-17T18:33:58.849243403Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:33:58.849392 env[1305]: time="2025-03-17T18:33:58.849367637Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:33:58.849697 env[1305]: time="2025-03-17T18:33:58.849663016Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c pid=2679 runtime=io.containerd.runc.v2 Mar 17 18:33:58.912709 env[1305]: time="2025-03-17T18:33:58.912652181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-gl667,Uid:eb977058-058c-48a0-a562-5b91756026fc,Namespace:calico-system,Attempt:0,} returns sandbox id \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\"" Mar 17 18:33:58.913507 kubelet[2208]: E0317 18:33:58.913091 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:33:59.281000 audit[2715]: NETFILTER_CFG table=filter:93 family=2 entries=17 op=nft_register_rule pid=2715 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:59.281000 audit[2715]: SYSCALL arch=c000003e syscall=46 success=yes exit=6652 a0=3 a1=7fff22ec8510 a2=0 a3=7fff22ec84fc items=0 ppid=2398 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:59.289235 kernel: audit: type=1325 audit(1742236439.281:280): table=filter:93 family=2 entries=17 op=nft_register_rule pid=2715 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:59.289312 kernel: audit: type=1300 audit(1742236439.281:280): arch=c000003e syscall=46 success=yes exit=6652 a0=3 a1=7fff22ec8510 a2=0 a3=7fff22ec84fc items=0 ppid=2398 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:59.289336 kernel: audit: type=1327 audit(1742236439.281:280): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:59.281000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:59.292000 audit[2715]: NETFILTER_CFG table=nat:94 family=2 entries=12 op=nft_register_rule pid=2715 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:59.292000 audit[2715]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff22ec8510 a2=0 a3=0 items=0 ppid=2398 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:59.299971 kernel: audit: type=1325 audit(1742236439.292:281): table=nat:94 family=2 entries=12 op=nft_register_rule pid=2715 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:33:59.300032 kernel: audit: type=1300 audit(1742236439.292:281): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fff22ec8510 a2=0 a3=0 items=0 ppid=2398 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:33:59.300055 kernel: audit: type=1327 audit(1742236439.292:281): 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:33:59.292000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:00.604503 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2628412247.mount: Deactivated successfully. Mar 17 18:34:01.048370 kubelet[2208]: E0317 18:34:01.048239 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:34:01.855733 env[1305]: time="2025-03-17T18:34:01.855665505Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:01.875305 env[1305]: time="2025-03-17T18:34:01.875219391Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:01.879760 env[1305]: time="2025-03-17T18:34:01.879725586Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/typha:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:01.889212 env[1305]: time="2025-03-17T18:34:01.889155663Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:01.889694 env[1305]: time="2025-03-17T18:34:01.889659224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Mar 17 18:34:01.890836 env[1305]: time="2025-03-17T18:34:01.890785750Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Mar 17 18:34:01.900398 env[1305]: time="2025-03-17T18:34:01.900349921Z" level=info msg="CreateContainer within sandbox \"4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 17 18:34:01.972378 env[1305]: time="2025-03-17T18:34:01.972325856Z" level=info msg="CreateContainer within sandbox \"4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\"" Mar 17 18:34:01.972936 env[1305]: time="2025-03-17T18:34:01.972890663Z" level=info msg="StartContainer for \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\"" Mar 17 18:34:02.033367 env[1305]: time="2025-03-17T18:34:02.033276709Z" level=info msg="StartContainer for \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\" returns successfully" Mar 17 18:34:02.136897 kubelet[2208]: E0317 18:34:02.136726 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:02.149354 kubelet[2208]: I0317 18:34:02.148910 2208 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="calico-system/calico-typha-7fc7c848b4-blsfz" podStartSLOduration=1.025976964 podStartE2EDuration="4.148889266s" podCreationTimestamp="2025-03-17 18:33:58 +0000 UTC" firstStartedPulling="2025-03-17 18:33:58.76773125 +0000 UTC m=+20.805320860" lastFinishedPulling="2025-03-17 18:34:01.890643552 +0000 UTC m=+23.928233162" observedRunningTime="2025-03-17 18:34:02.148424408 +0000 UTC m=+24.186014048" watchObservedRunningTime="2025-03-17 18:34:02.148889266 +0000 UTC m=+24.186478876" Mar 17 18:34:02.206607 kubelet[2208]: E0317 18:34:02.206562 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.206607 kubelet[2208]: W0317 18:34:02.206584 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.206607 kubelet[2208]: E0317 18:34:02.206602 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.206954 kubelet[2208]: E0317 18:34:02.206911 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.207010 kubelet[2208]: W0317 18:34:02.206951 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.207010 kubelet[2208]: E0317 18:34:02.206966 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.207253 kubelet[2208]: E0317 18:34:02.207209 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.207253 kubelet[2208]: W0317 18:34:02.207244 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.207336 kubelet[2208]: E0317 18:34:02.207260 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.207443 kubelet[2208]: E0317 18:34:02.207430 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.207443 kubelet[2208]: W0317 18:34:02.207439 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.207507 kubelet[2208]: E0317 18:34:02.207447 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:02.207574 kubelet[2208]: E0317 18:34:02.207564 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.207574 kubelet[2208]: W0317 18:34:02.207573 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.207636 kubelet[2208]: E0317 18:34:02.207583 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.207752 kubelet[2208]: E0317 18:34:02.207739 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.207785 kubelet[2208]: W0317 18:34:02.207752 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.207785 kubelet[2208]: E0317 18:34:02.207762 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.207968 kubelet[2208]: E0317 18:34:02.207957 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.207968 kubelet[2208]: W0317 18:34:02.207967 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.208023 kubelet[2208]: E0317 18:34:02.207976 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.208143 kubelet[2208]: E0317 18:34:02.208129 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.208143 kubelet[2208]: W0317 18:34:02.208141 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.208226 kubelet[2208]: E0317 18:34:02.208151 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.208310 kubelet[2208]: E0317 18:34:02.208299 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.208310 kubelet[2208]: W0317 18:34:02.208308 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.208378 kubelet[2208]: E0317 18:34:02.208315 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:02.208449 kubelet[2208]: E0317 18:34:02.208433 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.208449 kubelet[2208]: W0317 18:34:02.208444 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.208449 kubelet[2208]: E0317 18:34:02.208451 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.208711 kubelet[2208]: E0317 18:34:02.208613 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.208711 kubelet[2208]: W0317 18:34:02.208624 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.208711 kubelet[2208]: E0317 18:34:02.208632 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.208816 kubelet[2208]: E0317 18:34:02.208801 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.208816 kubelet[2208]: W0317 18:34:02.208812 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.208859 kubelet[2208]: E0317 18:34:02.208824 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.209015 kubelet[2208]: E0317 18:34:02.209001 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.209015 kubelet[2208]: W0317 18:34:02.209011 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.209094 kubelet[2208]: E0317 18:34:02.209019 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.209172 kubelet[2208]: E0317 18:34:02.209161 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.209172 kubelet[2208]: W0317 18:34:02.209170 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.209231 kubelet[2208]: E0317 18:34:02.209177 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:02.209301 kubelet[2208]: E0317 18:34:02.209292 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.209301 kubelet[2208]: W0317 18:34:02.209300 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.209353 kubelet[2208]: E0317 18:34:02.209306 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.216577 kubelet[2208]: E0317 18:34:02.216558 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.216577 kubelet[2208]: W0317 18:34:02.216570 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.216647 kubelet[2208]: E0317 18:34:02.216580 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.216851 kubelet[2208]: E0317 18:34:02.216830 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.216851 kubelet[2208]: W0317 18:34:02.216842 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.216904 kubelet[2208]: E0317 18:34:02.216852 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.217088 kubelet[2208]: E0317 18:34:02.217074 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.217088 kubelet[2208]: W0317 18:34:02.217087 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.217183 kubelet[2208]: E0317 18:34:02.217102 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.217316 kubelet[2208]: E0317 18:34:02.217294 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.217316 kubelet[2208]: W0317 18:34:02.217310 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.217416 kubelet[2208]: E0317 18:34:02.217335 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:02.217509 kubelet[2208]: E0317 18:34:02.217493 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.217509 kubelet[2208]: W0317 18:34:02.217504 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.217587 kubelet[2208]: E0317 18:34:02.217519 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.217734 kubelet[2208]: E0317 18:34:02.217711 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.217734 kubelet[2208]: W0317 18:34:02.217725 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.217836 kubelet[2208]: E0317 18:34:02.217741 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.218108 kubelet[2208]: E0317 18:34:02.218072 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.218175 kubelet[2208]: W0317 18:34:02.218107 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.218175 kubelet[2208]: E0317 18:34:02.218146 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.218341 kubelet[2208]: E0317 18:34:02.218321 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.218341 kubelet[2208]: W0317 18:34:02.218336 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.218445 kubelet[2208]: E0317 18:34:02.218363 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.218583 kubelet[2208]: E0317 18:34:02.218564 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.218583 kubelet[2208]: W0317 18:34:02.218576 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.218680 kubelet[2208]: E0317 18:34:02.218600 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:02.218841 kubelet[2208]: E0317 18:34:02.218822 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.218841 kubelet[2208]: W0317 18:34:02.218835 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.218966 kubelet[2208]: E0317 18:34:02.218859 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.219110 kubelet[2208]: E0317 18:34:02.219092 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.219110 kubelet[2208]: W0317 18:34:02.219104 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.219209 kubelet[2208]: E0317 18:34:02.219143 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.219293 kubelet[2208]: E0317 18:34:02.219274 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.219293 kubelet[2208]: W0317 18:34:02.219287 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.219411 kubelet[2208]: E0317 18:34:02.219316 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.219539 kubelet[2208]: E0317 18:34:02.219518 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.219539 kubelet[2208]: W0317 18:34:02.219533 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.219628 kubelet[2208]: E0317 18:34:02.219553 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.219764 kubelet[2208]: E0317 18:34:02.219745 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.219764 kubelet[2208]: W0317 18:34:02.219759 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.219859 kubelet[2208]: E0317 18:34:02.219777 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:02.220060 kubelet[2208]: E0317 18:34:02.220046 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.220060 kubelet[2208]: W0317 18:34:02.220059 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.220117 kubelet[2208]: E0317 18:34:02.220075 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.220323 kubelet[2208]: E0317 18:34:02.220309 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.220323 kubelet[2208]: W0317 18:34:02.220322 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.220399 kubelet[2208]: E0317 18:34:02.220336 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.220534 kubelet[2208]: E0317 18:34:02.220520 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.220558 kubelet[2208]: W0317 18:34:02.220532 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.220558 kubelet[2208]: E0317 18:34:02.220548 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:02.220736 kubelet[2208]: E0317 18:34:02.220726 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:02.220763 kubelet[2208]: W0317 18:34:02.220737 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:02.220763 kubelet[2208]: E0317 18:34:02.220746 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:03.047978 kubelet[2208]: E0317 18:34:03.047904 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:34:03.137850 kubelet[2208]: I0317 18:34:03.137808 2208 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 18:34:03.138434 kubelet[2208]: E0317 18:34:03.138414 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:03.219579 kubelet[2208]: E0317 18:34:03.219525 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.219579 kubelet[2208]: W0317 18:34:03.219566 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.219579 kubelet[2208]: E0317 18:34:03.219602 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.220024 kubelet[2208]: E0317 18:34:03.220008 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.220024 kubelet[2208]: W0317 18:34:03.220021 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.220086 kubelet[2208]: E0317 18:34:03.220030 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.220286 kubelet[2208]: E0317 18:34:03.220251 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.220345 kubelet[2208]: W0317 18:34:03.220298 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.220345 kubelet[2208]: E0317 18:34:03.220328 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.220662 kubelet[2208]: E0317 18:34:03.220634 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.220662 kubelet[2208]: W0317 18:34:03.220648 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.220662 kubelet[2208]: E0317 18:34:03.220657 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:03.220930 kubelet[2208]: E0317 18:34:03.220887 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.220930 kubelet[2208]: W0317 18:34:03.220902 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.220930 kubelet[2208]: E0317 18:34:03.220925 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.221244 kubelet[2208]: E0317 18:34:03.221215 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.221244 kubelet[2208]: W0317 18:34:03.221233 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.221244 kubelet[2208]: E0317 18:34:03.221244 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.221466 kubelet[2208]: E0317 18:34:03.221435 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.221466 kubelet[2208]: W0317 18:34:03.221451 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.221466 kubelet[2208]: E0317 18:34:03.221460 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.221647 kubelet[2208]: E0317 18:34:03.221632 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.221647 kubelet[2208]: W0317 18:34:03.221644 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.221703 kubelet[2208]: E0317 18:34:03.221653 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.221886 kubelet[2208]: E0317 18:34:03.221868 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.221886 kubelet[2208]: W0317 18:34:03.221880 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.221886 kubelet[2208]: E0317 18:34:03.221890 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:03.222152 kubelet[2208]: E0317 18:34:03.222133 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.222152 kubelet[2208]: W0317 18:34:03.222146 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.222152 kubelet[2208]: E0317 18:34:03.222155 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.222434 kubelet[2208]: E0317 18:34:03.222404 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.222434 kubelet[2208]: W0317 18:34:03.222420 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.222434 kubelet[2208]: E0317 18:34:03.222430 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.222743 kubelet[2208]: E0317 18:34:03.222702 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.222743 kubelet[2208]: W0317 18:34:03.222727 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.222743 kubelet[2208]: E0317 18:34:03.222755 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.223450 kubelet[2208]: E0317 18:34:03.223429 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.223450 kubelet[2208]: W0317 18:34:03.223444 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.223564 kubelet[2208]: E0317 18:34:03.223454 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.223727 kubelet[2208]: E0317 18:34:03.223651 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.223727 kubelet[2208]: W0317 18:34:03.223659 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.223727 kubelet[2208]: E0317 18:34:03.223668 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:03.224061 kubelet[2208]: E0317 18:34:03.223866 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.224061 kubelet[2208]: W0317 18:34:03.223877 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.224061 kubelet[2208]: E0317 18:34:03.223887 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.224160 kubelet[2208]: E0317 18:34:03.224130 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.224160 kubelet[2208]: W0317 18:34:03.224143 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.224160 kubelet[2208]: E0317 18:34:03.224156 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.224348 kubelet[2208]: E0317 18:34:03.224328 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.224348 kubelet[2208]: W0317 18:34:03.224341 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.224447 kubelet[2208]: E0317 18:34:03.224354 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.224546 kubelet[2208]: E0317 18:34:03.224528 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.224546 kubelet[2208]: W0317 18:34:03.224541 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.224634 kubelet[2208]: E0317 18:34:03.224556 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.224824 kubelet[2208]: E0317 18:34:03.224732 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.224824 kubelet[2208]: W0317 18:34:03.224745 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.224824 kubelet[2208]: E0317 18:34:03.224761 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:03.224989 kubelet[2208]: E0317 18:34:03.224971 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.224989 kubelet[2208]: W0317 18:34:03.224981 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.224989 kubelet[2208]: E0317 18:34:03.224989 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.225166 kubelet[2208]: E0317 18:34:03.225148 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.225166 kubelet[2208]: W0317 18:34:03.225159 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.225166 kubelet[2208]: E0317 18:34:03.225169 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.225494 kubelet[2208]: E0317 18:34:03.225470 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.225494 kubelet[2208]: W0317 18:34:03.225489 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.225593 kubelet[2208]: E0317 18:34:03.225513 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.225869 kubelet[2208]: E0317 18:34:03.225838 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.225936 kubelet[2208]: W0317 18:34:03.225869 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.225936 kubelet[2208]: E0317 18:34:03.225905 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.226185 kubelet[2208]: E0317 18:34:03.226167 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.226185 kubelet[2208]: W0317 18:34:03.226182 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.226288 kubelet[2208]: E0317 18:34:03.226198 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:03.226423 kubelet[2208]: E0317 18:34:03.226404 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.226423 kubelet[2208]: W0317 18:34:03.226420 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.226500 kubelet[2208]: E0317 18:34:03.226438 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.226711 kubelet[2208]: E0317 18:34:03.226693 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.226711 kubelet[2208]: W0317 18:34:03.226706 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.226818 kubelet[2208]: E0317 18:34:03.226723 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.226962 kubelet[2208]: E0317 18:34:03.226945 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.226962 kubelet[2208]: W0317 18:34:03.226958 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.227038 kubelet[2208]: E0317 18:34:03.226974 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.227214 kubelet[2208]: E0317 18:34:03.227198 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.227214 kubelet[2208]: W0317 18:34:03.227209 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.227293 kubelet[2208]: E0317 18:34:03.227241 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.227388 kubelet[2208]: E0317 18:34:03.227374 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.227388 kubelet[2208]: W0317 18:34:03.227385 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.227444 kubelet[2208]: E0317 18:34:03.227394 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:03.227553 kubelet[2208]: E0317 18:34:03.227541 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.227553 kubelet[2208]: W0317 18:34:03.227551 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.227605 kubelet[2208]: E0317 18:34:03.227563 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.227752 kubelet[2208]: E0317 18:34:03.227740 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.227752 kubelet[2208]: W0317 18:34:03.227750 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.227817 kubelet[2208]: E0317 18:34:03.227762 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.228115 kubelet[2208]: E0317 18:34:03.228097 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.228115 kubelet[2208]: W0317 18:34:03.228113 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.228192 kubelet[2208]: E0317 18:34:03.228133 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:34:03.228358 kubelet[2208]: E0317 18:34:03.228341 2208 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:34:03.228358 kubelet[2208]: W0317 18:34:03.228356 2208 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:34:03.228412 kubelet[2208]: E0317 18:34:03.228368 2208 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:34:04.010233 env[1305]: time="2025-03-17T18:34:04.010150208Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:04.082707 env[1305]: time="2025-03-17T18:34:04.082639789Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:04.155938 env[1305]: time="2025-03-17T18:34:04.155842755Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:04.158996 env[1305]: time="2025-03-17T18:34:04.158956645Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:04.159466 env[1305]: time="2025-03-17T18:34:04.159430128Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Mar 17 18:34:04.161871 env[1305]: time="2025-03-17T18:34:04.161828710Z" level=info msg="CreateContainer within sandbox \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 18:34:04.183808 env[1305]: time="2025-03-17T18:34:04.183738846Z" level=info msg="CreateContainer within sandbox \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0\"" Mar 17 18:34:04.184474 env[1305]: time="2025-03-17T18:34:04.184407888Z" level=info msg="StartContainer for \"50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0\"" Mar 17 18:34:04.237104 env[1305]: time="2025-03-17T18:34:04.237037318Z" level=info msg="StartContainer for \"50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0\" returns successfully" Mar 17 18:34:04.296256 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0-rootfs.mount: Deactivated successfully. 
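The repeated "unexpected end of JSON input" errors above come from the kubelet's FlexVolume probe: it execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init and tries to unmarshal the command's stdout as JSON; since the executable is missing ("executable file not found in $PATH"), stdout is empty and Go's json.Unmarshal fails with exactly that message. The following is a minimal sketch, not the kubelet's actual code, assuming a DriverStatus shape like the JSON FlexVolume drivers conventionally print for init:

package main

import (
	"encoding/json"
	"fmt"
)

// DriverStatus mirrors the JSON a FlexVolume driver is expected to print
// for "init": a status string plus optional capabilities (hypothetical shape).
type DriverStatus struct {
	Status       string          `json:"status"`
	Message      string          `json:"message,omitempty"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	// What a working driver would print for "init".
	ok := DriverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}}
	out, _ := json.Marshal(ok)
	fmt.Println(string(out)) // {"status":"Success","capabilities":{"attach":false}}

	// What the kubelet saw here: the binary was missing, so the output was "".
	var parsed DriverStatus
	err := json.Unmarshal([]byte(""), &parsed)
	fmt.Println(err) // unexpected end of JSON input
}

This only illustrates why an absent driver binary surfaces as a JSON error rather than a plain "not found" in plugins.go; the log entries above are otherwise unchanged.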
Mar 17 18:34:04.684683 env[1305]: time="2025-03-17T18:34:04.684618071Z" level=info msg="shim disconnected" id=50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0 Mar 17 18:34:04.684683 env[1305]: time="2025-03-17T18:34:04.684684386Z" level=warning msg="cleaning up after shim disconnected" id=50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0 namespace=k8s.io Mar 17 18:34:04.685024 env[1305]: time="2025-03-17T18:34:04.684700897Z" level=info msg="cleaning up dead shim" Mar 17 18:34:04.692770 env[1305]: time="2025-03-17T18:34:04.692712932Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:04Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=2870 runtime=io.containerd.runc.v2\n" Mar 17 18:34:05.047885 kubelet[2208]: E0317 18:34:05.047759 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:34:05.142771 kubelet[2208]: E0317 18:34:05.142709 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:05.143507 env[1305]: time="2025-03-17T18:34:05.143466590Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Mar 17 18:34:06.404741 systemd[1]: Started sshd@7-10.0.0.12:22-10.0.0.1:46576.service. Mar 17 18:34:06.403000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.0.12:22-10.0.0.1:46576 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:06.408939 kernel: audit: type=1130 audit(1742236446.403:282): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.0.12:22-10.0.0.1:46576 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:06.474000 audit[2892]: USER_ACCT pid=2892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.475408 sshd[2892]: Accepted publickey for core from 10.0.0.1 port 46576 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:06.479417 sshd[2892]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:06.478000 audit[2892]: CRED_ACQ pid=2892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.483608 kernel: audit: type=1101 audit(1742236446.474:283): pid=2892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.483764 kernel: audit: type=1103 audit(1742236446.478:284): pid=2892 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.483806 kernel: audit: type=1006 audit(1742236446.478:285): pid=2892 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=8 res=1 Mar 17 18:34:06.485795 systemd-logind[1287]: New session 8 of user core. Mar 17 18:34:06.486362 kernel: audit: type=1300 audit(1742236446.478:285): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff15049b30 a2=3 a3=0 items=0 ppid=1 pid=2892 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:06.478000 audit[2892]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff15049b30 a2=3 a3=0 items=0 ppid=1 pid=2892 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:06.486949 systemd[1]: Started session-8.scope. 
Mar 17 18:34:06.478000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:06.490000 audit[2892]: USER_START pid=2892 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.497894 kernel: audit: type=1327 audit(1742236446.478:285): proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:06.497967 kernel: audit: type=1105 audit(1742236446.490:286): pid=2892 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.497990 kernel: audit: type=1103 audit(1742236446.492:287): pid=2895 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.492000 audit[2895]: CRED_ACQ pid=2895 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.606928 sshd[2892]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:06.607000 audit[2892]: USER_END pid=2892 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.610049 systemd[1]: sshd@7-10.0.0.12:22-10.0.0.1:46576.service: Deactivated successfully. Mar 17 18:34:06.610782 systemd[1]: session-8.scope: Deactivated successfully. Mar 17 18:34:06.611690 systemd-logind[1287]: Session 8 logged out. Waiting for processes to exit. Mar 17 18:34:06.607000 audit[2892]: CRED_DISP pid=2892 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.612467 systemd-logind[1287]: Removed session 8. Mar 17 18:34:06.615847 kernel: audit: type=1106 audit(1742236446.607:288): pid=2892 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.615954 kernel: audit: type=1104 audit(1742236446.607:289): pid=2892 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:06.609000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.0.12:22-10.0.0.1:46576 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:07.048158 kubelet[2208]: E0317 18:34:07.048086 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:34:09.048175 kubelet[2208]: E0317 18:34:09.048108 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:34:11.050947 kubelet[2208]: E0317 18:34:11.050865 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:34:11.615112 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:34:11.615262 kernel: audit: type=1130 audit(1742236451.609:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.0.12:22-10.0.0.1:46590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:11.609000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.0.12:22-10.0.0.1:46590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:11.609857 systemd[1]: Started sshd@8-10.0.0.12:22-10.0.0.1:46590.service. 
Mar 17 18:34:11.644000 audit[2907]: USER_ACCT pid=2907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:11.650357 sshd[2907]: Accepted publickey for core from 10.0.0.1 port 46590 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:11.650580 sshd[2907]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:11.658224 kernel: audit: type=1101 audit(1742236451.644:292): pid=2907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:11.658333 kernel: audit: type=1103 audit(1742236451.649:293): pid=2907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:11.658369 kernel: audit: type=1006 audit(1742236451.649:294): pid=2907 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Mar 17 18:34:11.649000 audit[2907]: CRED_ACQ pid=2907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:11.655373 systemd-logind[1287]: New session 9 of user core. Mar 17 18:34:11.655876 systemd[1]: Started session-9.scope. 
Mar 17 18:34:11.665345 kernel: audit: type=1300 audit(1742236451.649:294): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa83d1140 a2=3 a3=0 items=0 ppid=1 pid=2907 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:11.665443 kernel: audit: type=1327 audit(1742236451.649:294): proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:11.649000 audit[2907]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffa83d1140 a2=3 a3=0 items=0 ppid=1 pid=2907 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:11.649000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:11.670726 kernel: audit: type=1105 audit(1742236451.661:295): pid=2907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:11.661000 audit[2907]: USER_START pid=2907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:11.674956 kernel: audit: type=1103 audit(1742236451.663:296): pid=2910 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:11.663000 audit[2910]: CRED_ACQ pid=2910 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:12.302853 sshd[2907]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:12.303000 audit[2907]: USER_END pid=2907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:12.305258 systemd[1]: sshd@8-10.0.0.12:22-10.0.0.1:46590.service: Deactivated successfully. Mar 17 18:34:12.306033 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 18:34:12.303000 audit[2907]: CRED_DISP pid=2907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:12.311576 systemd-logind[1287]: Session 9 logged out. Waiting for processes to exit. 
Mar 17 18:34:12.311951 kernel: audit: type=1106 audit(1742236452.303:297): pid=2907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:12.312075 kernel: audit: type=1104 audit(1742236452.303:298): pid=2907 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:12.304000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.0.12:22-10.0.0.1:46590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:12.312443 systemd-logind[1287]: Removed session 9. Mar 17 18:34:12.611412 env[1305]: time="2025-03-17T18:34:12.611357228Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:12.613851 env[1305]: time="2025-03-17T18:34:12.613826344Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:12.615894 env[1305]: time="2025-03-17T18:34:12.615855050Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/cni:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:12.618800 env[1305]: time="2025-03-17T18:34:12.618758873Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Mar 17 18:34:12.619104 env[1305]: time="2025-03-17T18:34:12.619075569Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:12.621341 env[1305]: time="2025-03-17T18:34:12.621273594Z" level=info msg="CreateContainer within sandbox \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 18:34:12.639127 env[1305]: time="2025-03-17T18:34:12.639071606Z" level=info msg="CreateContainer within sandbox \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f\"" Mar 17 18:34:12.639645 env[1305]: time="2025-03-17T18:34:12.639601533Z" level=info msg="StartContainer for \"8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f\"" Mar 17 18:34:12.683546 env[1305]: time="2025-03-17T18:34:12.683492213Z" level=info msg="StartContainer for \"8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f\" returns successfully" Mar 17 18:34:13.048254 kubelet[2208]: E0317 18:34:13.048099 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" 
pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:34:13.159447 kubelet[2208]: E0317 18:34:13.159393 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:13.566213 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f-rootfs.mount: Deactivated successfully. Mar 17 18:34:13.570128 env[1305]: time="2025-03-17T18:34:13.570087772Z" level=info msg="shim disconnected" id=8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f Mar 17 18:34:13.570215 env[1305]: time="2025-03-17T18:34:13.570130362Z" level=warning msg="cleaning up after shim disconnected" id=8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f namespace=k8s.io Mar 17 18:34:13.570215 env[1305]: time="2025-03-17T18:34:13.570138418Z" level=info msg="cleaning up dead shim" Mar 17 18:34:13.576945 env[1305]: time="2025-03-17T18:34:13.576883104Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:13Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=2969 runtime=io.containerd.runc.v2\n" Mar 17 18:34:13.641312 kubelet[2208]: I0317 18:34:13.641277 2208 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 17 18:34:13.663976 kubelet[2208]: I0317 18:34:13.662456 2208 topology_manager.go:215] "Topology Admit Handler" podUID="6293eeb9-1677-4d5c-b023-b36c91a971a4" podNamespace="kube-system" podName="coredns-7db6d8ff4d-m5vrn" Mar 17 18:34:13.663976 kubelet[2208]: I0317 18:34:13.662720 2208 topology_manager.go:215] "Topology Admit Handler" podUID="7b08c7f0-5fd2-4720-bc11-16f99c67d9d2" podNamespace="kube-system" podName="coredns-7db6d8ff4d-z2tns" Mar 17 18:34:13.665476 kubelet[2208]: I0317 18:34:13.665455 2208 topology_manager.go:215] "Topology Admit Handler" podUID="84b5b251-95c6-46de-a5be-62ebfcea6ad9" podNamespace="calico-system" podName="calico-kube-controllers-7f5cc97b76-r9pvl" Mar 17 18:34:13.666118 kubelet[2208]: I0317 18:34:13.666065 2208 topology_manager.go:215] "Topology Admit Handler" podUID="78d53884-0ccd-4a74-9597-a16b3590ffd8" podNamespace="calico-apiserver" podName="calico-apiserver-649897fff-xtjrl" Mar 17 18:34:13.666311 kubelet[2208]: I0317 18:34:13.666277 2208 topology_manager.go:215] "Topology Admit Handler" podUID="9442466c-0eea-4718-be43-447a2a2a790c" podNamespace="calico-apiserver" podName="calico-apiserver-649897fff-47dt2" Mar 17 18:34:13.798823 kubelet[2208]: I0317 18:34:13.798760 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84b5b251-95c6-46de-a5be-62ebfcea6ad9-tigera-ca-bundle\") pod \"calico-kube-controllers-7f5cc97b76-r9pvl\" (UID: \"84b5b251-95c6-46de-a5be-62ebfcea6ad9\") " pod="calico-system/calico-kube-controllers-7f5cc97b76-r9pvl" Mar 17 18:34:13.798823 kubelet[2208]: I0317 18:34:13.798810 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jns5w\" (UniqueName: \"kubernetes.io/projected/7b08c7f0-5fd2-4720-bc11-16f99c67d9d2-kube-api-access-jns5w\") pod \"coredns-7db6d8ff4d-z2tns\" (UID: \"7b08c7f0-5fd2-4720-bc11-16f99c67d9d2\") " pod="kube-system/coredns-7db6d8ff4d-z2tns" Mar 17 18:34:13.798823 kubelet[2208]: I0317 18:34:13.798828 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-h99m8\" (UniqueName: \"kubernetes.io/projected/9442466c-0eea-4718-be43-447a2a2a790c-kube-api-access-h99m8\") pod \"calico-apiserver-649897fff-47dt2\" (UID: \"9442466c-0eea-4718-be43-447a2a2a790c\") " pod="calico-apiserver/calico-apiserver-649897fff-47dt2" Mar 17 18:34:13.799101 kubelet[2208]: I0317 18:34:13.798846 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgdhx\" (UniqueName: \"kubernetes.io/projected/84b5b251-95c6-46de-a5be-62ebfcea6ad9-kube-api-access-zgdhx\") pod \"calico-kube-controllers-7f5cc97b76-r9pvl\" (UID: \"84b5b251-95c6-46de-a5be-62ebfcea6ad9\") " pod="calico-system/calico-kube-controllers-7f5cc97b76-r9pvl" Mar 17 18:34:13.799101 kubelet[2208]: I0317 18:34:13.798864 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6293eeb9-1677-4d5c-b023-b36c91a971a4-config-volume\") pod \"coredns-7db6d8ff4d-m5vrn\" (UID: \"6293eeb9-1677-4d5c-b023-b36c91a971a4\") " pod="kube-system/coredns-7db6d8ff4d-m5vrn" Mar 17 18:34:13.799101 kubelet[2208]: I0317 18:34:13.798879 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm45v\" (UniqueName: \"kubernetes.io/projected/6293eeb9-1677-4d5c-b023-b36c91a971a4-kube-api-access-gm45v\") pod \"coredns-7db6d8ff4d-m5vrn\" (UID: \"6293eeb9-1677-4d5c-b023-b36c91a971a4\") " pod="kube-system/coredns-7db6d8ff4d-m5vrn" Mar 17 18:34:13.799101 kubelet[2208]: I0317 18:34:13.798895 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7b08c7f0-5fd2-4720-bc11-16f99c67d9d2-config-volume\") pod \"coredns-7db6d8ff4d-z2tns\" (UID: \"7b08c7f0-5fd2-4720-bc11-16f99c67d9d2\") " pod="kube-system/coredns-7db6d8ff4d-z2tns" Mar 17 18:34:13.799101 kubelet[2208]: I0317 18:34:13.798945 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/9442466c-0eea-4718-be43-447a2a2a790c-calico-apiserver-certs\") pod \"calico-apiserver-649897fff-47dt2\" (UID: \"9442466c-0eea-4718-be43-447a2a2a790c\") " pod="calico-apiserver/calico-apiserver-649897fff-47dt2" Mar 17 18:34:13.799235 kubelet[2208]: I0317 18:34:13.798971 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/78d53884-0ccd-4a74-9597-a16b3590ffd8-calico-apiserver-certs\") pod \"calico-apiserver-649897fff-xtjrl\" (UID: \"78d53884-0ccd-4a74-9597-a16b3590ffd8\") " pod="calico-apiserver/calico-apiserver-649897fff-xtjrl" Mar 17 18:34:13.799235 kubelet[2208]: I0317 18:34:13.798985 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj48z\" (UniqueName: \"kubernetes.io/projected/78d53884-0ccd-4a74-9597-a16b3590ffd8-kube-api-access-cj48z\") pod \"calico-apiserver-649897fff-xtjrl\" (UID: \"78d53884-0ccd-4a74-9597-a16b3590ffd8\") " pod="calico-apiserver/calico-apiserver-649897fff-xtjrl" Mar 17 18:34:13.972606 env[1305]: time="2025-03-17T18:34:13.972546485Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f5cc97b76-r9pvl,Uid:84b5b251-95c6-46de-a5be-62ebfcea6ad9,Namespace:calico-system,Attempt:0,}" Mar 17 18:34:13.976767 kubelet[2208]: E0317 
18:34:13.976731 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:13.977195 env[1305]: time="2025-03-17T18:34:13.977158759Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-m5vrn,Uid:6293eeb9-1677-4d5c-b023-b36c91a971a4,Namespace:kube-system,Attempt:0,}" Mar 17 18:34:13.978032 env[1305]: time="2025-03-17T18:34:13.978004130Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-649897fff-47dt2,Uid:9442466c-0eea-4718-be43-447a2a2a790c,Namespace:calico-apiserver,Attempt:0,}" Mar 17 18:34:13.979408 kubelet[2208]: E0317 18:34:13.979392 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:13.979470 env[1305]: time="2025-03-17T18:34:13.979408441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-649897fff-xtjrl,Uid:78d53884-0ccd-4a74-9597-a16b3590ffd8,Namespace:calico-apiserver,Attempt:0,}" Mar 17 18:34:13.979794 env[1305]: time="2025-03-17T18:34:13.979755554Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-z2tns,Uid:7b08c7f0-5fd2-4720-bc11-16f99c67d9d2,Namespace:kube-system,Attempt:0,}" Mar 17 18:34:14.163104 kubelet[2208]: E0317 18:34:14.163054 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:14.164359 env[1305]: time="2025-03-17T18:34:14.163938767Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Mar 17 18:34:14.288788 env[1305]: time="2025-03-17T18:34:14.288610408Z" level=error msg="Failed to destroy network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.289008 env[1305]: time="2025-03-17T18:34:14.288977128Z" level=error msg="encountered an error cleaning up failed sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.289048 env[1305]: time="2025-03-17T18:34:14.289027092Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-m5vrn,Uid:6293eeb9-1677-4d5c-b023-b36c91a971a4,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.289675 kubelet[2208]: E0317 18:34:14.289295 2208 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Mar 17 18:34:14.289675 kubelet[2208]: E0317 18:34:14.289369 2208 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-m5vrn" Mar 17 18:34:14.289675 kubelet[2208]: E0317 18:34:14.289391 2208 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-m5vrn" Mar 17 18:34:14.289798 kubelet[2208]: E0317 18:34:14.289440 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-m5vrn_kube-system(6293eeb9-1677-4d5c-b023-b36c91a971a4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-m5vrn_kube-system(6293eeb9-1677-4d5c-b023-b36c91a971a4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-m5vrn" podUID="6293eeb9-1677-4d5c-b023-b36c91a971a4" Mar 17 18:34:14.297112 env[1305]: time="2025-03-17T18:34:14.297061511Z" level=error msg="Failed to destroy network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.297452 env[1305]: time="2025-03-17T18:34:14.297410396Z" level=error msg="encountered an error cleaning up failed sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.297531 env[1305]: time="2025-03-17T18:34:14.297482483Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-649897fff-xtjrl,Uid:78d53884-0ccd-4a74-9597-a16b3590ffd8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.298029 kubelet[2208]: E0317 18:34:14.297711 2208 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.298029 kubelet[2208]: E0317 18:34:14.297765 2208 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-649897fff-xtjrl" Mar 17 18:34:14.298029 kubelet[2208]: E0317 18:34:14.297783 2208 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-649897fff-xtjrl" Mar 17 18:34:14.298148 kubelet[2208]: E0317 18:34:14.297822 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-649897fff-xtjrl_calico-apiserver(78d53884-0ccd-4a74-9597-a16b3590ffd8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-649897fff-xtjrl_calico-apiserver(78d53884-0ccd-4a74-9597-a16b3590ffd8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-649897fff-xtjrl" podUID="78d53884-0ccd-4a74-9597-a16b3590ffd8" Mar 17 18:34:14.309992 env[1305]: time="2025-03-17T18:34:14.309898791Z" level=error msg="Failed to destroy network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.310616 env[1305]: time="2025-03-17T18:34:14.310550147Z" level=error msg="encountered an error cleaning up failed sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.310822 env[1305]: time="2025-03-17T18:34:14.310791330Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f5cc97b76-r9pvl,Uid:84b5b251-95c6-46de-a5be-62ebfcea6ad9,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.311137 env[1305]: time="2025-03-17T18:34:14.311108727Z" level=error msg="Failed to destroy network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.311227 kubelet[2208]: E0317 18:34:14.311170 2208 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.311405 kubelet[2208]: E0317 18:34:14.311247 2208 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7f5cc97b76-r9pvl" Mar 17 18:34:14.311405 kubelet[2208]: E0317 18:34:14.311269 2208 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7f5cc97b76-r9pvl" Mar 17 18:34:14.311405 kubelet[2208]: E0317 18:34:14.311326 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7f5cc97b76-r9pvl_calico-system(84b5b251-95c6-46de-a5be-62ebfcea6ad9)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7f5cc97b76-r9pvl_calico-system(84b5b251-95c6-46de-a5be-62ebfcea6ad9)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7f5cc97b76-r9pvl" podUID="84b5b251-95c6-46de-a5be-62ebfcea6ad9" Mar 17 18:34:14.311547 env[1305]: time="2025-03-17T18:34:14.311438026Z" level=error msg="encountered an error cleaning up failed sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.311547 env[1305]: time="2025-03-17T18:34:14.311484204Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-z2tns,Uid:7b08c7f0-5fd2-4720-bc11-16f99c67d9d2,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.311722 kubelet[2208]: E0317 18:34:14.311657 2208 remote_runtime.go:193] "RunPodSandbox from runtime service 
failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.311772 kubelet[2208]: E0317 18:34:14.311733 2208 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-z2tns" Mar 17 18:34:14.311772 kubelet[2208]: E0317 18:34:14.311751 2208 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-z2tns" Mar 17 18:34:14.311833 kubelet[2208]: E0317 18:34:14.311798 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-z2tns_kube-system(7b08c7f0-5fd2-4720-bc11-16f99c67d9d2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-z2tns_kube-system(7b08c7f0-5fd2-4720-bc11-16f99c67d9d2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-z2tns" podUID="7b08c7f0-5fd2-4720-bc11-16f99c67d9d2" Mar 17 18:34:14.331336 env[1305]: time="2025-03-17T18:34:14.331274420Z" level=error msg="Failed to destroy network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.331653 env[1305]: time="2025-03-17T18:34:14.331617955Z" level=error msg="encountered an error cleaning up failed sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.331693 env[1305]: time="2025-03-17T18:34:14.331664594Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-649897fff-47dt2,Uid:9442466c-0eea-4718-be43-447a2a2a790c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.331951 kubelet[2208]: E0317 18:34:14.331903 2208 
remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:14.332020 kubelet[2208]: E0317 18:34:14.331976 2208 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-649897fff-47dt2" Mar 17 18:34:14.332020 kubelet[2208]: E0317 18:34:14.331996 2208 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-649897fff-47dt2" Mar 17 18:34:14.332074 kubelet[2208]: E0317 18:34:14.332039 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-649897fff-47dt2_calico-apiserver(9442466c-0eea-4718-be43-447a2a2a790c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-649897fff-47dt2_calico-apiserver(9442466c-0eea-4718-be43-447a2a2a790c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-649897fff-47dt2" podUID="9442466c-0eea-4718-be43-447a2a2a790c" Mar 17 18:34:15.050693 env[1305]: time="2025-03-17T18:34:15.050584029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7dwlz,Uid:72499494-46cf-4998-b0e6-cf96b6f788d0,Namespace:calico-system,Attempt:0,}" Mar 17 18:34:15.166231 kubelet[2208]: I0317 18:34:15.166197 2208 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:15.166868 env[1305]: time="2025-03-17T18:34:15.166836470Z" level=info msg="StopPodSandbox for \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\"" Mar 17 18:34:15.169762 kubelet[2208]: I0317 18:34:15.169721 2208 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:15.170393 env[1305]: time="2025-03-17T18:34:15.170362390Z" level=info msg="StopPodSandbox for \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\"" Mar 17 18:34:15.174400 kubelet[2208]: I0317 18:34:15.173740 2208 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:15.174820 env[1305]: time="2025-03-17T18:34:15.174785306Z" level=info 
msg="StopPodSandbox for \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\"" Mar 17 18:34:15.175896 kubelet[2208]: I0317 18:34:15.175600 2208 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:15.176227 env[1305]: time="2025-03-17T18:34:15.176202820Z" level=info msg="StopPodSandbox for \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\"" Mar 17 18:34:15.177653 kubelet[2208]: I0317 18:34:15.177629 2208 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:15.178064 env[1305]: time="2025-03-17T18:34:15.178027501Z" level=info msg="StopPodSandbox for \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\"" Mar 17 18:34:15.224899 env[1305]: time="2025-03-17T18:34:15.224839509Z" level=error msg="StopPodSandbox for \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\" failed" error="failed to destroy network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:15.225494 kubelet[2208]: E0317 18:34:15.225292 2208 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:15.225494 kubelet[2208]: E0317 18:34:15.225359 2208 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381"} Mar 17 18:34:15.225494 kubelet[2208]: E0317 18:34:15.225424 2208 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"84b5b251-95c6-46de-a5be-62ebfcea6ad9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:34:15.225494 kubelet[2208]: E0317 18:34:15.225455 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"84b5b251-95c6-46de-a5be-62ebfcea6ad9\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7f5cc97b76-r9pvl" podUID="84b5b251-95c6-46de-a5be-62ebfcea6ad9" Mar 17 18:34:15.229258 env[1305]: time="2025-03-17T18:34:15.229230946Z" level=error msg="StopPodSandbox for \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\" failed" error="failed to 
destroy network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:15.229418 env[1305]: time="2025-03-17T18:34:15.229389233Z" level=error msg="StopPodSandbox for \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\" failed" error="failed to destroy network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:15.229602 kubelet[2208]: E0317 18:34:15.229536 2208 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:15.229602 kubelet[2208]: E0317 18:34:15.229543 2208 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:15.229699 kubelet[2208]: E0317 18:34:15.229596 2208 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2"} Mar 17 18:34:15.229699 kubelet[2208]: E0317 18:34:15.229634 2208 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"78d53884-0ccd-4a74-9597-a16b3590ffd8\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:34:15.229699 kubelet[2208]: E0317 18:34:15.229657 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"78d53884-0ccd-4a74-9597-a16b3590ffd8\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-649897fff-xtjrl" podUID="78d53884-0ccd-4a74-9597-a16b3590ffd8" Mar 17 18:34:15.229699 kubelet[2208]: E0317 18:34:15.229568 2208 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4"} Mar 17 18:34:15.229839 kubelet[2208]: E0317 
18:34:15.229691 2208 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"6293eeb9-1677-4d5c-b023-b36c91a971a4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:34:15.229839 kubelet[2208]: E0317 18:34:15.229715 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"6293eeb9-1677-4d5c-b023-b36c91a971a4\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-m5vrn" podUID="6293eeb9-1677-4d5c-b023-b36c91a971a4" Mar 17 18:34:15.232514 env[1305]: time="2025-03-17T18:34:15.232470096Z" level=error msg="StopPodSandbox for \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\" failed" error="failed to destroy network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:15.232645 kubelet[2208]: E0317 18:34:15.232603 2208 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:15.232645 kubelet[2208]: E0317 18:34:15.232629 2208 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00"} Mar 17 18:34:15.232758 kubelet[2208]: E0317 18:34:15.232682 2208 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7b08c7f0-5fd2-4720-bc11-16f99c67d9d2\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:34:15.232758 kubelet[2208]: E0317 18:34:15.232698 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7b08c7f0-5fd2-4720-bc11-16f99c67d9d2\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="kube-system/coredns-7db6d8ff4d-z2tns" podUID="7b08c7f0-5fd2-4720-bc11-16f99c67d9d2" Mar 17 18:34:15.232887 env[1305]: time="2025-03-17T18:34:15.232856483Z" level=error msg="StopPodSandbox for \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\" failed" error="failed to destroy network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:15.233093 kubelet[2208]: E0317 18:34:15.233067 2208 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:15.233093 kubelet[2208]: E0317 18:34:15.233095 2208 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e"} Mar 17 18:34:15.233200 kubelet[2208]: E0317 18:34:15.233116 2208 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9442466c-0eea-4718-be43-447a2a2a790c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:34:15.233200 kubelet[2208]: E0317 18:34:15.233132 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9442466c-0eea-4718-be43-447a2a2a790c\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-649897fff-47dt2" podUID="9442466c-0eea-4718-be43-447a2a2a790c" Mar 17 18:34:15.331767 env[1305]: time="2025-03-17T18:34:15.331679591Z" level=error msg="Failed to destroy network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:15.332115 env[1305]: time="2025-03-17T18:34:15.332081667Z" level=error msg="encountered an error cleaning up failed sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:15.332176 env[1305]: time="2025-03-17T18:34:15.332129468Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-7dwlz,Uid:72499494-46cf-4998-b0e6-cf96b6f788d0,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:15.332450 kubelet[2208]: E0317 18:34:15.332390 2208 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:15.332516 kubelet[2208]: E0317 18:34:15.332468 2208 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7dwlz" Mar 17 18:34:15.332516 kubelet[2208]: E0317 18:34:15.332490 2208 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-7dwlz" Mar 17 18:34:15.332583 kubelet[2208]: E0317 18:34:15.332539 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-7dwlz_calico-system(72499494-46cf-4998-b0e6-cf96b6f788d0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-7dwlz_calico-system(72499494-46cf-4998-b0e6-cf96b6f788d0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:34:15.334628 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447-shm.mount: Deactivated successfully. 
Mar 17 18:34:15.677565 kubelet[2208]: I0317 18:34:15.677419 2208 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 18:34:15.678439 kubelet[2208]: E0317 18:34:15.678397 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:16.062000 audit[3340]: NETFILTER_CFG table=filter:95 family=2 entries=17 op=nft_register_rule pid=3340 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:16.062000 audit[3340]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7fff6e125d50 a2=0 a3=7fff6e125d3c items=0 ppid=2398 pid=3340 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:16.062000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:16.072000 audit[3340]: NETFILTER_CFG table=nat:96 family=2 entries=19 op=nft_register_chain pid=3340 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:16.072000 audit[3340]: SYSCALL arch=c000003e syscall=46 success=yes exit=6276 a0=3 a1=7fff6e125d50 a2=0 a3=7fff6e125d3c items=0 ppid=2398 pid=3340 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:16.072000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:16.180857 kubelet[2208]: I0317 18:34:16.180808 2208 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:16.181430 kubelet[2208]: E0317 18:34:16.181398 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:16.182123 env[1305]: time="2025-03-17T18:34:16.182073548Z" level=info msg="StopPodSandbox for \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\"" Mar 17 18:34:16.205973 env[1305]: time="2025-03-17T18:34:16.205893337Z" level=error msg="StopPodSandbox for \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\" failed" error="failed to destroy network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:34:16.206184 kubelet[2208]: E0317 18:34:16.206138 2208 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:16.206255 kubelet[2208]: E0317 18:34:16.206185 2208 kuberuntime_manager.go:1375] "Failed to stop sandbox" 
podSandboxID={"Type":"containerd","ID":"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447"} Mar 17 18:34:16.206255 kubelet[2208]: E0317 18:34:16.206220 2208 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"72499494-46cf-4998-b0e6-cf96b6f788d0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:34:16.206350 kubelet[2208]: E0317 18:34:16.206242 2208 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"72499494-46cf-4998-b0e6-cf96b6f788d0\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-7dwlz" podUID="72499494-46cf-4998-b0e6-cf96b6f788d0" Mar 17 18:34:17.307158 kernel: kauditd_printk_skb: 7 callbacks suppressed Mar 17 18:34:17.307280 kernel: audit: type=1130 audit(1742236457.305:302): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.0.12:22-10.0.0.1:53668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:17.305000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.0.12:22-10.0.0.1:53668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:17.305872 systemd[1]: Started sshd@9-10.0.0.12:22-10.0.0.1:53668.service. Mar 17 18:34:17.341000 audit[3367]: USER_ACCT pid=3367 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.342263 sshd[3367]: Accepted publickey for core from 10.0.0.1 port 53668 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:17.360725 sshd[3367]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:17.366765 systemd-logind[1287]: New session 10 of user core. Mar 17 18:34:17.367270 systemd[1]: Started session-10.scope. 
Mar 17 18:34:17.346000 audit[3367]: CRED_ACQ pid=3367 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.374520 kernel: audit: type=1101 audit(1742236457.341:303): pid=3367 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.374613 kernel: audit: type=1103 audit(1742236457.346:304): pid=3367 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.374632 kernel: audit: type=1006 audit(1742236457.346:305): pid=3367 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Mar 17 18:34:17.346000 audit[3367]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdcd06f3a0 a2=3 a3=0 items=0 ppid=1 pid=3367 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:17.383499 kernel: audit: type=1300 audit(1742236457.346:305): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdcd06f3a0 a2=3 a3=0 items=0 ppid=1 pid=3367 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:17.383535 kernel: audit: type=1327 audit(1742236457.346:305): proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:17.346000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:17.371000 audit[3367]: USER_START pid=3367 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.390896 kernel: audit: type=1105 audit(1742236457.371:306): pid=3367 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.390979 kernel: audit: type=1103 audit(1742236457.372:307): pid=3370 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.372000 audit[3370]: CRED_ACQ pid=3370 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.491976 sshd[3367]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:17.492000 audit[3367]: USER_END pid=3367 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" 
hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.494126 systemd[1]: sshd@9-10.0.0.12:22-10.0.0.1:53668.service: Deactivated successfully. Mar 17 18:34:17.495041 systemd[1]: session-10.scope: Deactivated successfully. Mar 17 18:34:17.495984 systemd-logind[1287]: Session 10 logged out. Waiting for processes to exit. Mar 17 18:34:17.496970 systemd-logind[1287]: Removed session 10. Mar 17 18:34:17.492000 audit[3367]: CRED_DISP pid=3367 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.503848 kernel: audit: type=1106 audit(1742236457.492:308): pid=3367 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.503950 kernel: audit: type=1104 audit(1742236457.492:309): pid=3367 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:17.493000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.0.12:22-10.0.0.1:53668 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:22.500464 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:34:22.500650 kernel: audit: type=1130 audit(1742236462.494:311): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.0.12:22-10.0.0.1:53672 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:22.494000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.0.12:22-10.0.0.1:53672 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:22.495161 systemd[1]: Started sshd@10-10.0.0.12:22-10.0.0.1:53672.service. 
Mar 17 18:34:22.529000 audit[3382]: USER_ACCT pid=3382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.548329 sshd[3382]: Accepted publickey for core from 10.0.0.1 port 53672 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:22.547000 audit[3382]: CRED_ACQ pid=3382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.548877 sshd[3382]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:22.552778 kernel: audit: type=1101 audit(1742236462.529:312): pid=3382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.552834 kernel: audit: type=1103 audit(1742236462.547:313): pid=3382 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.552854 kernel: audit: type=1006 audit(1742236462.547:314): pid=3382 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Mar 17 18:34:22.553204 systemd[1]: Started session-11.scope. Mar 17 18:34:22.554244 systemd-logind[1287]: New session 11 of user core. 
Mar 17 18:34:22.559823 kernel: audit: type=1300 audit(1742236462.547:314): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0c1ef230 a2=3 a3=0 items=0 ppid=1 pid=3382 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:22.547000 audit[3382]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0c1ef230 a2=3 a3=0 items=0 ppid=1 pid=3382 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:22.547000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:22.566112 kernel: audit: type=1327 audit(1742236462.547:314): proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:22.566164 kernel: audit: type=1105 audit(1742236462.559:315): pid=3382 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.559000 audit[3382]: USER_START pid=3382 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.561000 audit[3385]: CRED_ACQ pid=3385 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.570317 kernel: audit: type=1103 audit(1742236462.561:316): pid=3385 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.675999 sshd[3382]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:22.675000 audit[3382]: USER_END pid=3382 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.678536 systemd-logind[1287]: Session 11 logged out. Waiting for processes to exit. Mar 17 18:34:22.679064 systemd[1]: sshd@10-10.0.0.12:22-10.0.0.1:53672.service: Deactivated successfully. Mar 17 18:34:22.679792 systemd[1]: session-11.scope: Deactivated successfully. Mar 17 18:34:22.680472 systemd-logind[1287]: Removed session 11. 
Mar 17 18:34:22.676000 audit[3382]: CRED_DISP pid=3382 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.852485 kernel: audit: type=1106 audit(1742236462.675:317): pid=3382 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.852573 kernel: audit: type=1104 audit(1742236462.676:318): pid=3382 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:22.676000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.0.12:22-10.0.0.1:53672 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:23.456079 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2628917677.mount: Deactivated successfully. Mar 17 18:34:25.532414 env[1305]: time="2025-03-17T18:34:25.532352999Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:25.534858 env[1305]: time="2025-03-17T18:34:25.534825072Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:25.536819 env[1305]: time="2025-03-17T18:34:25.536789742Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:25.538603 env[1305]: time="2025-03-17T18:34:25.538576947Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:25.539044 env[1305]: time="2025-03-17T18:34:25.539001645Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Mar 17 18:34:25.546745 env[1305]: time="2025-03-17T18:34:25.546704590Z" level=info msg="CreateContainer within sandbox \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 18:34:25.563606 env[1305]: time="2025-03-17T18:34:25.563559282Z" level=info msg="CreateContainer within sandbox \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc\"" Mar 17 18:34:25.564044 env[1305]: time="2025-03-17T18:34:25.564019536Z" level=info msg="StartContainer for \"f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc\"" Mar 17 18:34:25.646557 env[1305]: time="2025-03-17T18:34:25.646492914Z" level=info msg="StartContainer for \"f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc\" returns 
successfully" Mar 17 18:34:25.734672 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 18:34:25.734872 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Mar 17 18:34:26.049895 env[1305]: time="2025-03-17T18:34:26.049847039Z" level=info msg="StopPodSandbox for \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\"" Mar 17 18:34:26.050119 env[1305]: time="2025-03-17T18:34:26.049994054Z" level=info msg="StopPodSandbox for \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\"" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.105 [INFO][3492] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.106 [INFO][3492] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" iface="eth0" netns="/var/run/netns/cni-bb42b29c-da58-a481-ebd6-a83b55db2d07" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.106 [INFO][3492] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" iface="eth0" netns="/var/run/netns/cni-bb42b29c-da58-a481-ebd6-a83b55db2d07" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.106 [INFO][3492] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" iface="eth0" netns="/var/run/netns/cni-bb42b29c-da58-a481-ebd6-a83b55db2d07" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.106 [INFO][3492] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.106 [INFO][3492] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.163 [INFO][3507] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" HandleID="k8s-pod-network.5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.163 [INFO][3507] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.164 [INFO][3507] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.172 [WARNING][3507] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" HandleID="k8s-pod-network.5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.172 [INFO][3507] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" HandleID="k8s-pod-network.5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.173 [INFO][3507] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:26.177090 env[1305]: 2025-03-17 18:34:26.175 [INFO][3492] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:26.178087 env[1305]: time="2025-03-17T18:34:26.177285556Z" level=info msg="TearDown network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\" successfully" Mar 17 18:34:26.178087 env[1305]: time="2025-03-17T18:34:26.177322335Z" level=info msg="StopPodSandbox for \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\" returns successfully" Mar 17 18:34:26.178228 kubelet[2208]: E0317 18:34:26.177704 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:26.179190 env[1305]: time="2025-03-17T18:34:26.179130992Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-m5vrn,Uid:6293eeb9-1677-4d5c-b023-b36c91a971a4,Namespace:kube-system,Attempt:1,}" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.117 [INFO][3491] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.117 [INFO][3491] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" iface="eth0" netns="/var/run/netns/cni-a9a2166a-7d9f-2c92-461c-62d1b373eee7" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.118 [INFO][3491] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" iface="eth0" netns="/var/run/netns/cni-a9a2166a-7d9f-2c92-461c-62d1b373eee7" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.118 [INFO][3491] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" iface="eth0" netns="/var/run/netns/cni-a9a2166a-7d9f-2c92-461c-62d1b373eee7" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.118 [INFO][3491] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.118 [INFO][3491] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.163 [INFO][3512] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" HandleID="k8s-pod-network.e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.164 [INFO][3512] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.173 [INFO][3512] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.179 [WARNING][3512] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" HandleID="k8s-pod-network.e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.179 [INFO][3512] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" HandleID="k8s-pod-network.e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.180 [INFO][3512] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:26.183579 env[1305]: 2025-03-17 18:34:26.182 [INFO][3491] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:26.183953 env[1305]: time="2025-03-17T18:34:26.183717193Z" level=info msg="TearDown network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\" successfully" Mar 17 18:34:26.183953 env[1305]: time="2025-03-17T18:34:26.183750887Z" level=info msg="StopPodSandbox for \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\" returns successfully" Mar 17 18:34:26.184246 kubelet[2208]: E0317 18:34:26.184212 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:26.184653 env[1305]: time="2025-03-17T18:34:26.184621221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-z2tns,Uid:7b08c7f0-5fd2-4720-bc11-16f99c67d9d2,Namespace:kube-system,Attempt:1,}" Mar 17 18:34:26.214525 kubelet[2208]: E0317 18:34:26.214484 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:26.230796 kubelet[2208]: I0317 18:34:26.230687 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-gl667" podStartSLOduration=1.606184468 podStartE2EDuration="28.230668546s" podCreationTimestamp="2025-03-17 18:33:58 +0000 UTC" firstStartedPulling="2025-03-17 18:33:58.915347064 +0000 UTC m=+20.952936674" lastFinishedPulling="2025-03-17 18:34:25.539831152 +0000 UTC m=+47.577420752" observedRunningTime="2025-03-17 18:34:26.230277822 +0000 UTC m=+48.267867432" watchObservedRunningTime="2025-03-17 18:34:26.230668546 +0000 UTC m=+48.268258156" Mar 17 18:34:26.302055 systemd-networkd[1078]: calie1360bac6e4: Link UP Mar 17 18:34:26.304619 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 18:34:26.304674 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): calie1360bac6e4: link becomes ready Mar 17 18:34:26.304777 systemd-networkd[1078]: calie1360bac6e4: Gained carrier Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.221 [INFO][3534] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.234 [INFO][3534] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0 coredns-7db6d8ff4d- kube-system 6293eeb9-1677-4d5c-b023-b36c91a971a4 886 0 2025-03-17 18:33:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-m5vrn eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie1360bac6e4 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5vrn" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--m5vrn-" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.234 [INFO][3534] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5vrn" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.265 [INFO][3570] 
ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" HandleID="k8s-pod-network.47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.271 [INFO][3570] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" HandleID="k8s-pod-network.47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000434c20), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-m5vrn", "timestamp":"2025-03-17 18:34:26.265021102 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.271 [INFO][3570] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.271 [INFO][3570] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.271 [INFO][3570] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.272 [INFO][3570] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" host="localhost" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.275 [INFO][3570] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.278 [INFO][3570] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.279 [INFO][3570] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.281 [INFO][3570] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.281 [INFO][3570] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" host="localhost" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.282 [INFO][3570] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303 Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.285 [INFO][3570] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" host="localhost" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.290 [INFO][3570] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 handle="k8s-pod-network.47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" host="localhost" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.290 [INFO][3570] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] 
handle="k8s-pod-network.47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" host="localhost" Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.290 [INFO][3570] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:26.317789 env[1305]: 2025-03-17 18:34:26.290 [INFO][3570] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" HandleID="k8s-pod-network.47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.318479 env[1305]: 2025-03-17 18:34:26.293 [INFO][3534] cni-plugin/k8s.go 386: Populated endpoint ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5vrn" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6293eeb9-1677-4d5c-b023-b36c91a971a4", ResourceVersion:"886", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-m5vrn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie1360bac6e4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:26.318479 env[1305]: 2025-03-17 18:34:26.293 [INFO][3534] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.129/32] ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5vrn" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.318479 env[1305]: 2025-03-17 18:34:26.293 [INFO][3534] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie1360bac6e4 ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5vrn" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.318479 env[1305]: 2025-03-17 18:34:26.305 [INFO][3534] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5vrn" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.318479 env[1305]: 2025-03-17 18:34:26.305 [INFO][3534] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5vrn" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6293eeb9-1677-4d5c-b023-b36c91a971a4", ResourceVersion:"886", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303", Pod:"coredns-7db6d8ff4d-m5vrn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie1360bac6e4", MAC:"a2:2c:f2:17:8c:71", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:26.318479 env[1305]: 2025-03-17 18:34:26.316 [INFO][3534] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303" Namespace="kube-system" Pod="coredns-7db6d8ff4d-m5vrn" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:26.333114 env[1305]: time="2025-03-17T18:34:26.333025066Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:34:26.333114 env[1305]: time="2025-03-17T18:34:26.333063138Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:34:26.333335 env[1305]: time="2025-03-17T18:34:26.333072936Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:34:26.333335 env[1305]: time="2025-03-17T18:34:26.333243346Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303 pid=3618 runtime=io.containerd.runc.v2 Mar 17 18:34:26.344506 systemd-networkd[1078]: cali1db9154d887: Link UP Mar 17 18:34:26.347268 systemd-networkd[1078]: cali1db9154d887: Gained carrier Mar 17 18:34:26.347937 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali1db9154d887: link becomes ready Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.244 [INFO][3545] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.259 [INFO][3545] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0 coredns-7db6d8ff4d- kube-system 7b08c7f0-5fd2-4720-bc11-16f99c67d9d2 888 0 2025-03-17 18:33:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-7db6d8ff4d-z2tns eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali1db9154d887 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Namespace="kube-system" Pod="coredns-7db6d8ff4d-z2tns" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--z2tns-" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.259 [INFO][3545] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Namespace="kube-system" Pod="coredns-7db6d8ff4d-z2tns" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.309 [INFO][3589] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" HandleID="k8s-pod-network.188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.316 [INFO][3589] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" HandleID="k8s-pod-network.188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030dc20), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-7db6d8ff4d-z2tns", "timestamp":"2025-03-17 18:34:26.309579851 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.316 [INFO][3589] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.316 [INFO][3589] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.316 [INFO][3589] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.317 [INFO][3589] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" host="localhost" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.322 [INFO][3589] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.325 [INFO][3589] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.327 [INFO][3589] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.329 [INFO][3589] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.329 [INFO][3589] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" host="localhost" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.330 [INFO][3589] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272 Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.335 [INFO][3589] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" host="localhost" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.339 [INFO][3589] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" host="localhost" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.339 [INFO][3589] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" host="localhost" Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.339 [INFO][3589] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 18:34:26.356671 env[1305]: 2025-03-17 18:34:26.339 [INFO][3589] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" HandleID="k8s-pod-network.188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.357476 env[1305]: 2025-03-17 18:34:26.342 [INFO][3545] cni-plugin/k8s.go 386: Populated endpoint ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Namespace="kube-system" Pod="coredns-7db6d8ff4d-z2tns" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7b08c7f0-5fd2-4720-bc11-16f99c67d9d2", ResourceVersion:"888", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-7db6d8ff4d-z2tns", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1db9154d887", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:26.357476 env[1305]: 2025-03-17 18:34:26.342 [INFO][3545] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.130/32] ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Namespace="kube-system" Pod="coredns-7db6d8ff4d-z2tns" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.357476 env[1305]: 2025-03-17 18:34:26.342 [INFO][3545] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1db9154d887 ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Namespace="kube-system" Pod="coredns-7db6d8ff4d-z2tns" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.357476 env[1305]: 2025-03-17 18:34:26.345 [INFO][3545] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Namespace="kube-system" Pod="coredns-7db6d8ff4d-z2tns" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.357476 env[1305]: 2025-03-17 18:34:26.345 [INFO][3545] cni-plugin/k8s.go 414: Added 
Mac, interface name, and active container ID to endpoint ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Namespace="kube-system" Pod="coredns-7db6d8ff4d-z2tns" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7b08c7f0-5fd2-4720-bc11-16f99c67d9d2", ResourceVersion:"888", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272", Pod:"coredns-7db6d8ff4d-z2tns", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1db9154d887", MAC:"82:73:ea:c2:b1:39", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:26.357476 env[1305]: 2025-03-17 18:34:26.354 [INFO][3545] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272" Namespace="kube-system" Pod="coredns-7db6d8ff4d-z2tns" WorkloadEndpoint="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:26.358455 systemd-resolved[1221]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 18:34:26.367381 env[1305]: time="2025-03-17T18:34:26.367209666Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:34:26.367381 env[1305]: time="2025-03-17T18:34:26.367249722Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:34:26.367381 env[1305]: time="2025-03-17T18:34:26.367259440Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:34:26.367509 env[1305]: time="2025-03-17T18:34:26.367405574Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272 pid=3662 runtime=io.containerd.runc.v2 Mar 17 18:34:26.382966 env[1305]: time="2025-03-17T18:34:26.382925045Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-m5vrn,Uid:6293eeb9-1677-4d5c-b023-b36c91a971a4,Namespace:kube-system,Attempt:1,} returns sandbox id \"47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303\"" Mar 17 18:34:26.384862 kubelet[2208]: E0317 18:34:26.384821 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:26.392384 env[1305]: time="2025-03-17T18:34:26.392352829Z" level=info msg="CreateContainer within sandbox \"47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 18:34:26.395208 systemd-resolved[1221]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 18:34:26.416806 env[1305]: time="2025-03-17T18:34:26.416755791Z" level=info msg="CreateContainer within sandbox \"47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e32c2fd966173c409ab5d7a105a24d56b7f1adeaff85a0d258c7ccae79229708\"" Mar 17 18:34:26.418704 env[1305]: time="2025-03-17T18:34:26.418655118Z" level=info msg="StartContainer for \"e32c2fd966173c409ab5d7a105a24d56b7f1adeaff85a0d258c7ccae79229708\"" Mar 17 18:34:26.422250 env[1305]: time="2025-03-17T18:34:26.422207498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-z2tns,Uid:7b08c7f0-5fd2-4720-bc11-16f99c67d9d2,Namespace:kube-system,Attempt:1,} returns sandbox id \"188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272\"" Mar 17 18:34:26.423142 kubelet[2208]: E0317 18:34:26.423119 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:26.424876 env[1305]: time="2025-03-17T18:34:26.424853858Z" level=info msg="CreateContainer within sandbox \"188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 18:34:26.437658 env[1305]: time="2025-03-17T18:34:26.437622093Z" level=info msg="CreateContainer within sandbox \"188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"12da09358c6ef5289b4b6433cab655b7cb609b9a1d1bea8011f5fb587a9537c3\"" Mar 17 18:34:26.438190 env[1305]: time="2025-03-17T18:34:26.438170012Z" level=info msg="StartContainer for \"12da09358c6ef5289b4b6433cab655b7cb609b9a1d1bea8011f5fb587a9537c3\"" Mar 17 18:34:26.467796 env[1305]: time="2025-03-17T18:34:26.467753853Z" level=info msg="StartContainer for \"e32c2fd966173c409ab5d7a105a24d56b7f1adeaff85a0d258c7ccae79229708\" returns successfully" Mar 17 18:34:26.478142 env[1305]: time="2025-03-17T18:34:26.478107284Z" level=info msg="StartContainer for \"12da09358c6ef5289b4b6433cab655b7cb609b9a1d1bea8011f5fb587a9537c3\" returns successfully" Mar 17 18:34:26.552133 systemd[1]: 
run-netns-cni\x2da9a2166a\x2d7d9f\x2d2c92\x2d461c\x2d62d1b373eee7.mount: Deactivated successfully. Mar 17 18:34:26.552754 systemd[1]: run-netns-cni\x2dbb42b29c\x2dda58\x2da481\x2debd6\x2da83b55db2d07.mount: Deactivated successfully. Mar 17 18:34:27.214000 audit[3816]: AVC avc: denied { write } for pid=3816 comm="tee" name="fd" dev="proc" ino=24433 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:27.214000 audit[3816]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffdc0406a1a a2=241 a3=1b6 items=1 ppid=3793 pid=3816 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.214000 audit: CWD cwd="/etc/service/enabled/allocate-tunnel-addrs/log" Mar 17 18:34:27.214000 audit: PATH item=0 name="/dev/fd/63" inode=26680 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:27.214000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:27.218696 kubelet[2208]: E0317 18:34:27.218655 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:27.219000 audit[3837]: AVC avc: denied { write } for pid=3837 comm="tee" name="fd" dev="proc" ino=25229 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:27.220000 audit[3832]: AVC avc: denied { write } for pid=3832 comm="tee" name="fd" dev="proc" ino=25230 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:27.220000 audit[3832]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffdce0f7a2a a2=241 a3=1b6 items=1 ppid=3794 pid=3832 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.220000 audit: CWD cwd="/etc/service/enabled/bird6/log" Mar 17 18:34:27.220000 audit: PATH item=0 name="/dev/fd/63" inode=25216 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:27.220000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:27.219000 audit[3837]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffeb1d7aa2c a2=241 a3=1b6 items=1 ppid=3795 pid=3837 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.221841 kubelet[2208]: E0317 18:34:27.221773 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:27.219000 audit: CWD cwd="/etc/service/enabled/cni/log" Mar 17 18:34:27.219000 audit: PATH item=0 name="/dev/fd/63" inode=26689 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 
obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:27.222184 kubelet[2208]: E0317 18:34:27.222130 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:27.219000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:27.242000 audit[3871]: AVC avc: denied { write } for pid=3871 comm="tee" name="fd" dev="proc" ino=26697 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:27.242000 audit[3871]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffe640fda2a a2=241 a3=1b6 items=1 ppid=3805 pid=3871 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.242000 audit: CWD cwd="/etc/service/enabled/confd/log" Mar 17 18:34:27.242000 audit: PATH item=0 name="/dev/fd/63" inode=24468 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:27.242000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:27.265692 kubelet[2208]: I0317 18:34:27.260794 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-m5vrn" podStartSLOduration=35.260774959 podStartE2EDuration="35.260774959s" podCreationTimestamp="2025-03-17 18:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:34:27.231524979 +0000 UTC m=+49.269114589" watchObservedRunningTime="2025-03-17 18:34:27.260774959 +0000 UTC m=+49.298364559" Mar 17 18:34:27.265000 audit[3856]: AVC avc: denied { write } for pid=3856 comm="tee" name="fd" dev="proc" ino=25242 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:27.265000 audit[3856]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7fffd2bf3a2a a2=241 a3=1b6 items=1 ppid=3800 pid=3856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.265000 audit: CWD cwd="/etc/service/enabled/felix/log" Mar 17 18:34:27.265000 audit: PATH item=0 name="/dev/fd/63" inode=25233 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:27.265000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:27.268000 audit[3878]: AVC avc: denied { write } for pid=3878 comm="tee" name="fd" dev="proc" ino=25246 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:27.272000 audit[3884]: NETFILTER_CFG table=filter:97 family=2 entries=16 op=nft_register_rule pid=3884 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Mar 17 18:34:27.272000 audit[3884]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffef7e0e8d0 a2=0 a3=7ffef7e0e8bc items=0 ppid=2398 pid=3884 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.272000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:27.268000 audit[3878]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffeccb01a1b a2=241 a3=1b6 items=1 ppid=3804 pid=3878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.268000 audit: CWD cwd="/etc/service/enabled/node-status-reporter/log" Mar 17 18:34:27.268000 audit: PATH item=0 name="/dev/fd/63" inode=25239 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:27.268000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:27.282000 audit[3884]: NETFILTER_CFG table=nat:98 family=2 entries=14 op=nft_register_rule pid=3884 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:27.282000 audit[3884]: SYSCALL arch=c000003e syscall=46 success=yes exit=3468 a0=3 a1=7ffef7e0e8d0 a2=0 a3=0 items=0 ppid=2398 pid=3884 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.282000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:27.285000 audit[3886]: AVC avc: denied { write } for pid=3886 comm="tee" name="fd" dev="proc" ino=26710 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:27.285000 audit[3886]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffed6a09a2b a2=241 a3=1b6 items=1 ppid=3792 pid=3886 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.285000 audit: CWD cwd="/etc/service/enabled/bird/log" Mar 17 18:34:27.285000 audit: PATH item=0 name="/dev/fd/63" inode=25271 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:27.285000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:27.296000 audit[3891]: NETFILTER_CFG table=filter:99 family=2 entries=13 op=nft_register_rule pid=3891 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:27.296000 audit[3891]: SYSCALL arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7ffee33712a0 a2=0 a3=7ffee337128c items=0 ppid=2398 pid=3891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.296000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:27.302000 audit[3891]: NETFILTER_CFG table=nat:100 family=2 entries=35 op=nft_register_chain pid=3891 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:27.302000 audit[3891]: SYSCALL arch=c000003e syscall=46 success=yes exit=14196 a0=3 a1=7ffee33712a0 a2=0 a3=7ffee337128c items=0 ppid=2398 pid=3891 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.302000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:27.349934 kubelet[2208]: I0317 18:34:27.349855 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-z2tns" podStartSLOduration=35.349833496 podStartE2EDuration="35.349833496s" podCreationTimestamp="2025-03-17 18:33:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:34:27.270769785 +0000 UTC m=+49.308359395" watchObservedRunningTime="2025-03-17 18:34:27.349833496 +0000 UTC m=+49.387423106" Mar 17 18:34:27.626000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.634719 kernel: kauditd_printk_skb: 48 callbacks suppressed Mar 17 18:34:27.634838 kernel: audit: type=1400 audit(1742236467.626:331): avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.626000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.644412 kernel: audit: type=1400 audit(1742236467.626:331): avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.644486 kernel: audit: type=1400 audit(1742236467.626:331): avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.626000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.626000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.647645 kernel: audit: type=1400 audit(1742236467.626:331): avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.647708 kernel: audit: type=1400 audit(1742236467.626:331): avc: denied { perfmon } for pid=3938 comm="bpftool" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.626000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.653530 kernel: audit: type=1400 audit(1742236467.626:331): avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.626000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.626000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.656569 kernel: audit: type=1400 audit(1742236467.626:331): avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.656638 kernel: audit: type=1400 audit(1742236467.626:331): avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.626000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.659400 kernel: audit: type=1400 audit(1742236467.626:331): avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.626000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.626000 audit: BPF prog-id=10 op=LOAD Mar 17 18:34:27.663211 kernel: audit: type=1334 audit(1742236467.626:331): prog-id=10 op=LOAD Mar 17 18:34:27.626000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffec246730 a2=98 a3=3 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.626000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.636000 audit: BPF prog-id=10 op=UNLOAD Mar 17 18:34:27.637000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.637000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.637000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.637000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.637000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.637000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.637000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.637000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.637000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.637000 audit: BPF prog-id=11 op=LOAD Mar 17 18:34:27.637000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fffec246510 a2=74 a3=540051 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.637000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.640000 audit: BPF prog-id=11 op=UNLOAD Mar 17 18:34:27.640000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.640000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.640000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.640000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.640000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.640000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.640000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.640000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.640000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.640000 audit: BPF prog-id=12 op=LOAD Mar 17 18:34:27.640000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fffec246540 a2=94 a3=2 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.640000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.640000 audit: BPF prog-id=12 op=UNLOAD Mar 17 18:34:27.678633 systemd[1]: Started sshd@11-10.0.0.12:22-10.0.0.1:51154.service. Mar 17 18:34:27.677000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.0.12:22-10.0.0.1:51154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:27.711000 audit[3939]: USER_ACCT pid=3939 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.713119 sshd[3939]: Accepted publickey for core from 10.0.0.1 port 51154 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:27.712000 audit[3939]: CRED_ACQ pid=3939 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.712000 audit[3939]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdfc423ad0 a2=3 a3=0 items=0 ppid=1 pid=3939 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.712000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:27.714298 sshd[3939]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:27.718219 systemd-logind[1287]: New session 12 of user core. Mar 17 18:34:27.718984 systemd[1]: Started session-12.scope. 
Mar 17 18:34:27.722000 audit[3939]: USER_START pid=3939 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.723000 audit[3942]: CRED_ACQ pid=3942 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit: BPF prog-id=13 op=LOAD Mar 17 18:34:27.763000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fffec246400 a2=40 a3=1 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.763000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.763000 audit: BPF prog-id=13 op=UNLOAD Mar 17 18:34:27.763000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.763000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=0 a1=7fffec2464d0 a2=50 a3=7fffec2465b0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.763000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fffec246410 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7fffec246440 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7fffec246350 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fffec246460 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fffec246440 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fffec246430 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fffec246460 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7fffec246440 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7fffec246460 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7fffec246430 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fffec2464a0 a2=28 a3=0 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 
key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7fffec246250 a2=50 a3=1 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit: BPF prog-id=14 op=LOAD Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fffec246250 a2=94 a3=5 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit: BPF prog-id=14 op=UNLOAD Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7fffec246300 a2=50 a3=1 
items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=16 a1=7fffec246420 a2=4 a3=38 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.771000 audit[3938]: AVC avc: denied { confidentiality } for pid=3938 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:27.771000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7fffec246470 a2=94 a3=6 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { confidentiality } for pid=3938 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:27.772000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7fffec245c20 a2=94 a3=83 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.772000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 
comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { perfmon } for pid=3938 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { bpf } for pid=3938 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.772000 audit[3938]: AVC avc: denied { confidentiality } for pid=3938 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:27.772000 audit[3938]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7fffec245c20 a2=94 a3=83 items=0 ppid=3801 pid=3938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.772000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit: BPF prog-id=15 op=LOAD Mar 17 18:34:27.779000 audit[3955]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffac7344b0 a2=98 a3=1999999999999999 items=0 ppid=3801 pid=3955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.779000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:34:27.779000 audit: BPF prog-id=15 op=UNLOAD Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.779000 audit: BPF prog-id=16 op=LOAD Mar 17 18:34:27.779000 audit[3955]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 
a1=7fffac734390 a2=74 a3=ffff items=0 ppid=3801 pid=3955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.779000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:34:27.780000 audit: BPF prog-id=16 op=UNLOAD Mar 17 18:34:27.780000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.780000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.780000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.780000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.780000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.780000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.780000 audit[3955]: AVC avc: denied { perfmon } for pid=3955 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.780000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.780000 audit[3955]: AVC avc: denied { bpf } for pid=3955 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.780000 audit: BPF prog-id=17 op=LOAD Mar 17 18:34:27.780000 audit[3955]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fffac7343d0 a2=40 a3=7fffac7345b0 items=0 ppid=3801 pid=3955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.780000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:34:27.780000 audit: BPF prog-id=17 op=UNLOAD Mar 17 18:34:27.826809 systemd-networkd[1078]: vxlan.calico: Link UP Mar 17 18:34:27.826821 systemd-networkd[1078]: vxlan.calico: Gained carrier Mar 17 18:34:27.827004 systemd-networkd[1078]: calie1360bac6e4: Gained IPv6LL Mar 17 18:34:27.839000 audit[3980]: AVC avc: 
denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit: BPF prog-id=18 op=LOAD Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd05a36d90 a2=98 a3=ffffffff items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit: BPF prog-id=18 op=UNLOAD Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit: BPF prog-id=19 op=LOAD Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd05a36ba0 a2=74 a3=540051 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit: BPF prog-id=19 op=UNLOAD Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit: BPF prog-id=20 op=LOAD Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffd05a36bd0 a2=94 a3=2 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit: BPF prog-id=20 op=UNLOAD Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd05a36aa0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd05a36ad0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd05a369e0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd05a36af0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd05a36ad0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd05a36ac0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd05a36af0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd05a36ad0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd05a36af0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffd05a36ac0 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffd05a36b30 a2=28 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for 
pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.839000 audit: BPF prog-id=21 op=LOAD Mar 17 18:34:27.839000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd05a369a0 a2=40 a3=0 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.839000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.839000 audit: BPF prog-id=21 op=UNLOAD Mar 17 18:34:27.842000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.842000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=0 a1=7ffd05a36990 a2=50 a3=2800 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.842000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.842000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.842000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=0 a1=7ffd05a36990 a2=50 a3=2800 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.842000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 
audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit: BPF prog-id=22 op=LOAD Mar 17 18:34:27.843000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd05a361b0 a2=94 a3=2 items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.843000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.843000 audit: BPF prog-id=22 op=UNLOAD Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { perfmon } for pid=3980 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit[3980]: AVC avc: denied { bpf } for pid=3980 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.843000 audit: BPF prog-id=23 op=LOAD Mar 17 18:34:27.843000 audit[3980]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffd05a362b0 a2=94 a3=2d items=0 ppid=3801 pid=3980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.843000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:27.847000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.847000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.847000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.847000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.847000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.847000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.847000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.847000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 
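The proctitle= field in the audit records above is the invoking command line, hex-encoded with NUL bytes separating the arguments. A minimal Python sketch (not part of the log) that decodes such a value, shown here on the pid 3980 value from the records above:

    # Decode an audit PROCTITLE value: hex-encoded argv joined by NUL bytes.
    def decode_proctitle(hex_value: str) -> str:
        return bytes.fromhex(hex_value).replace(b"\x00", b" ").decode("utf-8", errors="replace")

    print(decode_proctitle(
        "627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F"
        "6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864"
        "702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470"
    ))
    # prints: bpftool prog load /usr/lib/calico/bpf/filter.o /sys/fs/bpf/calico/xdp/prefilter_v1_calico_tmp_A type xdp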
Mar 17 18:34:27.847000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.847000 audit: BPF prog-id=24 op=LOAD Mar 17 18:34:27.847000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffddfeeeff0 a2=98 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.847000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.847000 audit: BPF prog-id=24 op=UNLOAD Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit: BPF prog-id=25 op=LOAD Mar 17 18:34:27.848000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffddfeeedd0 a2=74 a3=540051 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.848000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.848000 audit: BPF prog-id=25 op=UNLOAD Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { bpf } for pid=3984 
comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.848000 audit: BPF prog-id=26 op=LOAD Mar 17 18:34:27.848000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffddfeeee00 a2=94 a3=2 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.848000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.849000 audit: BPF prog-id=26 op=UNLOAD Mar 17 18:34:27.852482 sshd[3939]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:27.853499 systemd[1]: Started sshd@12-10.0.0.12:22-10.0.0.1:51168.service. Mar 17 18:34:27.852000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.0.12:22-10.0.0.1:51168 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:27.857000 audit[3939]: USER_END pid=3939 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.857000 audit[3939]: CRED_DISP pid=3939 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.859650 systemd[1]: sshd@11-10.0.0.12:22-10.0.0.1:51154.service: Deactivated successfully. Mar 17 18:34:27.858000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.0.12:22-10.0.0.1:51154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:27.860284 systemd[1]: session-12.scope: Deactivated successfully. Mar 17 18:34:27.861272 systemd-logind[1287]: Session 12 logged out. Waiting for processes to exit. Mar 17 18:34:27.862079 systemd-logind[1287]: Removed session 12. Mar 17 18:34:27.885000 audit[3992]: USER_ACCT pid=3992 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.886308 sshd[3992]: Accepted publickey for core from 10.0.0.1 port 51168 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:27.886000 audit[3992]: CRED_ACQ pid=3992 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.886000 audit[3992]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcd344d090 a2=3 a3=0 items=0 ppid=1 pid=3992 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.886000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:27.887642 sshd[3992]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:27.892448 systemd-networkd[1078]: cali1db9154d887: Gained IPv6LL Mar 17 18:34:27.892635 systemd[1]: Started session-13.scope. Mar 17 18:34:27.893206 systemd-logind[1287]: New session 13 of user core. 
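The AVC records in this excerpt repeatedly deny the bpf (capability 39) and perfmon (capability 38) capabilities, along with lockdown denials for "use of bpf to read kernel RAM", all for bpftool running in the kernel_t domain. A small Python sketch (assuming the excerpt is saved under the hypothetical name audit-excerpt.log) that tallies which permissions are being denied:

    import re
    from collections import Counter

    denied = Counter()
    with open("audit-excerpt.log") as fh:   # hypothetical file holding the records above
        for line in fh:
            # Field layout follows the records above: "avc: denied { perm } for ..."
            for perms in re.findall(r"avc:\s+denied\s+\{ ([^}]+) \}", line):
                denied.update(perms.split())
    print(denied.most_common())             # e.g. [('bpf', ...), ('perfmon', ...), ('confidentiality', ...)]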
Mar 17 18:34:27.897000 audit[3992]: USER_START pid=3992 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.899000 audit[3997]: CRED_ACQ pid=3997 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:27.957000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.957000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.957000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.957000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.957000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.957000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.957000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.957000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.957000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.957000 audit: BPF prog-id=27 op=LOAD Mar 17 18:34:27.957000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffddfeeecc0 a2=40 a3=1 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.957000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.958000 audit: BPF prog-id=27 op=UNLOAD Mar 17 18:34:27.958000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.958000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=0 a1=7ffddfeeed90 a2=50 a3=7ffddfeeee70 items=0 ppid=3801 
pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.958000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffddfeeecd0 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffddfeeed00 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffddfeeec10 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffddfeeed20 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: 
denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffddfeeed00 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffddfeeecf0 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffddfeeed20 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffddfeeed00 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffddfeeed20 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffddfeeecf0 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffddfeeed60 a2=28 a3=0 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffddfeeeb10 a2=50 a3=1 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit: BPF prog-id=28 op=LOAD Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffddfeeeb10 a2=94 a3=5 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit: BPF prog-id=28 op=UNLOAD Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffddfeeebc0 a2=50 a3=1 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=16 a1=7ffddfeeece0 a2=4 a3=38 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.965000 audit[3984]: AVC avc: denied { confidentiality } for pid=3984 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:27.965000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffddfeeed30 a2=94 a3=6 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.965000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { confidentiality } for pid=3984 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:27.966000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffddfeee4e0 a2=94 a3=83 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.966000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: 
AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { perfmon } for pid=3984 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { confidentiality } for pid=3984 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:27.966000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffddfeee4e0 a2=94 a3=83 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.966000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffddfeeff20 a2=10 a3=208 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.966000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffddfeefdc0 a2=10 a3=3 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.966000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffddfeefd60 a2=10 a3=3 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.966000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.966000 audit[3984]: AVC avc: denied { bpf } for pid=3984 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:27.966000 audit[3984]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffddfeefd60 a2=10 a3=7 items=0 ppid=3801 pid=3984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:27.966000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:27.973000 audit: BPF prog-id=23 op=UNLOAD Mar 17 18:34:28.022000 audit[4027]: NETFILTER_CFG table=mangle:101 family=2 entries=16 op=nft_register_chain pid=4027 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:28.022000 audit[4027]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffe0a4935d0 a2=0 a3=7ffe0a4935bc items=0 ppid=3801 pid=4027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:28.022000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:28.034000 audit[4028]: NETFILTER_CFG table=filter:102 family=2 entries=91 op=nft_register_chain pid=4028 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:28.034000 audit[4028]: SYSCALL arch=c000003e syscall=46 success=yes exit=50536 a0=3 a1=7ffc8f4b3490 a2=0 a3=7ffc8f4b347c items=0 ppid=3801 pid=4028 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:28.034000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:28.035000 audit[4026]: NETFILTER_CFG table=raw:103 family=2 entries=21 op=nft_register_chain pid=4026 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:28.035000 audit[4026]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7ffca915f990 a2=0 a3=7ffca915f97c items=0 ppid=3801 pid=4026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:28.035000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:28.037000 audit[4032]: NETFILTER_CFG table=nat:104 family=2 entries=15 op=nft_register_chain pid=4032 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:28.037000 audit[4032]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffce75d2ad0 a2=0 a3=7ffce75d2abc items=0 ppid=3801 pid=4032 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:28.037000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:28.049705 env[1305]: time="2025-03-17T18:34:28.049645997Z" level=info msg="StopPodSandbox for \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\"" Mar 17 18:34:28.098256 sshd[3992]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:28.100000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.0.12:22-10.0.0.1:51178 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:28.101542 systemd[1]: Started sshd@13-10.0.0.12:22-10.0.0.1:51178.service. Mar 17 18:34:28.101000 audit[3992]: USER_END pid=3992 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:28.101000 audit[3992]: CRED_DISP pid=3992 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:28.103000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.0.12:22-10.0.0.1:51168 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:28.103962 systemd[1]: sshd@12-10.0.0.12:22-10.0.0.1:51168.service: Deactivated successfully. Mar 17 18:34:28.105066 systemd[1]: session-13.scope: Deactivated successfully. Mar 17 18:34:28.105128 systemd-logind[1287]: Session 13 logged out. Waiting for processes to exit. Mar 17 18:34:28.106270 systemd-logind[1287]: Removed session 13. Mar 17 18:34:28.136000 audit[4060]: USER_ACCT pid=4060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:28.137000 audit[4060]: CRED_ACQ pid=4060 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:28.137000 audit[4060]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffce68e8250 a2=3 a3=0 items=0 ppid=1 pid=4060 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:28.137000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:28.139025 sshd[4060]: Accepted publickey for core from 10.0.0.1 port 51178 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:28.138377 sshd[4060]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:28.143812 systemd[1]: Started session-14.scope. 
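[Annotation, not part of the captured log] The audit records above can be read as follows: on x86_64 (arch=c000003e) syscall=321 is bpf(2) and syscall=46 is sendmsg(2) (used by iptables-nft to push nftables rules over netlink), while the denied capabilities 39 and 38 are CAP_BPF and CAP_PERFMON from linux/capability.h. The PROCTITLE field is the audited command line, hex-encoded with NUL-separated argv elements. A minimal decoding sketch in Python, using the bpftool PROCTITLE value repeated in the records above (variable names are illustrative only):

    # Decode an audit PROCTITLE hex blob; argv elements are separated by NUL bytes.
    proctitle_hex = (
        "627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E656400"
        "2F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41"
    )
    argv = bytes.fromhex(proctitle_hex).split(b"\x00")
    print(" ".join(a.decode() for a in argv))
    # -> bpftool --json --pretty prog show pinned /sys/fs/bpf/calico/xdp/prefilter_v1_calico_tmp_A

The same decoding applied to the iptables-nft-re PROCTITLE records above yields: iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000.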
Mar 17 18:34:28.144283 systemd-logind[1287]: New session 14 of user core. Mar 17 18:34:28.148000 audit[4060]: USER_START pid=4060 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:28.157000 audit[4072]: CRED_ACQ pid=4072 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.115 [INFO][4052] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.115 [INFO][4052] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" iface="eth0" netns="/var/run/netns/cni-96d52dff-e383-7088-4186-33dd990a624c" Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.116 [INFO][4052] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" iface="eth0" netns="/var/run/netns/cni-96d52dff-e383-7088-4186-33dd990a624c" Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.116 [INFO][4052] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" iface="eth0" netns="/var/run/netns/cni-96d52dff-e383-7088-4186-33dd990a624c" Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.117 [INFO][4052] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.117 [INFO][4052] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.150 [INFO][4064] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" HandleID="k8s-pod-network.edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.150 [INFO][4064] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.150 [INFO][4064] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.169 [WARNING][4064] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" HandleID="k8s-pod-network.edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.169 [INFO][4064] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" HandleID="k8s-pod-network.edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.180 [INFO][4064] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:28.183185 env[1305]: 2025-03-17 18:34:28.181 [INFO][4052] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:28.186264 systemd[1]: run-netns-cni\x2d96d52dff\x2de383\x2d7088\x2d4186\x2d33dd990a624c.mount: Deactivated successfully. Mar 17 18:34:28.187134 env[1305]: time="2025-03-17T18:34:28.187088429Z" level=info msg="TearDown network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\" successfully" Mar 17 18:34:28.187134 env[1305]: time="2025-03-17T18:34:28.187131952Z" level=info msg="StopPodSandbox for \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\" returns successfully" Mar 17 18:34:28.187732 env[1305]: time="2025-03-17T18:34:28.187690350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-649897fff-xtjrl,Uid:78d53884-0ccd-4a74-9597-a16b3590ffd8,Namespace:calico-apiserver,Attempt:1,}" Mar 17 18:34:28.224551 kubelet[2208]: E0317 18:34:28.223416 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:28.224551 kubelet[2208]: E0317 18:34:28.223646 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:28.224551 kubelet[2208]: E0317 18:34:28.223756 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:28.484000 audit[4104]: NETFILTER_CFG table=filter:105 family=2 entries=10 op=nft_register_rule pid=4104 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:28.484000 audit[4104]: SYSCALL arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7ffec9d0a320 a2=0 a3=7ffec9d0a30c items=0 ppid=2398 pid=4104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:28.484000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:28.549468 sshd[4060]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:28.549000 audit[4060]: USER_END pid=4060 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 
18:34:28.549000 audit[4060]: CRED_DISP pid=4060 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:28.552224 systemd[1]: sshd@13-10.0.0.12:22-10.0.0.1:51178.service: Deactivated successfully. Mar 17 18:34:28.553261 systemd[1]: session-14.scope: Deactivated successfully. Mar 17 18:34:28.551000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.0.12:22-10.0.0.1:51178 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:28.552000 audit[4104]: NETFILTER_CFG table=nat:106 family=2 entries=56 op=nft_register_chain pid=4104 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:28.552000 audit[4104]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffec9d0a320 a2=0 a3=7ffec9d0a30c items=0 ppid=2398 pid=4104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:28.552000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:28.554515 systemd-logind[1287]: Session 14 logged out. Waiting for processes to exit. Mar 17 18:34:28.555526 systemd-logind[1287]: Removed session 14. Mar 17 18:34:29.048813 env[1305]: time="2025-03-17T18:34:29.048772200Z" level=info msg="StopPodSandbox for \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\"" Mar 17 18:34:29.049097 env[1305]: time="2025-03-17T18:34:29.049058588Z" level=info msg="StopPodSandbox for \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\"" Mar 17 18:34:29.105082 systemd-networkd[1078]: vxlan.calico: Gained IPv6LL Mar 17 18:34:29.224905 kubelet[2208]: E0317 18:34:29.224866 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:29.225996 kubelet[2208]: E0317 18:34:29.225139 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.259 [INFO][4141] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.259 [INFO][4141] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" iface="eth0" netns="/var/run/netns/cni-ea8118ca-1369-ba59-1464-c87f0c3b4b24" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.260 [INFO][4141] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" iface="eth0" netns="/var/run/netns/cni-ea8118ca-1369-ba59-1464-c87f0c3b4b24" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.260 [INFO][4141] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" iface="eth0" netns="/var/run/netns/cni-ea8118ca-1369-ba59-1464-c87f0c3b4b24" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.260 [INFO][4141] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.260 [INFO][4141] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.279 [INFO][4163] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" HandleID="k8s-pod-network.e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.279 [INFO][4163] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.279 [INFO][4163] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.299 [WARNING][4163] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" HandleID="k8s-pod-network.e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.299 [INFO][4163] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" HandleID="k8s-pod-network.e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.301 [INFO][4163] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:29.310166 env[1305]: 2025-03-17 18:34:29.307 [INFO][4141] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:29.311124 env[1305]: time="2025-03-17T18:34:29.311083354Z" level=info msg="TearDown network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\" successfully" Mar 17 18:34:29.311208 env[1305]: time="2025-03-17T18:34:29.311119732Z" level=info msg="StopPodSandbox for \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\" returns successfully" Mar 17 18:34:29.311794 env[1305]: time="2025-03-17T18:34:29.311766747Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-649897fff-47dt2,Uid:9442466c-0eea-4718-be43-447a2a2a790c,Namespace:calico-apiserver,Attempt:1,}" Mar 17 18:34:29.313359 systemd[1]: run-netns-cni\x2dea8118ca\x2d1369\x2dba59\x2d1464\x2dc87f0c3b4b24.mount: Deactivated successfully. Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.169 [INFO][4142] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.169 [INFO][4142] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. 
ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" iface="eth0" netns="/var/run/netns/cni-7a5bb063-46cd-03b4-acd0-07f13392c399" Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.169 [INFO][4142] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" iface="eth0" netns="/var/run/netns/cni-7a5bb063-46cd-03b4-acd0-07f13392c399" Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.170 [INFO][4142] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" iface="eth0" netns="/var/run/netns/cni-7a5bb063-46cd-03b4-acd0-07f13392c399" Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.170 [INFO][4142] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.170 [INFO][4142] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.298 [INFO][4156] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" HandleID="k8s-pod-network.a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.298 [INFO][4156] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.301 [INFO][4156] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.307 [WARNING][4156] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" HandleID="k8s-pod-network.a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.307 [INFO][4156] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" HandleID="k8s-pod-network.a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.313 [INFO][4156] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:29.318505 env[1305]: 2025-03-17 18:34:29.315 [INFO][4142] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:29.321273 systemd[1]: run-netns-cni\x2d7a5bb063\x2d46cd\x2d03b4\x2dacd0\x2d07f13392c399.mount: Deactivated successfully. 
Mar 17 18:34:29.322293 env[1305]: time="2025-03-17T18:34:29.322245569Z" level=info msg="TearDown network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\" successfully" Mar 17 18:34:29.322293 env[1305]: time="2025-03-17T18:34:29.322290704Z" level=info msg="StopPodSandbox for \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\" returns successfully" Mar 17 18:34:29.322977 env[1305]: time="2025-03-17T18:34:29.322940714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f5cc97b76-r9pvl,Uid:84b5b251-95c6-46de-a5be-62ebfcea6ad9,Namespace:calico-system,Attempt:1,}" Mar 17 18:34:29.458676 systemd-networkd[1078]: cali969b63fcad9: Link UP Mar 17 18:34:29.461122 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 18:34:29.461251 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali969b63fcad9: link becomes ready Mar 17 18:34:29.461258 systemd-networkd[1078]: cali969b63fcad9: Gained carrier Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.339 [INFO][4171] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0 calico-apiserver-649897fff- calico-apiserver 78d53884-0ccd-4a74-9597-a16b3590ffd8 943 0 2025-03-17 18:33:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:649897fff projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-649897fff-xtjrl eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali969b63fcad9 [] []}} ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-xtjrl" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--xtjrl-" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.340 [INFO][4171] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-xtjrl" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.386 [INFO][4185] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" HandleID="k8s-pod-network.76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.397 [INFO][4185] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" HandleID="k8s-pod-network.76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000364950), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-649897fff-xtjrl", "timestamp":"2025-03-17 18:34:29.386042928 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:34:29.481156 
env[1305]: 2025-03-17 18:34:29.397 [INFO][4185] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.397 [INFO][4185] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.397 [INFO][4185] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.399 [INFO][4185] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" host="localhost" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.403 [INFO][4185] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.406 [INFO][4185] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.408 [INFO][4185] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.410 [INFO][4185] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.410 [INFO][4185] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" host="localhost" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.418 [INFO][4185] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.440 [INFO][4185] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" host="localhost" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.454 [INFO][4185] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" host="localhost" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.454 [INFO][4185] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" host="localhost" Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.454 [INFO][4185] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
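[Annotation, not part of the captured log] In the IPAM records above, Calico assigns the pod address from the block 192.168.88.128/26 for which this host holds the affinity; the claimed address 192.168.88.131 falls inside that 64-address block (192.168.88.128-192.168.88.191). A quick sanity check of that arithmetic, as a Python sketch with values copied from the log:

    import ipaddress
    block = ipaddress.ip_network("192.168.88.128/26")   # host-affine block from the IPAM records
    addr = ipaddress.ip_address("192.168.88.131")       # address claimed for calico-apiserver-649897fff-xtjrl
    print(block[0], block[-1], block.num_addresses)     # 192.168.88.128 192.168.88.191 64
    print(addr in block)                                # True: the claimed address lies inside the block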
Mar 17 18:34:29.481156 env[1305]: 2025-03-17 18:34:29.454 [INFO][4185] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" HandleID="k8s-pod-network.76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:29.481761 env[1305]: 2025-03-17 18:34:29.457 [INFO][4171] cni-plugin/k8s.go 386: Populated endpoint ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-xtjrl" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0", GenerateName:"calico-apiserver-649897fff-", Namespace:"calico-apiserver", SelfLink:"", UID:"78d53884-0ccd-4a74-9597-a16b3590ffd8", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"649897fff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-649897fff-xtjrl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali969b63fcad9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:29.481761 env[1305]: 2025-03-17 18:34:29.457 [INFO][4171] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.131/32] ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-xtjrl" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:29.481761 env[1305]: 2025-03-17 18:34:29.457 [INFO][4171] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali969b63fcad9 ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-xtjrl" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:29.481761 env[1305]: 2025-03-17 18:34:29.458 [INFO][4171] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-xtjrl" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:29.481761 env[1305]: 2025-03-17 18:34:29.461 [INFO][4171] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Namespace="calico-apiserver" 
Pod="calico-apiserver-649897fff-xtjrl" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0", GenerateName:"calico-apiserver-649897fff-", Namespace:"calico-apiserver", SelfLink:"", UID:"78d53884-0ccd-4a74-9597-a16b3590ffd8", ResourceVersion:"943", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"649897fff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c", Pod:"calico-apiserver-649897fff-xtjrl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali969b63fcad9", MAC:"ea:4c:1f:6d:b0:fa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:29.481761 env[1305]: 2025-03-17 18:34:29.470 [INFO][4171] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-xtjrl" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:29.493000 audit[4218]: NETFILTER_CFG table=filter:107 family=2 entries=48 op=nft_register_chain pid=4218 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:29.493000 audit[4218]: SYSCALL arch=c000003e syscall=46 success=yes exit=25868 a0=3 a1=7fff053cb960 a2=0 a3=7fff053cb94c items=0 ppid=3801 pid=4218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:29.493000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:29.531141 env[1305]: time="2025-03-17T18:34:29.530874524Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:34:29.531360 env[1305]: time="2025-03-17T18:34:29.531116369Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:34:29.531360 env[1305]: time="2025-03-17T18:34:29.531134172Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:34:29.531780 env[1305]: time="2025-03-17T18:34:29.531536738Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c pid=4241 runtime=io.containerd.runc.v2 Mar 17 18:34:29.595735 systemd-resolved[1221]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 18:34:29.624488 env[1305]: time="2025-03-17T18:34:29.624445328Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-649897fff-xtjrl,Uid:78d53884-0ccd-4a74-9597-a16b3590ffd8,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c\"" Mar 17 18:34:29.626323 env[1305]: time="2025-03-17T18:34:29.626305410Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Mar 17 18:34:30.040108 systemd-networkd[1078]: calica2756a831d: Link UP Mar 17 18:34:30.041719 systemd-networkd[1078]: calica2756a831d: Gained carrier Mar 17 18:34:30.042141 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): calica2756a831d: link becomes ready Mar 17 18:34:30.049185 env[1305]: time="2025-03-17T18:34:30.049141065Z" level=info msg="StopPodSandbox for \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\"" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.524 [INFO][4208] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0 calico-kube-controllers-7f5cc97b76- calico-system 84b5b251-95c6-46de-a5be-62ebfcea6ad9 963 0 2025-03-17 18:33:58 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7f5cc97b76 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7f5cc97b76-r9pvl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calica2756a831d [] []}} ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Namespace="calico-system" Pod="calico-kube-controllers-7f5cc97b76-r9pvl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.524 [INFO][4208] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Namespace="calico-system" Pod="calico-kube-controllers-7f5cc97b76-r9pvl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.590 [INFO][4262] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" HandleID="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.654 [INFO][4262] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" HandleID="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030bac0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7f5cc97b76-r9pvl", "timestamp":"2025-03-17 18:34:29.590323991 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.654 [INFO][4262] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.654 [INFO][4262] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.654 [INFO][4262] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.656 [INFO][4262] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" host="localhost" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.659 [INFO][4262] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.821 [INFO][4262] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.823 [INFO][4262] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.835 [INFO][4262] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.835 [INFO][4262] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" host="localhost" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.839 [INFO][4262] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65 Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:29.852 [INFO][4262] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" host="localhost" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:30.035 [INFO][4262] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" host="localhost" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:30.035 [INFO][4262] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" host="localhost" Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:30.036 [INFO][4262] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 18:34:30.093359 env[1305]: 2025-03-17 18:34:30.036 [INFO][4262] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" HandleID="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:30.094031 env[1305]: 2025-03-17 18:34:30.038 [INFO][4208] cni-plugin/k8s.go 386: Populated endpoint ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Namespace="calico-system" Pod="calico-kube-controllers-7f5cc97b76-r9pvl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0", GenerateName:"calico-kube-controllers-7f5cc97b76-", Namespace:"calico-system", SelfLink:"", UID:"84b5b251-95c6-46de-a5be-62ebfcea6ad9", ResourceVersion:"963", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f5cc97b76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7f5cc97b76-r9pvl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calica2756a831d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:30.094031 env[1305]: 2025-03-17 18:34:30.038 [INFO][4208] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.132/32] ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Namespace="calico-system" Pod="calico-kube-controllers-7f5cc97b76-r9pvl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:30.094031 env[1305]: 2025-03-17 18:34:30.038 [INFO][4208] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calica2756a831d ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Namespace="calico-system" Pod="calico-kube-controllers-7f5cc97b76-r9pvl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:30.094031 env[1305]: 2025-03-17 18:34:30.041 [INFO][4208] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Namespace="calico-system" Pod="calico-kube-controllers-7f5cc97b76-r9pvl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:30.094031 env[1305]: 2025-03-17 18:34:30.041 [INFO][4208] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint 
ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Namespace="calico-system" Pod="calico-kube-controllers-7f5cc97b76-r9pvl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0", GenerateName:"calico-kube-controllers-7f5cc97b76-", Namespace:"calico-system", SelfLink:"", UID:"84b5b251-95c6-46de-a5be-62ebfcea6ad9", ResourceVersion:"963", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f5cc97b76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65", Pod:"calico-kube-controllers-7f5cc97b76-r9pvl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calica2756a831d", MAC:"ba:7c:2a:96:a9:4f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:30.094031 env[1305]: 2025-03-17 18:34:30.091 [INFO][4208] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Namespace="calico-system" Pod="calico-kube-controllers-7f5cc97b76-r9pvl" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:30.101000 audit[4333]: NETFILTER_CFG table=filter:108 family=2 entries=46 op=nft_register_chain pid=4333 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:30.101000 audit[4333]: SYSCALL arch=c000003e syscall=46 success=yes exit=22712 a0=3 a1=7ffcac659ca0 a2=0 a3=7ffcac659c8c items=0 ppid=3801 pid=4333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:30.101000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:30.223312 env[1305]: time="2025-03-17T18:34:30.223237180Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:34:30.223312 env[1305]: time="2025-03-17T18:34:30.223282124Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:34:30.223312 env[1305]: time="2025-03-17T18:34:30.223304025Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:34:30.223524 env[1305]: time="2025-03-17T18:34:30.223495935Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65 pid=4341 runtime=io.containerd.runc.v2 Mar 17 18:34:30.254593 systemd-resolved[1221]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 18:34:30.263236 systemd-networkd[1078]: calidb2980fe4ac: Link UP Mar 17 18:34:30.264980 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): calidb2980fe4ac: link becomes ready Mar 17 18:34:30.265102 systemd-networkd[1078]: calidb2980fe4ac: Gained carrier Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:29.542 [INFO][4220] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--649897fff--47dt2-eth0 calico-apiserver-649897fff- calico-apiserver 9442466c-0eea-4718-be43-447a2a2a790c 964 0 2025-03-17 18:33:58 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:649897fff projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-649897fff-47dt2 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calidb2980fe4ac [] []}} ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-47dt2" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--47dt2-" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:29.543 [INFO][4220] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-47dt2" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:29.598 [INFO][4272] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" HandleID="k8s-pod-network.b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:29.655 [INFO][4272] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" HandleID="k8s-pod-network.b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00028d380), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-649897fff-47dt2", "timestamp":"2025-03-17 18:34:29.598329641 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:29.655 [INFO][4272] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.035 [INFO][4272] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.036 [INFO][4272] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.037 [INFO][4272] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" host="localhost" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.042 [INFO][4272] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.090 [INFO][4272] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.230 [INFO][4272] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.234 [INFO][4272] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.234 [INFO][4272] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" host="localhost" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.236 [INFO][4272] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804 Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.240 [INFO][4272] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" host="localhost" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.247 [INFO][4272] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 handle="k8s-pod-network.b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" host="localhost" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.247 [INFO][4272] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" host="localhost" Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.247 [INFO][4272] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 18:34:30.276705 env[1305]: 2025-03-17 18:34:30.247 [INFO][4272] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" HandleID="k8s-pod-network.b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:30.277348 env[1305]: 2025-03-17 18:34:30.250 [INFO][4220] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-47dt2" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--649897fff--47dt2-eth0", GenerateName:"calico-apiserver-649897fff-", Namespace:"calico-apiserver", SelfLink:"", UID:"9442466c-0eea-4718-be43-447a2a2a790c", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"649897fff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-649897fff-47dt2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb2980fe4ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:30.277348 env[1305]: 2025-03-17 18:34:30.250 [INFO][4220] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.133/32] ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-47dt2" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:30.277348 env[1305]: 2025-03-17 18:34:30.251 [INFO][4220] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidb2980fe4ac ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-47dt2" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:30.277348 env[1305]: 2025-03-17 18:34:30.265 [INFO][4220] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-47dt2" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:30.277348 env[1305]: 2025-03-17 18:34:30.265 [INFO][4220] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Namespace="calico-apiserver" 
Pod="calico-apiserver-649897fff-47dt2" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--649897fff--47dt2-eth0", GenerateName:"calico-apiserver-649897fff-", Namespace:"calico-apiserver", SelfLink:"", UID:"9442466c-0eea-4718-be43-447a2a2a790c", ResourceVersion:"964", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"649897fff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804", Pod:"calico-apiserver-649897fff-47dt2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb2980fe4ac", MAC:"46:4d:31:4d:03:30", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:30.277348 env[1305]: 2025-03-17 18:34:30.273 [INFO][4220] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804" Namespace="calico-apiserver" Pod="calico-apiserver-649897fff-47dt2" WorkloadEndpoint="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.233 [INFO][4312] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.233 [INFO][4312] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" iface="eth0" netns="/var/run/netns/cni-7754ebea-a681-b481-8b6f-ec2e6ede4c24" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.234 [INFO][4312] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" iface="eth0" netns="/var/run/netns/cni-7754ebea-a681-b481-8b6f-ec2e6ede4c24" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.234 [INFO][4312] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" iface="eth0" netns="/var/run/netns/cni-7754ebea-a681-b481-8b6f-ec2e6ede4c24" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.234 [INFO][4312] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.234 [INFO][4312] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.275 [INFO][4361] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" HandleID="k8s-pod-network.8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.275 [INFO][4361] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.275 [INFO][4361] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.279 [WARNING][4361] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" HandleID="k8s-pod-network.8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.280 [INFO][4361] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" HandleID="k8s-pod-network.8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.281 [INFO][4361] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:30.285082 env[1305]: 2025-03-17 18:34:30.283 [INFO][4312] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:30.285926 env[1305]: time="2025-03-17T18:34:30.285879452Z" level=info msg="TearDown network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\" successfully" Mar 17 18:34:30.286027 env[1305]: time="2025-03-17T18:34:30.286005679Z" level=info msg="StopPodSandbox for \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\" returns successfully" Mar 17 18:34:30.286829 env[1305]: time="2025-03-17T18:34:30.286809838Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7dwlz,Uid:72499494-46cf-4998-b0e6-cf96b6f788d0,Namespace:calico-system,Attempt:1,}" Mar 17 18:34:30.288062 env[1305]: time="2025-03-17T18:34:30.288032674Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7f5cc97b76-r9pvl,Uid:84b5b251-95c6-46de-a5be-62ebfcea6ad9,Namespace:calico-system,Attempt:1,} returns sandbox id \"d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65\"" Mar 17 18:34:30.297000 audit[4401]: NETFILTER_CFG table=filter:109 family=2 entries=46 op=nft_register_chain pid=4401 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:30.297000 audit[4401]: SYSCALL arch=c000003e syscall=46 success=yes exit=23892 a0=3 a1=7fffbf984c40 a2=0 a3=7fffbf984c2c items=0 ppid=3801 pid=4401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:30.297000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:30.303070 env[1305]: time="2025-03-17T18:34:30.299984751Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:34:30.303070 env[1305]: time="2025-03-17T18:34:30.300020548Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:34:30.303070 env[1305]: time="2025-03-17T18:34:30.300029905Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:34:30.303070 env[1305]: time="2025-03-17T18:34:30.300175078Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804 pid=4406 runtime=io.containerd.runc.v2 Mar 17 18:34:30.317741 systemd[1]: run-netns-cni\x2d7754ebea\x2da681\x2db481\x2d8b6f\x2dec2e6ede4c24.mount: Deactivated successfully. 
Mar 17 18:34:30.341718 systemd-resolved[1221]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 18:34:30.364388 env[1305]: time="2025-03-17T18:34:30.364347733Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-649897fff-47dt2,Uid:9442466c-0eea-4718-be43-447a2a2a790c,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804\"" Mar 17 18:34:30.409644 systemd-networkd[1078]: calif094777abf8: Link UP Mar 17 18:34:30.411236 systemd-networkd[1078]: calif094777abf8: Gained carrier Mar 17 18:34:30.411933 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): calif094777abf8: link becomes ready Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.344 [INFO][4423] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--7dwlz-eth0 csi-node-driver- calico-system 72499494-46cf-4998-b0e6-cf96b6f788d0 980 0 2025-03-17 18:33:58 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-7dwlz eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calif094777abf8 [] []}} ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Namespace="calico-system" Pod="csi-node-driver-7dwlz" WorkloadEndpoint="localhost-k8s-csi--node--driver--7dwlz-" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.345 [INFO][4423] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Namespace="calico-system" Pod="csi-node-driver-7dwlz" WorkloadEndpoint="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.374 [INFO][4444] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" HandleID="k8s-pod-network.c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.381 [INFO][4444] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" HandleID="k8s-pod-network.c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0004d6980), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-7dwlz", "timestamp":"2025-03-17 18:34:30.374113405 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.381 [INFO][4444] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.381 [INFO][4444] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.381 [INFO][4444] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.382 [INFO][4444] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" host="localhost" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.386 [INFO][4444] ipam/ipam.go 372: Looking up existing affinities for host host="localhost" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.389 [INFO][4444] ipam/ipam.go 489: Trying affinity for 192.168.88.128/26 host="localhost" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.391 [INFO][4444] ipam/ipam.go 155: Attempting to load block cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.392 [INFO][4444] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.392 [INFO][4444] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" host="localhost" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.393 [INFO][4444] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791 Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.399 [INFO][4444] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" host="localhost" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.405 [INFO][4444] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 handle="k8s-pod-network.c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" host="localhost" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.405 [INFO][4444] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" host="localhost" Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.405 [INFO][4444] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 18:34:30.421548 env[1305]: 2025-03-17 18:34:30.406 [INFO][4444] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" HandleID="k8s-pod-network.c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.422185 env[1305]: 2025-03-17 18:34:30.408 [INFO][4423] cni-plugin/k8s.go 386: Populated endpoint ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Namespace="calico-system" Pod="csi-node-driver-7dwlz" WorkloadEndpoint="localhost-k8s-csi--node--driver--7dwlz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7dwlz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"72499494-46cf-4998-b0e6-cf96b6f788d0", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-7dwlz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif094777abf8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:30.422185 env[1305]: 2025-03-17 18:34:30.408 [INFO][4423] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.88.134/32] ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Namespace="calico-system" Pod="csi-node-driver-7dwlz" WorkloadEndpoint="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.422185 env[1305]: 2025-03-17 18:34:30.408 [INFO][4423] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif094777abf8 ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Namespace="calico-system" Pod="csi-node-driver-7dwlz" WorkloadEndpoint="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.422185 env[1305]: 2025-03-17 18:34:30.411 [INFO][4423] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Namespace="calico-system" Pod="csi-node-driver-7dwlz" WorkloadEndpoint="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.422185 env[1305]: 2025-03-17 18:34:30.411 [INFO][4423] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Namespace="calico-system" Pod="csi-node-driver-7dwlz" WorkloadEndpoint="localhost-k8s-csi--node--driver--7dwlz-eth0" 
endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7dwlz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"72499494-46cf-4998-b0e6-cf96b6f788d0", ResourceVersion:"980", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791", Pod:"csi-node-driver-7dwlz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif094777abf8", MAC:"d6:dd:69:04:be:d5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:30.422185 env[1305]: 2025-03-17 18:34:30.419 [INFO][4423] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791" Namespace="calico-system" Pod="csi-node-driver-7dwlz" WorkloadEndpoint="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:30.430000 audit[4474]: NETFILTER_CFG table=filter:110 family=2 entries=50 op=nft_register_chain pid=4474 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:30.430000 audit[4474]: SYSCALL arch=c000003e syscall=46 success=yes exit=23392 a0=3 a1=7ffc0c922100 a2=0 a3=7ffc0c9220ec items=0 ppid=3801 pid=4474 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:30.430000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:30.439443 env[1305]: time="2025-03-17T18:34:30.439229030Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:34:30.439443 env[1305]: time="2025-03-17T18:34:30.439269657Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:34:30.439443 env[1305]: time="2025-03-17T18:34:30.439280257Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:34:30.440111 env[1305]: time="2025-03-17T18:34:30.440017250Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791 pid=4483 runtime=io.containerd.runc.v2 Mar 17 18:34:30.458808 systemd[1]: run-containerd-runc-k8s.io-c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791-runc.w0bWb2.mount: Deactivated successfully. Mar 17 18:34:30.469528 systemd-resolved[1221]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address Mar 17 18:34:30.479891 env[1305]: time="2025-03-17T18:34:30.479837692Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-7dwlz,Uid:72499494-46cf-4998-b0e6-cf96b6f788d0,Namespace:calico-system,Attempt:1,} returns sandbox id \"c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791\"" Mar 17 18:34:30.513056 systemd-networkd[1078]: cali969b63fcad9: Gained IPv6LL Mar 17 18:34:31.409110 systemd-networkd[1078]: calidb2980fe4ac: Gained IPv6LL Mar 17 18:34:31.729103 systemd-networkd[1078]: calica2756a831d: Gained IPv6LL Mar 17 18:34:32.113155 systemd-networkd[1078]: calif094777abf8: Gained IPv6LL Mar 17 18:34:33.554108 kernel: kauditd_printk_skb: 521 callbacks suppressed Mar 17 18:34:33.554259 kernel: audit: type=1130 audit(1742236473.552:459): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.0.12:22-10.0.0.1:51182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:33.552000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.0.12:22-10.0.0.1:51182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:33.552909 systemd[1]: Started sshd@14-10.0.0.12:22-10.0.0.1:51182.service. 
Mar 17 18:34:33.589000 audit[4522]: USER_ACCT pid=4522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.590586 sshd[4522]: Accepted publickey for core from 10.0.0.1 port 51182 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:33.593000 audit[4522]: CRED_ACQ pid=4522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.594944 kernel: audit: type=1101 audit(1742236473.589:460): pid=4522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.595007 kernel: audit: type=1103 audit(1742236473.593:461): pid=4522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.595197 sshd[4522]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:33.608772 kernel: audit: type=1006 audit(1742236473.593:462): pid=4522 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Mar 17 18:34:33.608898 kernel: audit: type=1300 audit(1742236473.593:462): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc526497f0 a2=3 a3=0 items=0 ppid=1 pid=4522 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:33.608968 kernel: audit: type=1327 audit(1742236473.593:462): proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:33.593000 audit[4522]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc526497f0 a2=3 a3=0 items=0 ppid=1 pid=4522 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:33.593000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:33.600697 systemd-logind[1287]: New session 15 of user core. Mar 17 18:34:33.601392 systemd[1]: Started session-15.scope. 
Mar 17 18:34:33.607000 audit[4522]: USER_START pid=4522 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.618719 kernel: audit: type=1105 audit(1742236473.607:463): pid=4522 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.618793 kernel: audit: type=1103 audit(1742236473.609:464): pid=4525 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.609000 audit[4525]: CRED_ACQ pid=4525 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.719828 sshd[4522]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:33.728673 kernel: audit: type=1106 audit(1742236473.719:465): pid=4522 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.728820 kernel: audit: type=1104 audit(1742236473.720:466): pid=4522 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.719000 audit[4522]: USER_END pid=4522 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.720000 audit[4522]: CRED_DISP pid=4522 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:33.722654 systemd[1]: sshd@14-10.0.0.12:22-10.0.0.1:51182.service: Deactivated successfully. Mar 17 18:34:33.723672 systemd[1]: session-15.scope: Deactivated successfully. Mar 17 18:34:33.723716 systemd-logind[1287]: Session 15 logged out. Waiting for processes to exit. Mar 17 18:34:33.724711 systemd-logind[1287]: Removed session 15. Mar 17 18:34:33.722000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.0.12:22-10.0.0.1:51182 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:34.402056 env[1305]: time="2025-03-17T18:34:34.401996746Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:34.403823 env[1305]: time="2025-03-17T18:34:34.403786265Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:34.405315 env[1305]: time="2025-03-17T18:34:34.405267134Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:34.406630 env[1305]: time="2025-03-17T18:34:34.406589646Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:34.407136 env[1305]: time="2025-03-17T18:34:34.407106195Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Mar 17 18:34:34.408199 env[1305]: time="2025-03-17T18:34:34.408174981Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Mar 17 18:34:34.409266 env[1305]: time="2025-03-17T18:34:34.409237635Z" level=info msg="CreateContainer within sandbox \"76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 18:34:34.421754 env[1305]: time="2025-03-17T18:34:34.421708651Z" level=info msg="CreateContainer within sandbox \"76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"d201ef3e835eaf498ae28f3dd9381b9ee4e402339581e613ffae150ec94b9377\"" Mar 17 18:34:34.422447 env[1305]: time="2025-03-17T18:34:34.422411079Z" level=info msg="StartContainer for \"d201ef3e835eaf498ae28f3dd9381b9ee4e402339581e613ffae150ec94b9377\"" Mar 17 18:34:34.476049 env[1305]: time="2025-03-17T18:34:34.476006156Z" level=info msg="StartContainer for \"d201ef3e835eaf498ae28f3dd9381b9ee4e402339581e613ffae150ec94b9377\" returns successfully" Mar 17 18:34:35.422000 audit[4576]: NETFILTER_CFG table=filter:111 family=2 entries=10 op=nft_register_rule pid=4576 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:35.422000 audit[4576]: SYSCALL arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7fffeeb844b0 a2=0 a3=7fffeeb8449c items=0 ppid=2398 pid=4576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:35.422000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:35.427000 audit[4576]: NETFILTER_CFG table=nat:112 family=2 entries=20 op=nft_register_rule pid=4576 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:35.427000 audit[4576]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fffeeb844b0 a2=0 a3=7fffeeb8449c items=0 ppid=2398 pid=4576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:35.427000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:36.072829 kubelet[2208]: I0317 18:34:36.072751 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-649897fff-xtjrl" podStartSLOduration=33.290741323 podStartE2EDuration="38.072715352s" podCreationTimestamp="2025-03-17 18:33:58 +0000 UTC" firstStartedPulling="2025-03-17 18:34:29.626030503 +0000 UTC m=+51.663620113" lastFinishedPulling="2025-03-17 18:34:34.408004512 +0000 UTC m=+56.445594142" observedRunningTime="2025-03-17 18:34:35.348709299 +0000 UTC m=+57.386298909" watchObservedRunningTime="2025-03-17 18:34:36.072715352 +0000 UTC m=+58.110304962" Mar 17 18:34:36.091000 audit[4580]: NETFILTER_CFG table=filter:113 family=2 entries=9 op=nft_register_rule pid=4580 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:36.091000 audit[4580]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffd53ba1780 a2=0 a3=7ffd53ba176c items=0 ppid=2398 pid=4580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:36.091000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:36.097000 audit[4580]: NETFILTER_CFG table=nat:114 family=2 entries=27 op=nft_register_chain pid=4580 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:36.097000 audit[4580]: SYSCALL arch=c000003e syscall=46 success=yes exit=9348 a0=3 a1=7ffd53ba1780 a2=0 a3=7ffd53ba176c items=0 ppid=2398 pid=4580 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:36.097000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:38.055283 env[1305]: time="2025-03-17T18:34:38.055235867Z" level=info msg="StopPodSandbox for \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\"" Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.121 [WARNING][4608] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7b08c7f0-5fd2-4720-bc11-16f99c67d9d2", ResourceVersion:"927", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272", Pod:"coredns-7db6d8ff4d-z2tns", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1db9154d887", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.121 [INFO][4608] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.121 [INFO][4608] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" iface="eth0" netns="" Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.121 [INFO][4608] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.121 [INFO][4608] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.139 [INFO][4616] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" HandleID="k8s-pod-network.e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.139 [INFO][4616] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.139 [INFO][4616] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.144 [WARNING][4616] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" HandleID="k8s-pod-network.e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.144 [INFO][4616] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" HandleID="k8s-pod-network.e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.145 [INFO][4616] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:38.148816 env[1305]: 2025-03-17 18:34:38.147 [INFO][4608] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:38.149460 env[1305]: time="2025-03-17T18:34:38.149402821Z" level=info msg="TearDown network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\" successfully" Mar 17 18:34:38.149460 env[1305]: time="2025-03-17T18:34:38.149445191Z" level=info msg="StopPodSandbox for \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\" returns successfully" Mar 17 18:34:38.150204 env[1305]: time="2025-03-17T18:34:38.150161782Z" level=info msg="RemovePodSandbox for \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\"" Mar 17 18:34:38.150265 env[1305]: time="2025-03-17T18:34:38.150209842Z" level=info msg="Forcibly stopping sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\"" Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.187 [WARNING][4641] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7b08c7f0-5fd2-4720-bc11-16f99c67d9d2", ResourceVersion:"927", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"188bf2b2f95a1379cc0533a120c6eb495a8aa5304d49078947edaa88cbf07272", Pod:"coredns-7db6d8ff4d-z2tns", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali1db9154d887", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.187 [INFO][4641] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.187 [INFO][4641] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" iface="eth0" netns="" Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.187 [INFO][4641] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.187 [INFO][4641] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.213 [INFO][4648] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" HandleID="k8s-pod-network.e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.213 [INFO][4648] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.213 [INFO][4648] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.220 [WARNING][4648] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" HandleID="k8s-pod-network.e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.220 [INFO][4648] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" HandleID="k8s-pod-network.e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Workload="localhost-k8s-coredns--7db6d8ff4d--z2tns-eth0" Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.222 [INFO][4648] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:38.226032 env[1305]: 2025-03-17 18:34:38.224 [INFO][4641] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00" Mar 17 18:34:38.226498 env[1305]: time="2025-03-17T18:34:38.226077182Z" level=info msg="TearDown network for sandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\" successfully" Mar 17 18:34:38.279192 env[1305]: time="2025-03-17T18:34:38.279113400Z" level=info msg="RemovePodSandbox \"e42baa19de13de1aba8196356410d1ea04d485ab9f0ef64dca8c437584de5b00\" returns successfully" Mar 17 18:34:38.279691 env[1305]: time="2025-03-17T18:34:38.279651805Z" level=info msg="StopPodSandbox for \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\"" Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.310 [WARNING][4672] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7dwlz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"72499494-46cf-4998-b0e6-cf96b6f788d0", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791", Pod:"csi-node-driver-7dwlz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif094777abf8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.310 [INFO][4672] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.310 [INFO][4672] 
cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" iface="eth0" netns="" Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.310 [INFO][4672] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.310 [INFO][4672] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.336 [INFO][4680] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" HandleID="k8s-pod-network.8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.336 [INFO][4680] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.336 [INFO][4680] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.341 [WARNING][4680] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" HandleID="k8s-pod-network.8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.341 [INFO][4680] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" HandleID="k8s-pod-network.8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.343 [INFO][4680] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:38.346823 env[1305]: 2025-03-17 18:34:38.345 [INFO][4672] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:38.347351 env[1305]: time="2025-03-17T18:34:38.346861392Z" level=info msg="TearDown network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\" successfully" Mar 17 18:34:38.347351 env[1305]: time="2025-03-17T18:34:38.346894836Z" level=info msg="StopPodSandbox for \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\" returns successfully" Mar 17 18:34:38.347687 env[1305]: time="2025-03-17T18:34:38.347635272Z" level=info msg="RemovePodSandbox for \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\"" Mar 17 18:34:38.347884 env[1305]: time="2025-03-17T18:34:38.347682039Z" level=info msg="Forcibly stopping sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\"" Mar 17 18:34:38.350495 env[1305]: time="2025-03-17T18:34:38.350469375Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:38.352805 env[1305]: time="2025-03-17T18:34:38.352766666Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:38.354454 env[1305]: time="2025-03-17T18:34:38.354409933Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:38.356034 env[1305]: time="2025-03-17T18:34:38.355966909Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:38.356432 env[1305]: time="2025-03-17T18:34:38.356406448Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Mar 17 18:34:38.357617 env[1305]: time="2025-03-17T18:34:38.357597403Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Mar 17 18:34:38.371328 env[1305]: time="2025-03-17T18:34:38.371283715Z" level=info msg="CreateContainer within sandbox \"d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.415 [WARNING][4702] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--7dwlz-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"72499494-46cf-4998-b0e6-cf96b6f788d0", ResourceVersion:"989", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791", Pod:"csi-node-driver-7dwlz", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calif094777abf8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.415 [INFO][4702] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.415 [INFO][4702] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" iface="eth0" netns="" Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.415 [INFO][4702] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.415 [INFO][4702] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.434 [INFO][4711] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" HandleID="k8s-pod-network.8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.434 [INFO][4711] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.434 [INFO][4711] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.439 [WARNING][4711] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" HandleID="k8s-pod-network.8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.439 [INFO][4711] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" HandleID="k8s-pod-network.8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Workload="localhost-k8s-csi--node--driver--7dwlz-eth0" Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.440 [INFO][4711] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:38.443364 env[1305]: 2025-03-17 18:34:38.441 [INFO][4702] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447" Mar 17 18:34:38.443882 env[1305]: time="2025-03-17T18:34:38.443394703Z" level=info msg="TearDown network for sandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\" successfully" Mar 17 18:34:38.723277 systemd[1]: Started sshd@15-10.0.0.12:22-10.0.0.1:44220.service. Mar 17 18:34:38.722000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.0.12:22-10.0.0.1:44220 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:38.724423 kernel: kauditd_printk_skb: 13 callbacks suppressed Mar 17 18:34:38.724505 kernel: audit: type=1130 audit(1742236478.722:472): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.0.12:22-10.0.0.1:44220 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:38.758000 audit[4719]: USER_ACCT pid=4719 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.759406 sshd[4719]: Accepted publickey for core from 10.0.0.1 port 44220 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:38.774869 sshd[4719]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:38.759000 audit[4719]: CRED_ACQ pid=4719 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.779161 systemd-logind[1287]: New session 16 of user core. 
Mar 17 18:34:38.782028 kernel: audit: type=1101 audit(1742236478.758:473): pid=4719 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.782061 kernel: audit: type=1103 audit(1742236478.759:474): pid=4719 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.782081 kernel: audit: type=1006 audit(1742236478.759:475): pid=4719 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Mar 17 18:34:38.779861 systemd[1]: Started session-16.scope. Mar 17 18:34:38.759000 audit[4719]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe4e2572c0 a2=3 a3=0 items=0 ppid=1 pid=4719 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:38.788270 kernel: audit: type=1300 audit(1742236478.759:475): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe4e2572c0 a2=3 a3=0 items=0 ppid=1 pid=4719 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:38.788329 kernel: audit: type=1327 audit(1742236478.759:475): proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:38.759000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:38.789560 kernel: audit: type=1105 audit(1742236478.784:476): pid=4719 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.784000 audit[4719]: USER_START pid=4719 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.793716 kernel: audit: type=1103 audit(1742236478.785:477): pid=4722 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.785000 audit[4722]: CRED_ACQ pid=4722 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.887084 sshd[4719]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:38.887000 audit[4719]: USER_END pid=4719 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.887000 audit[4719]: CRED_DISP pid=4719 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.889599 systemd[1]: sshd@15-10.0.0.12:22-10.0.0.1:44220.service: Deactivated successfully. Mar 17 18:34:38.890615 systemd[1]: session-16.scope: Deactivated successfully. Mar 17 18:34:38.891163 systemd-logind[1287]: Session 16 logged out. Waiting for processes to exit. Mar 17 18:34:38.892032 systemd-logind[1287]: Removed session 16. Mar 17 18:34:38.896202 kernel: audit: type=1106 audit(1742236478.887:478): pid=4719 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.896261 kernel: audit: type=1104 audit(1742236478.887:479): pid=4719 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:38.887000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.0.12:22-10.0.0.1:44220 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:39.084686 env[1305]: time="2025-03-17T18:34:39.084636097Z" level=info msg="RemovePodSandbox \"8c7dd3abde9bf0894f397eecc731465e6760c4a8b2c9bd3125877b1a9f476447\" returns successfully" Mar 17 18:34:39.085283 env[1305]: time="2025-03-17T18:34:39.085232443Z" level=info msg="StopPodSandbox for \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\"" Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.116 [WARNING][4748] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6293eeb9-1677-4d5c-b023-b36c91a971a4", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303", Pod:"coredns-7db6d8ff4d-m5vrn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie1360bac6e4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.117 [INFO][4748] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.117 [INFO][4748] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" iface="eth0" netns="" Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.117 [INFO][4748] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.117 [INFO][4748] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.134 [INFO][4755] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" HandleID="k8s-pod-network.5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.135 [INFO][4755] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.135 [INFO][4755] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.140 [WARNING][4755] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" HandleID="k8s-pod-network.5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.140 [INFO][4755] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" HandleID="k8s-pod-network.5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.142 [INFO][4755] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:39.145608 env[1305]: 2025-03-17 18:34:39.144 [INFO][4748] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:39.146109 env[1305]: time="2025-03-17T18:34:39.145614034Z" level=info msg="TearDown network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\" successfully" Mar 17 18:34:39.146109 env[1305]: time="2025-03-17T18:34:39.145647379Z" level=info msg="StopPodSandbox for \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\" returns successfully" Mar 17 18:34:39.146231 env[1305]: time="2025-03-17T18:34:39.146205560Z" level=info msg="RemovePodSandbox for \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\"" Mar 17 18:34:39.146285 env[1305]: time="2025-03-17T18:34:39.146234366Z" level=info msg="Forcibly stopping sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\"" Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.176 [WARNING][4778] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"6293eeb9-1677-4d5c-b023-b36c91a971a4", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"47f951943e1c86947cc41ff2ad66fbbb6df9376b345feff2bb51876401094303", Pod:"coredns-7db6d8ff4d-m5vrn", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie1360bac6e4", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.176 [INFO][4778] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.176 [INFO][4778] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" iface="eth0" netns="" Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.176 [INFO][4778] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.176 [INFO][4778] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.197 [INFO][4785] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" HandleID="k8s-pod-network.5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.197 [INFO][4785] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.197 [INFO][4785] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.202 [WARNING][4785] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" HandleID="k8s-pod-network.5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.202 [INFO][4785] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" HandleID="k8s-pod-network.5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Workload="localhost-k8s-coredns--7db6d8ff4d--m5vrn-eth0" Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.204 [INFO][4785] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:39.208214 env[1305]: 2025-03-17 18:34:39.205 [INFO][4778] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4" Mar 17 18:34:39.208733 env[1305]: time="2025-03-17T18:34:39.208226897Z" level=info msg="TearDown network for sandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\" successfully" Mar 17 18:34:39.218455 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount970568440.mount: Deactivated successfully. Mar 17 18:34:39.226554 env[1305]: time="2025-03-17T18:34:39.226503501Z" level=info msg="CreateContainer within sandbox \"d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6\"" Mar 17 18:34:39.227184 env[1305]: time="2025-03-17T18:34:39.227144564Z" level=info msg="StartContainer for \"209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6\"" Mar 17 18:34:39.227297 env[1305]: time="2025-03-17T18:34:39.227265259Z" level=info msg="RemovePodSandbox \"5a0af5357550f990b1bd9fdae846e24bccd7264e4157c8fe6cfaf5b3512b6aa4\" returns successfully" Mar 17 18:34:39.227808 env[1305]: time="2025-03-17T18:34:39.227756900Z" level=info msg="StopPodSandbox for \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\"" Mar 17 18:34:39.240332 env[1305]: time="2025-03-17T18:34:39.240282373Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:39.249776 env[1305]: time="2025-03-17T18:34:39.249713035Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:39.253231 env[1305]: time="2025-03-17T18:34:39.251967522Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:39.258273 env[1305]: time="2025-03-17T18:34:39.258192951Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:39.258430 env[1305]: time="2025-03-17T18:34:39.258401999Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Mar 17 18:34:39.262082 env[1305]: time="2025-03-17T18:34:39.262038788Z" 
level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Mar 17 18:34:39.264230 env[1305]: time="2025-03-17T18:34:39.264186154Z" level=info msg="CreateContainer within sandbox \"b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 18:34:39.276115 env[1305]: time="2025-03-17T18:34:39.276069582Z" level=info msg="CreateContainer within sandbox \"b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"0f446cf291e579d15969ce7e629fcc61a1253aaf2536fc5349306e65283b3651\"" Mar 17 18:34:39.276614 env[1305]: time="2025-03-17T18:34:39.276576082Z" level=info msg="StartContainer for \"0f446cf291e579d15969ce7e629fcc61a1253aaf2536fc5349306e65283b3651\"" Mar 17 18:34:39.290721 env[1305]: time="2025-03-17T18:34:39.290660672Z" level=info msg="StartContainer for \"209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6\" returns successfully" Mar 17 18:34:39.410639 env[1305]: time="2025-03-17T18:34:39.409570893Z" level=info msg="StartContainer for \"0f446cf291e579d15969ce7e629fcc61a1253aaf2536fc5349306e65283b3651\" returns successfully" Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.288 [WARNING][4816] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--649897fff--47dt2-eth0", GenerateName:"calico-apiserver-649897fff-", Namespace:"calico-apiserver", SelfLink:"", UID:"9442466c-0eea-4718-be43-447a2a2a790c", ResourceVersion:"984", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"649897fff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804", Pod:"calico-apiserver-649897fff-47dt2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb2980fe4ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.288 [INFO][4816] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.288 [INFO][4816] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" iface="eth0" netns="" Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.288 [INFO][4816] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.288 [INFO][4816] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.376 [INFO][4858] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" HandleID="k8s-pod-network.e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.376 [INFO][4858] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.376 [INFO][4858] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.411 [WARNING][4858] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" HandleID="k8s-pod-network.e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.411 [INFO][4858] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" HandleID="k8s-pod-network.e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.413 [INFO][4858] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:39.417486 env[1305]: 2025-03-17 18:34:39.414 [INFO][4816] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:39.418093 env[1305]: time="2025-03-17T18:34:39.418056760Z" level=info msg="TearDown network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\" successfully" Mar 17 18:34:39.418274 env[1305]: time="2025-03-17T18:34:39.418185892Z" level=info msg="StopPodSandbox for \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\" returns successfully" Mar 17 18:34:39.418961 env[1305]: time="2025-03-17T18:34:39.418923713Z" level=info msg="RemovePodSandbox for \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\"" Mar 17 18:34:39.419029 env[1305]: time="2025-03-17T18:34:39.418962078Z" level=info msg="Forcibly stopping sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\"" Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.615 [WARNING][4908] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--649897fff--47dt2-eth0", GenerateName:"calico-apiserver-649897fff-", Namespace:"calico-apiserver", SelfLink:"", UID:"9442466c-0eea-4718-be43-447a2a2a790c", ResourceVersion:"984", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"649897fff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"b8c2138d0eb1210bba822323d73a0bd8975ef24b5c0117945d45deeacef34804", Pod:"calico-apiserver-649897fff-47dt2", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calidb2980fe4ac", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.615 [INFO][4908] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.615 [INFO][4908] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" iface="eth0" netns="" Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.615 [INFO][4908] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.615 [INFO][4908] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.634 [INFO][4919] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" HandleID="k8s-pod-network.e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.634 [INFO][4919] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.634 [INFO][4919] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.640 [WARNING][4919] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" HandleID="k8s-pod-network.e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.640 [INFO][4919] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" HandleID="k8s-pod-network.e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Workload="localhost-k8s-calico--apiserver--649897fff--47dt2-eth0" Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.641 [INFO][4919] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:39.645169 env[1305]: 2025-03-17 18:34:39.643 [INFO][4908] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e" Mar 17 18:34:39.645740 env[1305]: time="2025-03-17T18:34:39.645192718Z" level=info msg="TearDown network for sandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\" successfully" Mar 17 18:34:39.734215 env[1305]: time="2025-03-17T18:34:39.734092461Z" level=info msg="RemovePodSandbox \"e95414d18a548871e6bb875d970c53efac832208db4d67422962848cb843d78e\" returns successfully" Mar 17 18:34:39.734640 env[1305]: time="2025-03-17T18:34:39.734601787Z" level=info msg="StopPodSandbox for \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\"" Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.768 [WARNING][4941] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0", GenerateName:"calico-kube-controllers-7f5cc97b76-", Namespace:"calico-system", SelfLink:"", UID:"84b5b251-95c6-46de-a5be-62ebfcea6ad9", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f5cc97b76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65", Pod:"calico-kube-controllers-7f5cc97b76-r9pvl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calica2756a831d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.769 [INFO][4941] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:39.804835 env[1305]: 
2025-03-17 18:34:39.769 [INFO][4941] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" iface="eth0" netns="" Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.769 [INFO][4941] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.769 [INFO][4941] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.792 [INFO][4948] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" HandleID="k8s-pod-network.a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.793 [INFO][4948] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.793 [INFO][4948] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.798 [WARNING][4948] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" HandleID="k8s-pod-network.a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.798 [INFO][4948] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" HandleID="k8s-pod-network.a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.800 [INFO][4948] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:39.804835 env[1305]: 2025-03-17 18:34:39.801 [INFO][4941] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:39.805487 env[1305]: time="2025-03-17T18:34:39.805398125Z" level=info msg="TearDown network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\" successfully" Mar 17 18:34:39.805487 env[1305]: time="2025-03-17T18:34:39.805446649Z" level=info msg="StopPodSandbox for \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\" returns successfully" Mar 17 18:34:39.805944 env[1305]: time="2025-03-17T18:34:39.805908543Z" level=info msg="RemovePodSandbox for \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\"" Mar 17 18:34:39.806029 env[1305]: time="2025-03-17T18:34:39.805949653Z" level=info msg="Forcibly stopping sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\"" Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.840 [WARNING][4971] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0", GenerateName:"calico-kube-controllers-7f5cc97b76-", Namespace:"calico-system", SelfLink:"", UID:"84b5b251-95c6-46de-a5be-62ebfcea6ad9", ResourceVersion:"977", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7f5cc97b76", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65", Pod:"calico-kube-controllers-7f5cc97b76-r9pvl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calica2756a831d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.840 [INFO][4971] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.840 [INFO][4971] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" iface="eth0" netns="" Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.840 [INFO][4971] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.841 [INFO][4971] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.864 [INFO][4978] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" HandleID="k8s-pod-network.a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.864 [INFO][4978] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.864 [INFO][4978] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.870 [WARNING][4978] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" HandleID="k8s-pod-network.a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.870 [INFO][4978] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" HandleID="k8s-pod-network.a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.873 [INFO][4978] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:39.876552 env[1305]: 2025-03-17 18:34:39.874 [INFO][4971] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381" Mar 17 18:34:39.877073 env[1305]: time="2025-03-17T18:34:39.876581310Z" level=info msg="TearDown network for sandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\" successfully" Mar 17 18:34:39.880085 env[1305]: time="2025-03-17T18:34:39.880058717Z" level=info msg="RemovePodSandbox \"a2c824457347d32371d557c0d73012456bfcea9d2ca4848bbb8316b445da8381\" returns successfully" Mar 17 18:34:39.880597 env[1305]: time="2025-03-17T18:34:39.880563583Z" level=info msg="StopPodSandbox for \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\"" Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.912 [WARNING][5001] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0", GenerateName:"calico-apiserver-649897fff-", Namespace:"calico-apiserver", SelfLink:"", UID:"78d53884-0ccd-4a74-9597-a16b3590ffd8", ResourceVersion:"1008", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"649897fff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c", Pod:"calico-apiserver-649897fff-xtjrl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali969b63fcad9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.913 [INFO][5001] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:39.944866 env[1305]: 2025-03-17 
18:34:39.913 [INFO][5001] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" iface="eth0" netns="" Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.913 [INFO][5001] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.913 [INFO][5001] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.933 [INFO][5009] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" HandleID="k8s-pod-network.edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.933 [INFO][5009] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.933 [INFO][5009] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.939 [WARNING][5009] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" HandleID="k8s-pod-network.edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.939 [INFO][5009] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" HandleID="k8s-pod-network.edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.942 [INFO][5009] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:39.944866 env[1305]: 2025-03-17 18:34:39.943 [INFO][5001] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:39.945663 env[1305]: time="2025-03-17T18:34:39.945574764Z" level=info msg="TearDown network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\" successfully" Mar 17 18:34:39.945663 env[1305]: time="2025-03-17T18:34:39.945624541Z" level=info msg="StopPodSandbox for \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\" returns successfully" Mar 17 18:34:39.946283 env[1305]: time="2025-03-17T18:34:39.946243100Z" level=info msg="RemovePodSandbox for \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\"" Mar 17 18:34:39.946427 env[1305]: time="2025-03-17T18:34:39.946376079Z" level=info msg="Forcibly stopping sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\"" Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:39.984 [WARNING][5031] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0", GenerateName:"calico-apiserver-649897fff-", Namespace:"calico-apiserver", SelfLink:"", UID:"78d53884-0ccd-4a74-9597-a16b3590ffd8", ResourceVersion:"1008", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 33, 58, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"649897fff", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"76bd4dfeb9f5b9bfe684bc5058e7bcaa195594c0acef7928ea80464db4dc743c", Pod:"calico-apiserver-649897fff-xtjrl", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali969b63fcad9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:39.985 [INFO][5031] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:39.985 [INFO][5031] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" iface="eth0" netns="" Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:39.985 [INFO][5031] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:39.985 [INFO][5031] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:40.011 [INFO][5038] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" HandleID="k8s-pod-network.edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:40.011 [INFO][5038] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:40.011 [INFO][5038] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:40.017 [WARNING][5038] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" HandleID="k8s-pod-network.edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:40.017 [INFO][5038] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" HandleID="k8s-pod-network.edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Workload="localhost-k8s-calico--apiserver--649897fff--xtjrl-eth0" Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:40.018 [INFO][5038] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:40.021796 env[1305]: 2025-03-17 18:34:40.020 [INFO][5031] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2" Mar 17 18:34:40.021796 env[1305]: time="2025-03-17T18:34:40.021727518Z" level=info msg="TearDown network for sandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\" successfully" Mar 17 18:34:40.025179 env[1305]: time="2025-03-17T18:34:40.025139219Z" level=info msg="RemovePodSandbox \"edcced1347a8ec264dc599dfa844621bcd0597d56ff29ba8a43aafd34d337bf2\" returns successfully" Mar 17 18:34:40.391000 audit[5067]: NETFILTER_CFG table=filter:115 family=2 entries=8 op=nft_register_rule pid=5067 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:40.391000 audit[5067]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffc760e8320 a2=0 a3=7ffc760e830c items=0 ppid=2398 pid=5067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:40.391000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:40.395000 audit[5067]: NETFILTER_CFG table=nat:116 family=2 entries=30 op=nft_register_rule pid=5067 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:40.395000 audit[5067]: SYSCALL arch=c000003e syscall=46 success=yes exit=9348 a0=3 a1=7ffc760e8320 a2=0 a3=7ffc760e830c items=0 ppid=2398 pid=5067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:40.395000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:40.455570 kubelet[2208]: I0317 18:34:40.455504 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-649897fff-47dt2" podStartSLOduration=33.561942368 podStartE2EDuration="42.455481036s" podCreationTimestamp="2025-03-17 18:33:58 +0000 UTC" firstStartedPulling="2025-03-17 18:34:30.366427097 +0000 UTC m=+52.404016707" lastFinishedPulling="2025-03-17 18:34:39.259965765 +0000 UTC m=+61.297555375" observedRunningTime="2025-03-17 18:34:40.337148554 +0000 UTC m=+62.374738165" watchObservedRunningTime="2025-03-17 18:34:40.455481036 +0000 UTC m=+62.493070646" Mar 17 18:34:41.279883 kubelet[2208]: I0317 18:34:41.279846 2208 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 18:34:41.758379 env[1305]: 
time="2025-03-17T18:34:41.758321798Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:41.760246 env[1305]: time="2025-03-17T18:34:41.760212505Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:41.761817 env[1305]: time="2025-03-17T18:34:41.761778419Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/csi:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:41.763330 env[1305]: time="2025-03-17T18:34:41.763300447Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:41.763766 env[1305]: time="2025-03-17T18:34:41.763729525Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Mar 17 18:34:41.766169 env[1305]: time="2025-03-17T18:34:41.766141338Z" level=info msg="CreateContainer within sandbox \"c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 18:34:41.778519 env[1305]: time="2025-03-17T18:34:41.778468149Z" level=info msg="CreateContainer within sandbox \"c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"0e9583b93a9991c5f687c86de8f961d9c464bbb72793920b0f79ae3c092c4fcc\"" Mar 17 18:34:41.779413 env[1305]: time="2025-03-17T18:34:41.778951842Z" level=info msg="StartContainer for \"0e9583b93a9991c5f687c86de8f961d9c464bbb72793920b0f79ae3c092c4fcc\"" Mar 17 18:34:41.942565 env[1305]: time="2025-03-17T18:34:41.942477028Z" level=info msg="StartContainer for \"0e9583b93a9991c5f687c86de8f961d9c464bbb72793920b0f79ae3c092c4fcc\" returns successfully" Mar 17 18:34:41.944228 env[1305]: time="2025-03-17T18:34:41.944196963Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Mar 17 18:34:43.890000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.0.12:22-10.0.0.1:40526 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:43.891440 systemd[1]: Started sshd@16-10.0.0.12:22-10.0.0.1:40526.service. Mar 17 18:34:43.892833 kernel: kauditd_printk_skb: 7 callbacks suppressed Mar 17 18:34:43.892910 kernel: audit: type=1130 audit(1742236483.890:483): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.0.12:22-10.0.0.1:40526 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:43.926000 audit[5104]: USER_ACCT pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:43.927781 sshd[5104]: Accepted publickey for core from 10.0.0.1 port 40526 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:43.931361 sshd[5104]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:43.932557 kernel: audit: type=1101 audit(1742236483.926:484): pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:43.932653 kernel: audit: type=1103 audit(1742236483.929:485): pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:43.929000 audit[5104]: CRED_ACQ pid=5104 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:43.935812 systemd-logind[1287]: New session 17 of user core. Mar 17 18:34:43.939393 kernel: audit: type=1006 audit(1742236483.930:486): pid=5104 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Mar 17 18:34:43.937112 systemd[1]: Started session-17.scope. 
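The audit records above embed their own clock: in audit(1742236483.926:484) the first field is a Unix epoch timestamp (seconds.milliseconds) and the second is a per-record serial, so 1742236483.926 lines up with the journal's Mar 17 18:34:43.926 prefix. A minimal Go sketch of that conversion, standard library only; the helper name is illustrative and not part of any audit tooling:

package main

import (
	"fmt"
	"time"
)

// auditTime converts the seconds.milliseconds portion of an audit(...) token,
// e.g. "1742236483.926" from audit(1742236483.926:484), into a time.Time.
func auditTime(stamp string) (time.Time, error) {
	var sec, ms int64
	if _, err := fmt.Sscanf(stamp, "%d.%d", &sec, &ms); err != nil {
		return time.Time{}, err
	}
	return time.Unix(sec, ms*int64(time.Millisecond)).UTC(), nil
}

func main() {
	t, _ := auditTime("1742236483.926")
	fmt.Println(t) // 2025-03-17 18:34:43.926 +0000 UTC
}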
Mar 17 18:34:43.944447 kernel: audit: type=1300 audit(1742236483.930:486): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6736cff0 a2=3 a3=0 items=0 ppid=1 pid=5104 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:43.930000 audit[5104]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd6736cff0 a2=3 a3=0 items=0 ppid=1 pid=5104 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:43.946209 kernel: audit: type=1327 audit(1742236483.930:486): proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:43.930000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:43.942000 audit[5104]: USER_START pid=5104 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:43.950574 kernel: audit: type=1105 audit(1742236483.942:487): pid=5104 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:43.950630 kernel: audit: type=1103 audit(1742236483.943:488): pid=5107 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:43.943000 audit[5107]: CRED_ACQ pid=5107 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:43.989577 systemd[1]: run-containerd-runc-k8s.io-209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6-runc.ACrusJ.mount: Deactivated successfully. 
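The PROCTITLE records above store the command line hex-encoded, with NUL bytes separating argv elements; 737368643A20636F7265205B707269765D decodes to "sshd: core [priv]", and the longer iptables-restore values elsewhere in this log decode the same way. A small decoding sketch in Go (assumed helper, standard library only):

package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle turns an audit PROCTITLE hex value into a readable command
// line; NUL separators between argv elements are shown as spaces.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	return strings.ReplaceAll(string(raw), "\x00", " "), nil
}

func main() {
	for _, h := range []string{
		"737368643A20636F7265205B707269765D",
		"69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273",
	} {
		if s, err := decodeProctitle(h); err == nil {
			fmt.Println(s)
		}
	}
	// Output:
	// sshd: core [priv]
	// iptables-restore -w 5 -W 100000 --noflush --counters
}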
Mar 17 18:34:44.103148 env[1305]: time="2025-03-17T18:34:44.103087161Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:44.105430 env[1305]: time="2025-03-17T18:34:44.105380250Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:44.107091 env[1305]: time="2025-03-17T18:34:44.107063683Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:44.115216 sshd[5104]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:44.115000 audit[5104]: USER_END pid=5104 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:44.117794 systemd[1]: sshd@16-10.0.0.12:22-10.0.0.1:40526.service: Deactivated successfully. Mar 17 18:34:44.119100 systemd[1]: session-17.scope: Deactivated successfully. Mar 17 18:34:44.119181 systemd-logind[1287]: Session 17 logged out. Waiting for processes to exit. Mar 17 18:34:44.125108 kernel: audit: type=1106 audit(1742236484.115:489): pid=5104 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:44.125266 kernel: audit: type=1104 audit(1742236484.115:490): pid=5104 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:44.115000 audit[5104]: CRED_DISP pid=5104 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:44.125380 env[1305]: time="2025-03-17T18:34:44.122290016Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:34:44.125380 env[1305]: time="2025-03-17T18:34:44.122700274Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Mar 17 18:34:44.121868 systemd-logind[1287]: Removed session 17. Mar 17 18:34:44.117000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.0.12:22-10.0.0.1:40526 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:44.126401 env[1305]: time="2025-03-17T18:34:44.126364538Z" level=info msg="CreateContainer within sandbox \"c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 18:34:44.161294 env[1305]: time="2025-03-17T18:34:44.161169017Z" level=info msg="CreateContainer within sandbox \"c12e46eb7f6ef5ddb56ec89a47c845d5ae28a2f652cea8873bcfb3f927d92791\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"be4792af0957fd6d7a8e408245a6c8a0505accb63b5cd5757af4a611ab27ee1c\"" Mar 17 18:34:44.161795 env[1305]: time="2025-03-17T18:34:44.161765727Z" level=info msg="StartContainer for \"be4792af0957fd6d7a8e408245a6c8a0505accb63b5cd5757af4a611ab27ee1c\"" Mar 17 18:34:44.219973 env[1305]: time="2025-03-17T18:34:44.219883203Z" level=info msg="StartContainer for \"be4792af0957fd6d7a8e408245a6c8a0505accb63b5cd5757af4a611ab27ee1c\" returns successfully" Mar 17 18:34:44.299950 kubelet[2208]: I0317 18:34:44.299832 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-7dwlz" podStartSLOduration=32.657865283 podStartE2EDuration="46.299813528s" podCreationTimestamp="2025-03-17 18:33:58 +0000 UTC" firstStartedPulling="2025-03-17 18:34:30.481769558 +0000 UTC m=+52.519359168" lastFinishedPulling="2025-03-17 18:34:44.123717802 +0000 UTC m=+66.161307413" observedRunningTime="2025-03-17 18:34:44.299395515 +0000 UTC m=+66.336985126" watchObservedRunningTime="2025-03-17 18:34:44.299813528 +0000 UTC m=+66.337403138" Mar 17 18:34:44.300446 kubelet[2208]: I0317 18:34:44.299953 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7f5cc97b76-r9pvl" podStartSLOduration=38.232183152 podStartE2EDuration="46.299949102s" podCreationTimestamp="2025-03-17 18:33:58 +0000 UTC" firstStartedPulling="2025-03-17 18:34:30.28956521 +0000 UTC m=+52.327154821" lastFinishedPulling="2025-03-17 18:34:38.357331171 +0000 UTC m=+60.394920771" observedRunningTime="2025-03-17 18:34:40.457964625 +0000 UTC m=+62.495554225" watchObservedRunningTime="2025-03-17 18:34:44.299949102 +0000 UTC m=+66.337538712" Mar 17 18:34:44.671368 kubelet[2208]: E0317 18:34:44.671302 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:45.155610 kubelet[2208]: I0317 18:34:45.155571 2208 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 18:34:45.155610 kubelet[2208]: I0317 18:34:45.155604 2208 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 18:34:49.119156 systemd[1]: Started sshd@17-10.0.0.12:22-10.0.0.1:40532.service. Mar 17 18:34:49.118000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.0.12:22-10.0.0.1:40532 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:49.120248 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:34:49.120310 kernel: audit: type=1130 audit(1742236489.118:492): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.0.12:22-10.0.0.1:40532 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:49.149000 audit[5203]: USER_ACCT pid=5203 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.150788 sshd[5203]: Accepted publickey for core from 10.0.0.1 port 40532 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:49.153370 sshd[5203]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:49.152000 audit[5203]: CRED_ACQ pid=5203 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.156610 systemd-logind[1287]: New session 18 of user core. Mar 17 18:34:49.157323 systemd[1]: Started session-18.scope. Mar 17 18:34:49.158220 kernel: audit: type=1101 audit(1742236489.149:493): pid=5203 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.158281 kernel: audit: type=1103 audit(1742236489.152:494): pid=5203 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.158325 kernel: audit: type=1006 audit(1742236489.152:495): pid=5203 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Mar 17 18:34:49.152000 audit[5203]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdb4f43320 a2=3 a3=0 items=0 ppid=1 pid=5203 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:49.164487 kernel: audit: type=1300 audit(1742236489.152:495): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdb4f43320 a2=3 a3=0 items=0 ppid=1 pid=5203 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:49.164541 kernel: audit: type=1327 audit(1742236489.152:495): proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:49.152000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:49.162000 audit[5203]: USER_START pid=5203 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.169949 kernel: audit: type=1105 audit(1742236489.162:496): pid=5203 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.169997 kernel: audit: type=1103 audit(1742236489.163:497): pid=5206 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.163000 audit[5206]: CRED_ACQ pid=5206 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.264535 sshd[5203]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:49.264000 audit[5203]: USER_END pid=5203 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.266797 systemd[1]: sshd@17-10.0.0.12:22-10.0.0.1:40532.service: Deactivated successfully. Mar 17 18:34:49.267716 systemd-logind[1287]: Session 18 logged out. Waiting for processes to exit. Mar 17 18:34:49.267811 systemd[1]: session-18.scope: Deactivated successfully. Mar 17 18:34:49.269236 systemd-logind[1287]: Removed session 18. Mar 17 18:34:49.264000 audit[5203]: CRED_DISP pid=5203 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.273737 kernel: audit: type=1106 audit(1742236489.264:498): pid=5203 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.273797 kernel: audit: type=1104 audit(1742236489.264:499): pid=5203 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:49.266000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.0.12:22-10.0.0.1:40532 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:53.786024 kubelet[2208]: I0317 18:34:53.785979 2208 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 18:34:53.811000 audit[5222]: NETFILTER_CFG table=filter:117 family=2 entries=8 op=nft_register_rule pid=5222 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:53.811000 audit[5222]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7fffda587ad0 a2=0 a3=7fffda587abc items=0 ppid=2398 pid=5222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:53.811000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:53.816000 audit[5222]: NETFILTER_CFG table=nat:118 family=2 entries=34 op=nft_register_chain pid=5222 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:53.816000 audit[5222]: SYSCALL arch=c000003e syscall=46 success=yes exit=11236 a0=3 a1=7fffda587ad0 a2=0 a3=7fffda587abc items=0 ppid=2398 pid=5222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:53.816000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:53.984621 env[1305]: time="2025-03-17T18:34:53.984560638Z" level=info msg="StopContainer for \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\" with timeout 300 (s)" Mar 17 18:34:53.985382 env[1305]: time="2025-03-17T18:34:53.985349881Z" level=info msg="Stop container \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\" with signal terminated" Mar 17 18:34:54.001000 audit[5230]: NETFILTER_CFG table=filter:119 family=2 entries=8 op=nft_register_rule pid=5230 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:54.001000 audit[5230]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffd1d752bf0 a2=0 a3=7ffd1d752bdc items=0 ppid=2398 pid=5230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:54.001000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:54.007000 audit[5230]: NETFILTER_CFG table=nat:120 family=2 entries=30 op=nft_register_rule pid=5230 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:54.007000 audit[5230]: SYSCALL arch=c000003e syscall=46 success=yes exit=9348 a0=3 a1=7ffd1d752bf0 a2=0 a3=7ffd1d752bdc items=0 ppid=2398 pid=5230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:54.007000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:54.034297 systemd[1]: run-containerd-runc-k8s.io-f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc-runc.B2zvZK.mount: Deactivated successfully. 
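The kubelet pod_startup_latency_tracker entries earlier in this log are internally consistent: podStartSLOduration appears to equal podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling, using the monotonic m=+ offsets). For calico-apiserver-649897fff-47dt2 that is 42.455481036 - (61.297555375 - 52.404016707) = 33.561942368, and the csi-node-driver-7dwlz and calico-kube-controllers-7f5cc97b76-r9pvl entries work out the same way. A quick arithmetic check in Go; the relationship is inferred from these three entries, not quoted from kubelet documentation:

package main

import "fmt"

// sloSeconds reproduces the relationship the entries above appear to follow:
// SLO-relevant startup time = end-to-end startup time minus the image-pull
// window. Inputs are the seconds printed by the kubelet (the E2E duration and
// the m=+ monotonic offsets for firstStartedPulling and lastFinishedPulling).
func sloSeconds(e2e, firstPull, lastPull float64) float64 {
	return e2e - (lastPull - firstPull)
}

func main() {
	fmt.Printf("%.9f\n", sloSeconds(42.455481036, 52.404016707, 61.297555375)) // 33.561942368 (calico-apiserver-649897fff-47dt2)
	fmt.Printf("%.9f\n", sloSeconds(46.299813528, 52.519359168, 66.161307413)) // 32.657865283 (csi-node-driver-7dwlz)
	fmt.Printf("%.9f\n", sloSeconds(46.299949102, 52.327154821, 60.394920771)) // 38.232183152 (calico-kube-controllers-7f5cc97b76-r9pvl)
}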
Mar 17 18:34:54.054973 env[1305]: time="2025-03-17T18:34:54.054838561Z" level=info msg="StopContainer for \"209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6\" with timeout 30 (s)" Mar 17 18:34:54.055246 env[1305]: time="2025-03-17T18:34:54.055221610Z" level=info msg="Stop container \"209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6\" with signal terminated" Mar 17 18:34:54.102248 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6-rootfs.mount: Deactivated successfully. Mar 17 18:34:54.104296 env[1305]: time="2025-03-17T18:34:54.104238991Z" level=info msg="shim disconnected" id=209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6 Mar 17 18:34:54.104296 env[1305]: time="2025-03-17T18:34:54.104293907Z" level=warning msg="cleaning up after shim disconnected" id=209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6 namespace=k8s.io Mar 17 18:34:54.104296 env[1305]: time="2025-03-17T18:34:54.104302924Z" level=info msg="cleaning up dead shim" Mar 17 18:34:54.116052 env[1305]: time="2025-03-17T18:34:54.116012429Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:54Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=5273 runtime=io.containerd.runc.v2\n" Mar 17 18:34:54.128182 env[1305]: time="2025-03-17T18:34:54.128129971Z" level=info msg="StopContainer for \"209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6\" returns successfully" Mar 17 18:34:54.129001 env[1305]: time="2025-03-17T18:34:54.128957656Z" level=info msg="StopPodSandbox for \"d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65\"" Mar 17 18:34:54.129061 env[1305]: time="2025-03-17T18:34:54.129041528Z" level=info msg="Container to stop \"209a2e6e568208ec72c6e80f6a86d08d63e018706481aebc675d315099f1ddc6\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 17 18:34:54.132074 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65-shm.mount: Deactivated successfully. 
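Stop sequences like the one above (StopContainer, shim disconnected, cleanup warnings, StopPodSandbox) are easiest to follow when grouped by the 64-character hex IDs they mention. A rough grouping sketch over journal text in this exact format; the regular expressions are tuned to the lines shown here and are not a general containerd log schema:

package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
)

var (
	// 64-hex IDs as they appear above (container/sandbox IDs, image digests).
	idRe = regexp.MustCompile(`[0-9a-f]{64}`)
	// Start of the msg="..." field, up to the first escaped quote.
	msgRe = regexp.MustCompile(`msg="([^"\\]+)`)
)

func main() {
	events := map[string][]string{}
	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // these journal lines are long
	for sc.Scan() {
		line := sc.Text()
		m := msgRe.FindStringSubmatch(line)
		if m == nil {
			continue
		}
		for _, id := range idRe.FindAllString(line, -1) {
			events[id] = append(events[id], m[1])
		}
	}
	for id, evs := range events {
		fmt.Println(id[:12], evs) // one rough timeline per ID
	}
}

Fed the containerd entries above on stdin, this prints one event list per ID, which makes the StopContainer, shim disconnected, StopPodSandbox ordering easier to read than the interleaved journal.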
Mar 17 18:34:54.159648 env[1305]: time="2025-03-17T18:34:54.159499037Z" level=info msg="shim disconnected" id=d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65 Mar 17 18:34:54.160083 env[1305]: time="2025-03-17T18:34:54.160061992Z" level=warning msg="cleaning up after shim disconnected" id=d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65 namespace=k8s.io Mar 17 18:34:54.160200 env[1305]: time="2025-03-17T18:34:54.160182665Z" level=info msg="cleaning up dead shim" Mar 17 18:34:54.177138 env[1305]: time="2025-03-17T18:34:54.177100812Z" level=info msg="StopContainer for \"f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc\" with timeout 5 (s)" Mar 17 18:34:54.177631 env[1305]: time="2025-03-17T18:34:54.177584905Z" level=info msg="Stop container \"f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc\" with signal terminated" Mar 17 18:34:54.179241 env[1305]: time="2025-03-17T18:34:54.179218946Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:54Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=5305 runtime=io.containerd.runc.v2\n" Mar 17 18:34:54.225895 env[1305]: time="2025-03-17T18:34:54.225839155Z" level=info msg="shim disconnected" id=f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc Mar 17 18:34:54.225895 env[1305]: time="2025-03-17T18:34:54.225889873Z" level=warning msg="cleaning up after shim disconnected" id=f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc namespace=k8s.io Mar 17 18:34:54.225895 env[1305]: time="2025-03-17T18:34:54.225899280Z" level=info msg="cleaning up dead shim" Mar 17 18:34:54.235050 env[1305]: time="2025-03-17T18:34:54.235004565Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:54Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=5362 runtime=io.containerd.runc.v2\n" Mar 17 18:34:54.245650 systemd-networkd[1078]: calica2756a831d: Link DOWN Mar 17 18:34:54.246536 env[1305]: time="2025-03-17T18:34:54.246018870Z" level=info msg="StopContainer for \"f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc\" returns successfully" Mar 17 18:34:54.245656 systemd-networkd[1078]: calica2756a831d: Lost carrier Mar 17 18:34:54.246657 env[1305]: time="2025-03-17T18:34:54.246596563Z" level=info msg="StopPodSandbox for \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\"" Mar 17 18:34:54.246697 env[1305]: time="2025-03-17T18:34:54.246657831Z" level=info msg="Container to stop \"f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 17 18:34:54.246697 env[1305]: time="2025-03-17T18:34:54.246672439Z" level=info msg="Container to stop \"50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 17 18:34:54.246697 env[1305]: time="2025-03-17T18:34:54.246682519Z" level=info msg="Container to stop \"8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 17 18:34:54.274161 kernel: kauditd_printk_skb: 13 callbacks suppressed Mar 17 18:34:54.274516 kernel: audit: type=1130 audit(1742236494.267:505): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.0.12:22-10.0.0.1:47302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:54.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.0.12:22-10.0.0.1:47302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:54.268126 systemd[1]: Started sshd@18-10.0.0.12:22-10.0.0.1:47302.service. Mar 17 18:34:54.280700 env[1305]: time="2025-03-17T18:34:54.280652169Z" level=info msg="shim disconnected" id=0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c Mar 17 18:34:54.280700 env[1305]: time="2025-03-17T18:34:54.280701164Z" level=warning msg="cleaning up after shim disconnected" id=0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c namespace=k8s.io Mar 17 18:34:54.280843 env[1305]: time="2025-03-17T18:34:54.280710431Z" level=info msg="cleaning up dead shim" Mar 17 18:34:54.294687 env[1305]: time="2025-03-17T18:34:54.294637823Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:54Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=5405 runtime=io.containerd.runc.v2\n" Mar 17 18:34:54.309401 kubelet[2208]: I0317 18:34:54.309309 2208 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Mar 17 18:34:54.352000 audit[5398]: USER_ACCT pid=5398 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.353411 sshd[5398]: Accepted publickey for core from 10.0.0.1 port 47302 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:54.356000 audit[5398]: CRED_ACQ pid=5398 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.357681 sshd[5398]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:54.360875 kernel: audit: type=1101 audit(1742236494.352:506): pid=5398 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.361020 kernel: audit: type=1103 audit(1742236494.356:507): pid=5398 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.361043 kernel: audit: type=1006 audit(1742236494.356:508): pid=5398 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Mar 17 18:34:54.361405 systemd-logind[1287]: New session 19 of user core. Mar 17 18:34:54.362345 systemd[1]: Started session-19.scope. 
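The ipam/ipam_plugin.go line numbers that recur in these sandbox teardowns (353 then 368, then 412, then 429 or 431, then 440, then 374) trace the same order each time: acquire the host-wide IPAM lock, attempt the release by handle ID (a missing allocation is only a warning, "Asked to release address but it doesn't exist. Ignoring"), then release by workload ID, then drop the lock. A schematic sketch of that ordering; the function names are hypothetical stand-ins, and this is a reading of the log, not Calico's implementation:

package main

import "fmt"

// Hypothetical stand-ins for whatever the IPAM plugin actually calls; only
// the ordering below mirrors the log lines, nothing else is implied.
func releaseByHandleID(handleID string) int   { return 0 } // number of addresses released
func releaseByWorkloadID(workload string) int { return 0 }

func releaseIPs(handleID, workload string, hostLock chan struct{}) {
	hostLock <- struct{}{}        // "About to acquire host-wide IPAM lock." -> "Acquired host-wide IPAM lock."
	defer func() { <-hostLock }() // "Released host-wide IPAM lock."

	if n := releaseByHandleID(handleID); n == 0 { // "Releasing address using handleID"
		// "Asked to release address but it doesn't exist. Ignoring" - not treated as an error
		fmt.Println("no allocation recorded under handle", handleID)
	}
	releaseByWorkloadID(workload) // "Releasing address using workloadID" (runs in both cases above)
}

func main() {
	hostLock := make(chan struct{}, 1) // the host-wide lock modeled as a one-slot channel
	releaseIPs("k8s-pod-network.<container-id>", "localhost-k8s-<workload>-eth0", hostLock)
	fmt.Println("teardown processing complete")
}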
Mar 17 18:34:54.356000 audit[5398]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd5f4e4030 a2=3 a3=0 items=0 ppid=1 pid=5398 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:54.367206 kernel: audit: type=1300 audit(1742236494.356:508): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd5f4e4030 a2=3 a3=0 items=0 ppid=1 pid=5398 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:54.367258 kernel: audit: type=1327 audit(1742236494.356:508): proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:54.369673 kernel: audit: type=1105 audit(1742236494.366:509): pid=5398 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.356000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:54.366000 audit[5398]: USER_START pid=5398 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.368000 audit[5421]: CRED_ACQ pid=5421 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.377987 kernel: audit: type=1103 audit(1742236494.368:510): pid=5421 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.389835 env[1305]: time="2025-03-17T18:34:54.389794576Z" level=info msg="TearDown network for sandbox \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\" successfully" Mar 17 18:34:54.389835 env[1305]: time="2025-03-17T18:34:54.389831207Z" level=info msg="StopPodSandbox for \"0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c\" returns successfully" Mar 17 18:34:54.454077 kubelet[2208]: I0317 18:34:54.453191 2208 topology_manager.go:215] "Topology Admit Handler" podUID="281e0c3f-a551-40f5-ba3a-01196afea51f" podNamespace="calico-system" podName="calico-node-mdxrs" Mar 17 18:34:54.454077 kubelet[2208]: E0317 18:34:54.453258 2208 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="eb977058-058c-48a0-a562-5b91756026fc" containerName="flexvol-driver" Mar 17 18:34:54.454077 kubelet[2208]: E0317 18:34:54.453267 2208 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="eb977058-058c-48a0-a562-5b91756026fc" containerName="install-cni" Mar 17 18:34:54.454077 kubelet[2208]: E0317 18:34:54.453274 2208 cpu_manager.go:395] "RemoveStaleState: removing container" podUID="eb977058-058c-48a0-a562-5b91756026fc" containerName="calico-node" Mar 17 18:34:54.454077 kubelet[2208]: I0317 18:34:54.453431 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-var-lib-calico\") 
pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.454077 kubelet[2208]: I0317 18:34:54.453448 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-net-dir\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.454077 kubelet[2208]: I0317 18:34:54.453463 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-xtables-lock\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.454077 kubelet[2208]: I0317 18:34:54.453483 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qlrh\" (UniqueName: \"kubernetes.io/projected/eb977058-058c-48a0-a562-5b91756026fc-kube-api-access-8qlrh\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.461528 kubelet[2208]: I0317 18:34:54.453497 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-var-run-calico\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.461528 kubelet[2208]: I0317 18:34:54.453509 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-lib-modules\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.461528 kubelet[2208]: I0317 18:34:54.453525 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/eb977058-058c-48a0-a562-5b91756026fc-node-certs\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.461528 kubelet[2208]: I0317 18:34:54.453540 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb977058-058c-48a0-a562-5b91756026fc-tigera-ca-bundle\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.461528 kubelet[2208]: I0317 18:34:54.453552 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-log-dir\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.461528 kubelet[2208]: I0317 18:34:54.453565 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-policysync\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.461794 kubelet[2208]: I0317 18:34:54.453578 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-bin-dir\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: 
\"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.461794 kubelet[2208]: I0317 18:34:54.453589 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-flexvol-driver-host\") pod \"eb977058-058c-48a0-a562-5b91756026fc\" (UID: \"eb977058-058c-48a0-a562-5b91756026fc\") " Mar 17 18:34:54.461794 kubelet[2208]: I0317 18:34:54.453653 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-flexvol-driver-host" (OuterVolumeSpecName: "flexvol-driver-host") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "flexvol-driver-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 18:34:54.461794 kubelet[2208]: I0317 18:34:54.453685 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-var-lib-calico" (OuterVolumeSpecName: "var-lib-calico") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "var-lib-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 18:34:54.461794 kubelet[2208]: I0317 18:34:54.453699 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-net-dir" (OuterVolumeSpecName: "cni-net-dir") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "cni-net-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 18:34:54.461937 kubelet[2208]: I0317 18:34:54.453712 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-xtables-lock" (OuterVolumeSpecName: "xtables-lock") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "xtables-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 18:34:54.461937 kubelet[2208]: I0317 18:34:54.454646 2208 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb977058-058c-48a0-a562-5b91756026fc" containerName="calico-node" Mar 17 18:34:54.461937 kubelet[2208]: I0317 18:34:54.458535 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-policysync" (OuterVolumeSpecName: "policysync") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "policysync". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 18:34:54.461937 kubelet[2208]: I0317 18:34:54.458558 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-log-dir" (OuterVolumeSpecName: "cni-log-dir") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "cni-log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 18:34:54.461937 kubelet[2208]: I0317 18:34:54.458573 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-bin-dir" (OuterVolumeSpecName: "cni-bin-dir") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). 
InnerVolumeSpecName "cni-bin-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 18:34:54.462063 kubelet[2208]: I0317 18:34:54.458585 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 18:34:54.462063 kubelet[2208]: I0317 18:34:54.458597 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-var-run-calico" (OuterVolumeSpecName: "var-run-calico") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "var-run-calico". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 17 18:34:54.471412 kubelet[2208]: I0317 18:34:54.463481 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb977058-058c-48a0-a562-5b91756026fc-node-certs" (OuterVolumeSpecName: "node-certs") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "node-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 18:34:54.471412 kubelet[2208]: I0317 18:34:54.465504 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb977058-058c-48a0-a562-5b91756026fc-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 18:34:54.471412 kubelet[2208]: I0317 18:34:54.467084 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb977058-058c-48a0-a562-5b91756026fc-kube-api-access-8qlrh" (OuterVolumeSpecName: "kube-api-access-8qlrh") pod "eb977058-058c-48a0-a562-5b91756026fc" (UID: "eb977058-058c-48a0-a562-5b91756026fc"). InnerVolumeSpecName "kube-api-access-8qlrh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.244 [INFO][5340] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.244 [INFO][5340] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" iface="eth0" netns="/var/run/netns/cni-72c2b89e-e133-245d-1531-8d472424a3ff" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.244 [INFO][5340] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" iface="eth0" netns="/var/run/netns/cni-72c2b89e-e133-245d-1531-8d472424a3ff" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.260 [INFO][5340] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" after=15.221026ms iface="eth0" netns="/var/run/netns/cni-72c2b89e-e133-245d-1531-8d472424a3ff" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.260 [INFO][5340] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.260 [INFO][5340] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.287 [INFO][5387] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" HandleID="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.287 [INFO][5387] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.287 [INFO][5387] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.450 [INFO][5387] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" HandleID="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.450 [INFO][5387] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" HandleID="k8s-pod-network.d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Workload="localhost-k8s-calico--kube--controllers--7f5cc97b76--r9pvl-eth0" Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.463 [INFO][5387] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:34:54.471835 env[1305]: 2025-03-17 18:34:54.465 [INFO][5340] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65" Mar 17 18:34:54.471835 env[1305]: time="2025-03-17T18:34:54.467304268Z" level=info msg="TearDown network for sandbox \"d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65\" successfully" Mar 17 18:34:54.471835 env[1305]: time="2025-03-17T18:34:54.467339075Z" level=info msg="StopPodSandbox for \"d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65\" returns successfully" Mar 17 18:34:54.515853 sshd[5398]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:54.517551 systemd[1]: Started sshd@19-10.0.0.12:22-10.0.0.1:47318.service. Mar 17 18:34:54.526096 kernel: audit: type=1130 audit(1742236494.516:511): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.0.12:22-10.0.0.1:47318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:54.526162 kernel: audit: type=1106 audit(1742236494.517:512): pid=5398 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.516000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.0.12:22-10.0.0.1:47318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:54.517000 audit[5398]: USER_END pid=5398 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.519465 systemd[1]: sshd@18-10.0.0.12:22-10.0.0.1:47302.service: Deactivated successfully. Mar 17 18:34:54.520378 systemd[1]: session-19.scope: Deactivated successfully. Mar 17 18:34:54.525441 systemd-logind[1287]: Session 19 logged out. Waiting for processes to exit. Mar 17 18:34:54.526223 systemd-logind[1287]: Removed session 19. Mar 17 18:34:54.517000 audit[5398]: CRED_DISP pid=5398 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.518000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.0.12:22-10.0.0.1:47302 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:34:54.552000 audit[5434]: USER_ACCT pid=5434 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.553552 sshd[5434]: Accepted publickey for core from 10.0.0.1 port 47318 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:54.553829 kubelet[2208]: I0317 18:34:54.553756 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdhx\" (UniqueName: \"kubernetes.io/projected/84b5b251-95c6-46de-a5be-62ebfcea6ad9-kube-api-access-zgdhx\") pod \"84b5b251-95c6-46de-a5be-62ebfcea6ad9\" (UID: \"84b5b251-95c6-46de-a5be-62ebfcea6ad9\") " Mar 17 18:34:54.553829 kubelet[2208]: I0317 18:34:54.553797 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84b5b251-95c6-46de-a5be-62ebfcea6ad9-tigera-ca-bundle\") pod \"84b5b251-95c6-46de-a5be-62ebfcea6ad9\" (UID: \"84b5b251-95c6-46de-a5be-62ebfcea6ad9\") " Mar 17 18:34:54.553902 kubelet[2208]: I0317 18:34:54.553851 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzpzv\" (UniqueName: \"kubernetes.io/projected/281e0c3f-a551-40f5-ba3a-01196afea51f-kube-api-access-rzpzv\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.553902 kubelet[2208]: I0317 18:34:54.553876 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/281e0c3f-a551-40f5-ba3a-01196afea51f-policysync\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.553902 kubelet[2208]: I0317 18:34:54.553895 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/281e0c3f-a551-40f5-ba3a-01196afea51f-cni-bin-dir\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554030 kubelet[2208]: I0317 18:34:54.553909 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/281e0c3f-a551-40f5-ba3a-01196afea51f-cni-net-dir\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554030 kubelet[2208]: I0317 18:34:54.553937 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/281e0c3f-a551-40f5-ba3a-01196afea51f-flexvol-driver-host\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554030 kubelet[2208]: I0317 18:34:54.553953 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/281e0c3f-a551-40f5-ba3a-01196afea51f-var-lib-calico\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554030 kubelet[2208]: I0317 
18:34:54.553968 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/281e0c3f-a551-40f5-ba3a-01196afea51f-node-certs\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554030 kubelet[2208]: I0317 18:34:54.553983 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/281e0c3f-a551-40f5-ba3a-01196afea51f-xtables-lock\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554154 kubelet[2208]: I0317 18:34:54.553998 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/281e0c3f-a551-40f5-ba3a-01196afea51f-cni-log-dir\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554154 kubelet[2208]: I0317 18:34:54.554012 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/281e0c3f-a551-40f5-ba3a-01196afea51f-var-run-calico\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554154 kubelet[2208]: I0317 18:34:54.554026 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/281e0c3f-a551-40f5-ba3a-01196afea51f-tigera-ca-bundle\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554154 kubelet[2208]: I0317 18:34:54.554043 2208 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/281e0c3f-a551-40f5-ba3a-01196afea51f-lib-modules\") pod \"calico-node-mdxrs\" (UID: \"281e0c3f-a551-40f5-ba3a-01196afea51f\") " pod="calico-system/calico-node-mdxrs" Mar 17 18:34:54.554154 kubelet[2208]: I0317 18:34:54.554064 2208 reconciler_common.go:289] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-lib-modules\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554154 kubelet[2208]: I0317 18:34:54.554072 2208 reconciler_common.go:289] "Volume detached for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/eb977058-058c-48a0-a562-5b91756026fc-node-certs\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554313 kubelet[2208]: I0317 18:34:54.554080 2208 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb977058-058c-48a0-a562-5b91756026fc-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554313 kubelet[2208]: I0317 18:34:54.554089 2208 reconciler_common.go:289] "Volume detached for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-log-dir\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554313 kubelet[2208]: I0317 18:34:54.554096 2208 reconciler_common.go:289] "Volume detached for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-policysync\") on node 
\"localhost\" DevicePath \"\"" Mar 17 18:34:54.554313 kubelet[2208]: I0317 18:34:54.554104 2208 reconciler_common.go:289] "Volume detached for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-bin-dir\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554313 kubelet[2208]: I0317 18:34:54.554123 2208 reconciler_common.go:289] "Volume detached for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-flexvol-driver-host\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554313 kubelet[2208]: I0317 18:34:54.554132 2208 reconciler_common.go:289] "Volume detached for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-var-lib-calico\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554313 kubelet[2208]: I0317 18:34:54.554139 2208 reconciler_common.go:289] "Volume detached for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-cni-net-dir\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554313 kubelet[2208]: I0317 18:34:54.554147 2208 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-8qlrh\" (UniqueName: \"kubernetes.io/projected/eb977058-058c-48a0-a562-5b91756026fc-kube-api-access-8qlrh\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554489 kubelet[2208]: I0317 18:34:54.554154 2208 reconciler_common.go:289] "Volume detached for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-var-run-calico\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.554489 kubelet[2208]: I0317 18:34:54.554161 2208 reconciler_common.go:289] "Volume detached for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/eb977058-058c-48a0-a562-5b91756026fc-xtables-lock\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.555000 audit[5434]: CRED_ACQ pid=5434 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.557328 kubelet[2208]: I0317 18:34:54.557284 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84b5b251-95c6-46de-a5be-62ebfcea6ad9-kube-api-access-zgdhx" (OuterVolumeSpecName: "kube-api-access-zgdhx") pod "84b5b251-95c6-46de-a5be-62ebfcea6ad9" (UID: "84b5b251-95c6-46de-a5be-62ebfcea6ad9"). InnerVolumeSpecName "kube-api-access-zgdhx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 18:34:54.556000 audit[5434]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffca172ecd0 a2=3 a3=0 items=0 ppid=1 pid=5434 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:54.556000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:54.557770 sshd[5434]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:54.560638 kubelet[2208]: I0317 18:34:54.560554 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84b5b251-95c6-46de-a5be-62ebfcea6ad9-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "84b5b251-95c6-46de-a5be-62ebfcea6ad9" (UID: "84b5b251-95c6-46de-a5be-62ebfcea6ad9"). 
InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 18:34:54.565376 systemd[1]: Started session-20.scope. Mar 17 18:34:54.565789 systemd-logind[1287]: New session 20 of user core. Mar 17 18:34:54.570000 audit[5434]: USER_START pid=5434 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.572000 audit[5441]: CRED_ACQ pid=5441 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.655550 kubelet[2208]: I0317 18:34:54.655503 2208 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84b5b251-95c6-46de-a5be-62ebfcea6ad9-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.655550 kubelet[2208]: I0317 18:34:54.655538 2208 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-zgdhx\" (UniqueName: \"kubernetes.io/projected/84b5b251-95c6-46de-a5be-62ebfcea6ad9-kube-api-access-zgdhx\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:54.757703 kubelet[2208]: E0317 18:34:54.757652 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:54.759031 env[1305]: time="2025-03-17T18:34:54.758877155Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mdxrs,Uid:281e0c3f-a551-40f5-ba3a-01196afea51f,Namespace:calico-system,Attempt:0,}" Mar 17 18:34:54.776819 env[1305]: time="2025-03-17T18:34:54.776590455Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:34:54.776819 env[1305]: time="2025-03-17T18:34:54.776632856Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:34:54.776819 env[1305]: time="2025-03-17T18:34:54.776646092Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:34:54.778059 env[1305]: time="2025-03-17T18:34:54.777382902Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/9606084e9268e23468a35429978cfee8dd5f8bc83cfe0411ea398666af6d43f9 pid=5457 runtime=io.containerd.runc.v2 Mar 17 18:34:54.812529 env[1305]: time="2025-03-17T18:34:54.812411194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mdxrs,Uid:281e0c3f-a551-40f5-ba3a-01196afea51f,Namespace:calico-system,Attempt:0,} returns sandbox id \"9606084e9268e23468a35429978cfee8dd5f8bc83cfe0411ea398666af6d43f9\"" Mar 17 18:34:54.813608 kubelet[2208]: E0317 18:34:54.813316 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:54.815200 env[1305]: time="2025-03-17T18:34:54.815170183Z" level=info msg="CreateContainer within sandbox \"9606084e9268e23468a35429978cfee8dd5f8bc83cfe0411ea398666af6d43f9\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 18:34:54.842746 env[1305]: time="2025-03-17T18:34:54.842703615Z" level=info msg="CreateContainer within sandbox \"9606084e9268e23468a35429978cfee8dd5f8bc83cfe0411ea398666af6d43f9\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"9478ba328c3d4200c4a2702dd8b76b1b176f0722c887f47d836f636c70493084\"" Mar 17 18:34:54.843713 env[1305]: time="2025-03-17T18:34:54.843676250Z" level=info msg="StartContainer for \"9478ba328c3d4200c4a2702dd8b76b1b176f0722c887f47d836f636c70493084\"" Mar 17 18:34:54.860641 systemd[1]: Started sshd@20-10.0.0.12:22-10.0.0.1:47320.service. Mar 17 18:34:54.859000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.0.12:22-10.0.0.1:47320 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:54.862795 sshd[5434]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:54.862000 audit[5434]: USER_END pid=5434 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.862000 audit[5434]: CRED_DISP pid=5434 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.865280 systemd[1]: sshd@19-10.0.0.12:22-10.0.0.1:47318.service: Deactivated successfully. Mar 17 18:34:54.864000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.0.12:22-10.0.0.1:47318 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:54.866289 systemd[1]: session-20.scope: Deactivated successfully. Mar 17 18:34:54.866948 systemd-logind[1287]: Session 20 logged out. Waiting for processes to exit. Mar 17 18:34:54.867809 systemd-logind[1287]: Removed session 20. 
Mar 17 18:34:54.895705 sshd[5506]: Accepted publickey for core from 10.0.0.1 port 47320 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:54.896885 sshd[5506]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:54.894000 audit[5506]: USER_ACCT pid=5506 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.895000 audit[5506]: CRED_ACQ pid=5506 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.895000 audit[5506]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff33e25010 a2=3 a3=0 items=0 ppid=1 pid=5506 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:54.895000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:54.902594 systemd[1]: Started session-21.scope. Mar 17 18:34:54.903708 systemd-logind[1287]: New session 21 of user core. Mar 17 18:34:54.907551 env[1305]: time="2025-03-17T18:34:54.907309478Z" level=info msg="StartContainer for \"9478ba328c3d4200c4a2702dd8b76b1b176f0722c887f47d836f636c70493084\" returns successfully" Mar 17 18:34:54.907000 audit[5506]: USER_START pid=5506 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.908000 audit[5524]: CRED_ACQ pid=5524 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:54.986827 env[1305]: time="2025-03-17T18:34:54.986766525Z" level=info msg="shim disconnected" id=9478ba328c3d4200c4a2702dd8b76b1b176f0722c887f47d836f636c70493084 Mar 17 18:34:54.986827 env[1305]: time="2025-03-17T18:34:54.986813074Z" level=warning msg="cleaning up after shim disconnected" id=9478ba328c3d4200c4a2702dd8b76b1b176f0722c887f47d836f636c70493084 namespace=k8s.io Mar 17 18:34:54.986827 env[1305]: time="2025-03-17T18:34:54.986821862Z" level=info msg="cleaning up dead shim" Mar 17 18:34:54.994353 env[1305]: time="2025-03-17T18:34:54.994327341Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:54Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=5547 runtime=io.containerd.runc.v2\n" Mar 17 18:34:55.021000 audit[5560]: NETFILTER_CFG table=filter:121 family=2 entries=8 op=nft_register_rule pid=5560 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:55.021000 audit[5560]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffcf85a7380 a2=0 a3=7ffcf85a736c items=0 ppid=2398 pid=5560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:55.021000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:55.028542 systemd[1]: var-lib-kubelet-pods-84b5b251\x2d95c6\x2d46de\x2da5be\x2d62ebfcea6ad9-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dkube\x2dcontrollers-1.mount: Deactivated successfully. Mar 17 18:34:55.028694 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d49e207a9dd8450906e4dbba907fcd3bec5ddc9afe5ca213ac4ce2e1e617eb65-rootfs.mount: Deactivated successfully. Mar 17 18:34:55.028784 systemd[1]: run-netns-cni\x2d72c2b89e\x2de133\x2d245d\x2d1531\x2d8d472424a3ff.mount: Deactivated successfully. Mar 17 18:34:55.028873 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc-rootfs.mount: Deactivated successfully. Mar 17 18:34:55.028991 systemd[1]: var-lib-kubelet-pods-eb977058\x2d058c\x2d48a0\x2da562\x2d5b91756026fc-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dnode-1.mount: Deactivated successfully. Mar 17 18:34:55.029115 systemd[1]: var-lib-kubelet-pods-84b5b251\x2d95c6\x2d46de\x2da5be\x2d62ebfcea6ad9-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dzgdhx.mount: Deactivated successfully. Mar 17 18:34:55.029228 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c-rootfs.mount: Deactivated successfully. Mar 17 18:34:55.029330 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-0f37918ccb65eb6696efc6f2b2ab7beaf7b4fa13200c9f8bf86fb0e65fbdd72c-shm.mount: Deactivated successfully. Mar 17 18:34:55.029438 systemd[1]: var-lib-kubelet-pods-eb977058\x2d058c\x2d48a0\x2da562\x2d5b91756026fc-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d8qlrh.mount: Deactivated successfully. Mar 17 18:34:55.029800 systemd[1]: var-lib-kubelet-pods-eb977058\x2d058c\x2d48a0\x2da562\x2d5b91756026fc-volumes-kubernetes.io\x7esecret-node\x2dcerts.mount: Deactivated successfully. 
Mar 17 18:34:55.029000 audit[5560]: NETFILTER_CFG table=nat:122 family=2 entries=30 op=nft_register_rule pid=5560 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:55.029000 audit[5560]: SYSCALL arch=c000003e syscall=46 success=yes exit=9348 a0=3 a1=7ffcf85a7380 a2=0 a3=7ffcf85a736c items=0 ppid=2398 pid=5560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:55.029000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:55.313559 kubelet[2208]: E0317 18:34:55.313180 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:55.315864 kubelet[2208]: I0317 18:34:55.315835 2208 scope.go:117] "RemoveContainer" containerID="f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc" Mar 17 18:34:55.319152 env[1305]: time="2025-03-17T18:34:55.319119326Z" level=info msg="CreateContainer within sandbox \"9606084e9268e23468a35429978cfee8dd5f8bc83cfe0411ea398666af6d43f9\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 18:34:55.320751 env[1305]: time="2025-03-17T18:34:55.320712466Z" level=info msg="RemoveContainer for \"f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc\"" Mar 17 18:34:55.329297 env[1305]: time="2025-03-17T18:34:55.329254683Z" level=info msg="RemoveContainer for \"f35362396536d5003730279aba2466cf3be3ae9dfdcc66c0ce66dd67fe2eb3dc\" returns successfully" Mar 17 18:34:55.329703 kubelet[2208]: I0317 18:34:55.329593 2208 scope.go:117] "RemoveContainer" containerID="8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f" Mar 17 18:34:55.330835 env[1305]: time="2025-03-17T18:34:55.330803136Z" level=info msg="RemoveContainer for \"8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f\"" Mar 17 18:34:55.334588 env[1305]: time="2025-03-17T18:34:55.334555375Z" level=info msg="RemoveContainer for \"8b56e443aba8eed3a01634026c783beefe8354a9888dc0608c471592cb701d0f\" returns successfully" Mar 17 18:34:55.335137 kubelet[2208]: I0317 18:34:55.335105 2208 scope.go:117] "RemoveContainer" containerID="50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0" Mar 17 18:34:55.337486 env[1305]: time="2025-03-17T18:34:55.337462596Z" level=info msg="RemoveContainer for \"50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0\"" Mar 17 18:34:55.340791 env[1305]: time="2025-03-17T18:34:55.340742785Z" level=info msg="CreateContainer within sandbox \"9606084e9268e23468a35429978cfee8dd5f8bc83cfe0411ea398666af6d43f9\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"c5e6a3e9d03264a549775314aa74923736333102765e5883aedcc0083e723a4e\"" Mar 17 18:34:55.341327 env[1305]: time="2025-03-17T18:34:55.341300439Z" level=info msg="StartContainer for \"c5e6a3e9d03264a549775314aa74923736333102765e5883aedcc0083e723a4e\"" Mar 17 18:34:55.343707 env[1305]: time="2025-03-17T18:34:55.343669894Z" level=info msg="RemoveContainer for \"50a66d9ba5e1884bd882a41220a3038206abd949ee2549f3d9bca229339675c0\" returns successfully" Mar 17 18:34:55.405099 env[1305]: time="2025-03-17T18:34:55.405050476Z" level=info msg="StartContainer for \"c5e6a3e9d03264a549775314aa74923736333102765e5883aedcc0083e723a4e\" returns successfully" Mar 17 
18:34:56.050793 kubelet[2208]: I0317 18:34:56.050737 2208 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84b5b251-95c6-46de-a5be-62ebfcea6ad9" path="/var/lib/kubelet/pods/84b5b251-95c6-46de-a5be-62ebfcea6ad9/volumes" Mar 17 18:34:56.051370 kubelet[2208]: I0317 18:34:56.051337 2208 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb977058-058c-48a0-a562-5b91756026fc" path="/var/lib/kubelet/pods/eb977058-058c-48a0-a562-5b91756026fc/volumes" Mar 17 18:34:56.320758 kubelet[2208]: E0317 18:34:56.320641 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:56.742000 audit[5643]: NETFILTER_CFG table=filter:123 family=2 entries=20 op=nft_register_rule pid=5643 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:56.742000 audit[5643]: SYSCALL arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffc77b9ea50 a2=0 a3=7ffc77b9ea3c items=0 ppid=2398 pid=5643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:56.742000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:56.746000 audit[5643]: NETFILTER_CFG table=nat:124 family=2 entries=22 op=nft_register_rule pid=5643 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:56.752621 sshd[5506]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:56.754652 systemd[1]: Started sshd@21-10.0.0.12:22-10.0.0.1:47326.service. Mar 17 18:34:56.752000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.0.12:22-10.0.0.1:47326 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:56.746000 audit[5643]: SYSCALL arch=c000003e syscall=46 success=yes exit=6540 a0=3 a1=7ffc77b9ea50 a2=0 a3=0 items=0 ppid=2398 pid=5643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:56.746000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:56.754000 audit[5506]: USER_END pid=5506 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:56.754000 audit[5506]: CRED_DISP pid=5506 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:56.756000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.0.12:22-10.0.0.1:47320 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:56.758505 systemd[1]: sshd@20-10.0.0.12:22-10.0.0.1:47320.service: Deactivated successfully. 
Mar 17 18:34:56.759989 systemd[1]: session-21.scope: Deactivated successfully. Mar 17 18:34:56.760699 systemd-logind[1287]: Session 21 logged out. Waiting for processes to exit. Mar 17 18:34:56.762045 systemd-logind[1287]: Removed session 21. Mar 17 18:34:56.769958 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c5e6a3e9d03264a549775314aa74923736333102765e5883aedcc0083e723a4e-rootfs.mount: Deactivated successfully. Mar 17 18:34:56.787980 env[1305]: time="2025-03-17T18:34:56.787907991Z" level=info msg="shim disconnected" id=c5e6a3e9d03264a549775314aa74923736333102765e5883aedcc0083e723a4e Mar 17 18:34:56.787980 env[1305]: time="2025-03-17T18:34:56.787968878Z" level=warning msg="cleaning up after shim disconnected" id=c5e6a3e9d03264a549775314aa74923736333102765e5883aedcc0083e723a4e namespace=k8s.io Mar 17 18:34:56.787980 env[1305]: time="2025-03-17T18:34:56.787977946Z" level=info msg="cleaning up dead shim" Mar 17 18:34:56.801981 env[1305]: time="2025-03-17T18:34:56.801932992Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:56Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=5654 runtime=io.containerd.runc.v2\n" Mar 17 18:34:56.803000 audit[5645]: USER_ACCT pid=5645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:56.805751 sshd[5645]: Accepted publickey for core from 10.0.0.1 port 47326 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:56.804000 audit[5645]: CRED_ACQ pid=5645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:56.804000 audit[5645]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcc2f0efd0 a2=3 a3=0 items=0 ppid=1 pid=5645 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:56.804000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:56.806788 sshd[5645]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:56.810670 systemd-logind[1287]: New session 22 of user core. Mar 17 18:34:56.811745 systemd[1]: Started session-22.scope. 
Mar 17 18:34:56.821000 audit[5645]: USER_START pid=5645 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:56.823000 audit[5667]: CRED_ACQ pid=5667 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:57.094166 sshd[5645]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:57.094000 audit[5645]: USER_END pid=5645 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:57.094000 audit[5645]: CRED_DISP pid=5645 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:57.096644 systemd[1]: Started sshd@22-10.0.0.12:22-10.0.0.1:47330.service. Mar 17 18:34:57.095000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.0.12:22-10.0.0.1:47330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:57.097179 systemd[1]: sshd@21-10.0.0.12:22-10.0.0.1:47326.service: Deactivated successfully. Mar 17 18:34:57.096000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.0.12:22-10.0.0.1:47326 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:57.098467 systemd[1]: session-22.scope: Deactivated successfully. Mar 17 18:34:57.098700 systemd-logind[1287]: Session 22 logged out. Waiting for processes to exit. Mar 17 18:34:57.100323 systemd-logind[1287]: Removed session 22. Mar 17 18:34:57.132000 audit[5675]: USER_ACCT pid=5675 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:57.133366 sshd[5675]: Accepted publickey for core from 10.0.0.1 port 47330 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:34:57.133000 audit[5675]: CRED_ACQ pid=5675 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:57.133000 audit[5675]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffdb3250f0 a2=3 a3=0 items=0 ppid=1 pid=5675 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:57.133000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:34:57.134852 sshd[5675]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:34:57.139602 systemd-logind[1287]: New session 23 of user core. 
Mar 17 18:34:57.140506 systemd[1]: Started session-23.scope. Mar 17 18:34:57.145000 audit[5675]: USER_START pid=5675 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:57.146000 audit[5679]: CRED_ACQ pid=5679 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:57.255190 sshd[5675]: pam_unix(sshd:session): session closed for user core Mar 17 18:34:57.255000 audit[5675]: USER_END pid=5675 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:57.255000 audit[5675]: CRED_DISP pid=5675 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:34:57.257000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.0.12:22-10.0.0.1:47330 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:34:57.257666 systemd[1]: sshd@22-10.0.0.12:22-10.0.0.1:47330.service: Deactivated successfully. Mar 17 18:34:57.259623 systemd[1]: session-23.scope: Deactivated successfully. Mar 17 18:34:57.260067 systemd-logind[1287]: Session 23 logged out. Waiting for processes to exit. Mar 17 18:34:57.261256 systemd-logind[1287]: Removed session 23. 
Mar 17 18:34:57.324479 kubelet[2208]: E0317 18:34:57.324435 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:57.341430 env[1305]: time="2025-03-17T18:34:57.341383646Z" level=info msg="CreateContainer within sandbox \"9606084e9268e23468a35429978cfee8dd5f8bc83cfe0411ea398666af6d43f9\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 18:34:57.354070 env[1305]: time="2025-03-17T18:34:57.353215210Z" level=info msg="CreateContainer within sandbox \"9606084e9268e23468a35429978cfee8dd5f8bc83cfe0411ea398666af6d43f9\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"c02926b9cec39759c4b52c58396a75932782c8287b0736fe835d48b0443b091c\"" Mar 17 18:34:57.354070 env[1305]: time="2025-03-17T18:34:57.353754377Z" level=info msg="StartContainer for \"c02926b9cec39759c4b52c58396a75932782c8287b0736fe835d48b0443b091c\"" Mar 17 18:34:57.400155 env[1305]: time="2025-03-17T18:34:57.400089963Z" level=info msg="StartContainer for \"c02926b9cec39759c4b52c58396a75932782c8287b0736fe835d48b0443b091c\" returns successfully" Mar 17 18:34:57.550000 audit[5742]: NETFILTER_CFG table=filter:125 family=2 entries=32 op=nft_register_rule pid=5742 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:57.550000 audit[5742]: SYSCALL arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffd84481b00 a2=0 a3=7ffd84481aec items=0 ppid=2398 pid=5742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:57.550000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:57.556000 audit[5742]: NETFILTER_CFG table=nat:126 family=2 entries=32 op=nft_register_chain pid=5742 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:57.556000 audit[5742]: SYSCALL arch=c000003e syscall=46 success=yes exit=9476 a0=3 a1=7ffd84481b00 a2=0 a3=7ffd84481aec items=0 ppid=2398 pid=5742 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:57.556000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:58.329085 kubelet[2208]: E0317 18:34:58.329036 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:58.633000 audit[5814]: AVC avc: denied { write } for pid=5814 comm="tee" name="fd" dev="proc" ino=34855 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:58.633000 audit[5814]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffe6e3d6a2a a2=241 a3=1b6 items=1 ppid=5784 pid=5814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.633000 audit: CWD cwd="/etc/service/enabled/bird6/log" Mar 17 18:34:58.633000 audit: PATH item=0 name="/dev/fd/63" inode=32998 dev=00:0c mode=010600 ouid=0 ogid=0 
rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:58.633000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:58.644000 audit[5842]: AVC avc: denied { write } for pid=5842 comm="tee" name="fd" dev="proc" ino=33009 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:58.644000 audit[5842]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc0f8c7a2a a2=241 a3=1b6 items=1 ppid=5795 pid=5842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.644000 audit: CWD cwd="/etc/service/enabled/confd/log" Mar 17 18:34:58.644000 audit: PATH item=0 name="/dev/fd/63" inode=33864 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:58.644000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:58.647000 audit[5846]: AVC avc: denied { write } for pid=5846 comm="tee" name="fd" dev="proc" ino=33015 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:58.647000 audit[5846]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc20823a1a a2=241 a3=1b6 items=1 ppid=5801 pid=5846 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.647000 audit: CWD cwd="/etc/service/enabled/allocate-tunnel-addrs/log" Mar 17 18:34:58.647000 audit: PATH item=0 name="/dev/fd/63" inode=33866 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:58.647000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:58.663000 audit[5848]: AVC avc: denied { write } for pid=5848 comm="tee" name="fd" dev="proc" ino=33019 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:58.663000 audit[5848]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffe08f35a2c a2=241 a3=1b6 items=1 ppid=5785 pid=5848 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.663000 audit: CWD cwd="/etc/service/enabled/cni/log" Mar 17 18:34:58.663000 audit: PATH item=0 name="/dev/fd/63" inode=32340 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:58.663000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:58.665000 audit[5851]: AVC avc: denied { write } for pid=5851 comm="tee" name="fd" dev="proc" 
ino=33871 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:58.665000 audit[5851]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc68c14a1b a2=241 a3=1b6 items=1 ppid=5794 pid=5851 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.665000 audit: CWD cwd="/etc/service/enabled/node-status-reporter/log" Mar 17 18:34:58.665000 audit: PATH item=0 name="/dev/fd/63" inode=32343 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:58.665000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:58.669000 audit[5863]: AVC avc: denied { write } for pid=5863 comm="tee" name="fd" dev="proc" ino=33025 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:58.669000 audit[5863]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffcdbf5da2a a2=241 a3=1b6 items=1 ppid=5786 pid=5863 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.669000 audit: CWD cwd="/etc/service/enabled/felix/log" Mar 17 18:34:58.669000 audit: PATH item=0 name="/dev/fd/63" inode=34867 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:58.669000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:58.685000 audit[5831]: AVC avc: denied { write } for pid=5831 comm="tee" name="fd" dev="proc" ino=34870 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:34:58.685000 audit[5831]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffde56c8a2b a2=241 a3=1b6 items=1 ppid=5792 pid=5831 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.685000 audit: CWD cwd="/etc/service/enabled/bird/log" Mar 17 18:34:58.685000 audit: PATH item=0 name="/dev/fd/63" inode=34863 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:34:58.685000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:34:58.771000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.771000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.771000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.771000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.771000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.771000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.771000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.771000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.771000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.771000 audit: BPF prog-id=29 op=LOAD Mar 17 18:34:58.771000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc5fcdd4b0 a2=98 a3=3 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.771000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.771000 audit: BPF prog-id=29 op=UNLOAD Mar 17 18:34:58.772000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.772000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.772000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.772000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.772000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.772000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.772000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.772000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.772000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.772000 audit: BPF prog-id=30 op=LOAD Mar 17 18:34:58.772000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc5fcdd290 a2=74 a3=540051 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.772000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.773000 audit: BPF prog-id=30 op=UNLOAD Mar 17 18:34:58.773000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.773000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.773000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.773000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.773000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.773000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.773000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.773000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.773000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.773000 audit: BPF prog-id=31 op=LOAD Mar 17 18:34:58.773000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc5fcdd2c0 a2=94 a3=2 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.773000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.773000 audit: BPF prog-id=31 op=UNLOAD Mar 17 18:34:58.886000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 
18:34:58.886000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.886000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.886000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.886000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.886000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.886000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.886000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.886000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.886000 audit: BPF prog-id=32 op=LOAD Mar 17 18:34:58.886000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffc5fcdd180 a2=40 a3=1 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.886000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.886000 audit: BPF prog-id=32 op=UNLOAD Mar 17 18:34:58.886000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.886000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=0 a1=7ffc5fcdd250 a2=50 a3=7ffc5fcdd330 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.886000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc5fcdd190 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 
18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc5fcdd1c0 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc5fcdd0d0 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc5fcdd1e0 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc5fcdd1c0 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc5fcdd1b0 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc5fcdd1e0 a2=28 a3=0 items=0 ppid=5789 
pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc5fcdd1c0 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc5fcdd1e0 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc5fcdd1b0 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffc5fcdd220 a2=28 a3=0 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffc5fcdcfd0 a2=50 a3=1 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } 
for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit: BPF prog-id=33 op=LOAD Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc5fcdcfd0 a2=94 a3=5 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit: BPF prog-id=33 op=UNLOAD Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffc5fcdd080 a2=50 a3=1 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=16 a1=7ffc5fcdd1a0 a2=4 a3=38 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { confidentiality } for pid=5900 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffc5fcdd1f0 a2=94 a3=6 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { confidentiality } for pid=5900 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffc5fcdc9a0 a2=94 a3=83 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 
18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { perfmon } for pid=5900 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { bpf } for pid=5900 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.894000 audit[5900]: AVC avc: denied { confidentiality } for pid=5900 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:58.894000 audit[5900]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffc5fcdc9a0 a2=94 a3=83 items=0 ppid=5789 pid=5900 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.894000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit: BPF prog-id=34 op=LOAD Mar 17 18:34:58.903000 audit[5904]: 
SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffece0afdd0 a2=98 a3=1999999999999999 items=0 ppid=5789 pid=5904 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.903000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:34:58.903000 audit: BPF prog-id=34 op=UNLOAD Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit: BPF prog-id=35 op=LOAD Mar 17 18:34:58.903000 audit[5904]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffece0afcb0 a2=74 a3=ffff items=0 ppid=5789 pid=5904 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.903000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:34:58.903000 audit: BPF prog-id=35 op=UNLOAD Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 
permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { perfmon } for pid=5904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit[5904]: AVC avc: denied { bpf } for pid=5904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.903000 audit: BPF prog-id=36 op=LOAD Mar 17 18:34:58.903000 audit[5904]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffece0afcf0 a2=40 a3=7ffece0afed0 items=0 ppid=5789 pid=5904 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.903000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:34:58.903000 audit: BPF prog-id=36 op=UNLOAD Mar 17 18:34:58.942162 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2-rootfs.mount: Deactivated successfully. 
Mar 17 18:34:58.943896 env[1305]: time="2025-03-17T18:34:58.943848033Z" level=info msg="shim disconnected" id=e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2 Mar 17 18:34:58.943896 env[1305]: time="2025-03-17T18:34:58.943893400Z" level=warning msg="cleaning up after shim disconnected" id=e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2 namespace=k8s.io Mar 17 18:34:58.944309 env[1305]: time="2025-03-17T18:34:58.943902167Z" level=info msg="cleaning up dead shim" Mar 17 18:34:58.951783 env[1305]: time="2025-03-17T18:34:58.951747471Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:58Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=5934 runtime=io.containerd.runc.v2\n" Mar 17 18:34:58.962000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.962000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.962000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.962000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.962000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.962000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.962000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.962000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.962000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.962000 audit: BPF prog-id=37 op=LOAD Mar 17 18:34:58.962000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc6cf1deb0 a2=98 a3=ffffffff items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.962000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.962000 audit: BPF prog-id=37 op=UNLOAD Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit: BPF prog-id=38 op=LOAD Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc6cf1dcc0 a2=74 a3=540051 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit: BPF prog-id=38 op=UNLOAD Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 
audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit: BPF prog-id=39 op=LOAD Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffc6cf1dcf0 a2=94 a3=2 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit: BPF prog-id=39 op=UNLOAD Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffc6cf1dbc0 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc6cf1dbf0 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc6cf1db00 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffc6cf1dc10 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffc6cf1dbf0 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffc6cf1dbe0 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffc6cf1dc10 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc6cf1dbf0 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc6cf1dc10 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffc6cf1dbe0 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffc6cf1dc50 a2=28 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.963000 audit: BPF prog-id=40 op=LOAD Mar 17 18:34:58.963000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc6cf1dac0 a2=40 a3=0 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.963000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.963000 audit: BPF prog-id=40 op=UNLOAD Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=0 a1=7ffc6cf1dab0 a2=50 a3=2800 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.964000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 
18:34:58.964000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=0 a1=7ffc6cf1dab0 a2=50 a3=2800 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.964000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit: BPF prog-id=41 op=LOAD Mar 17 18:34:58.964000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc6cf1d2d0 a2=94 a3=2 items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.964000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.964000 audit: BPF prog-id=41 op=UNLOAD Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { perfmon } for pid=5953 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit[5953]: AVC avc: denied { bpf } for pid=5953 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.964000 audit: BPF prog-id=42 op=LOAD Mar 17 18:34:58.964000 audit[5953]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffc6cf1d3d0 a2=94 a3=2d items=0 ppid=5789 pid=5953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.964000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:34:58.968000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.968000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.968000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.968000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 
18:34:58.968000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.968000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.968000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.968000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.968000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.968000 audit: BPF prog-id=43 op=LOAD Mar 17 18:34:58.968000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffe7cf74d50 a2=98 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.968000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:58.969000 audit: BPF prog-id=43 op=UNLOAD Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit: BPF prog-id=44 op=LOAD Mar 17 18:34:58.969000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe7cf74b30 a2=74 a3=540051 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.969000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:58.969000 audit: BPF prog-id=44 op=UNLOAD Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:58.969000 audit: BPF prog-id=45 op=LOAD Mar 17 18:34:58.969000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe7cf74b60 a2=94 a3=2 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:58.969000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:58.969000 audit: BPF prog-id=45 op=UNLOAD Mar 17 18:34:58.975757 env[1305]: time="2025-03-17T18:34:58.975728850Z" level=info msg="StopContainer for \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\" 
returns successfully" Mar 17 18:34:58.979598 env[1305]: time="2025-03-17T18:34:58.979579093Z" level=info msg="StopPodSandbox for \"4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7\"" Mar 17 18:34:58.979765 env[1305]: time="2025-03-17T18:34:58.979741164Z" level=info msg="Container to stop \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Mar 17 18:34:58.986376 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7-shm.mount: Deactivated successfully. Mar 17 18:34:59.007354 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7-rootfs.mount: Deactivated successfully. Mar 17 18:34:59.008717 env[1305]: time="2025-03-17T18:34:59.008652092Z" level=info msg="shim disconnected" id=4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7 Mar 17 18:34:59.008717 env[1305]: time="2025-03-17T18:34:59.008707359Z" level=warning msg="cleaning up after shim disconnected" id=4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7 namespace=k8s.io Mar 17 18:34:59.008717 env[1305]: time="2025-03-17T18:34:59.008716686Z" level=info msg="cleaning up dead shim" Mar 17 18:34:59.015208 env[1305]: time="2025-03-17T18:34:59.015178883Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:34:59Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=5985 runtime=io.containerd.runc.v2\n" Mar 17 18:34:59.026406 env[1305]: time="2025-03-17T18:34:59.026361060Z" level=info msg="TearDown network for sandbox \"4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7\" successfully" Mar 17 18:34:59.026406 env[1305]: time="2025-03-17T18:34:59.026391648Z" level=info msg="StopPodSandbox for \"4131ffb2663af143cac7cdba9c7e48cc9dc53ba358504a606d275ad12a3a04d7\" returns successfully" Mar 17 18:34:59.050389 kubelet[2208]: I0317 18:34:59.050316 2208 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-mdxrs" podStartSLOduration=5.045994866 podStartE2EDuration="5.045994866s" podCreationTimestamp="2025-03-17 18:34:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:34:58.34106163 +0000 UTC m=+80.378651240" watchObservedRunningTime="2025-03-17 18:34:59.045994866 +0000 UTC m=+81.083584486" Mar 17 18:34:59.061000 audit[5998]: NETFILTER_CFG table=filter:127 family=2 entries=32 op=nft_register_rule pid=5998 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:59.061000 audit[5998]: SYSCALL arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffda00ea250 a2=0 a3=7ffda00ea23c items=0 ppid=2398 pid=5998 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.061000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:59.066000 audit[5998]: NETFILTER_CFG table=nat:128 family=2 entries=30 op=nft_register_rule pid=5998 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:34:59.066000 audit[5998]: SYSCALL arch=c000003e syscall=46 success=yes exit=9348 a0=3 a1=7ffda00ea250 a2=0 a3=0 items=0 ppid=2398 pid=5998 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.066000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit: BPF prog-id=46 op=LOAD Mar 17 18:34:59.099000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffe7cf74a20 a2=40 a3=1 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.099000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.099000 audit: BPF prog-id=46 op=UNLOAD Mar 17 18:34:59.099000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.099000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=0 a1=7ffe7cf74af0 a2=50 a3=7ffe7cf74bd0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.099000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe7cf74a30 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe7cf74a60 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe7cf74970 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe7cf74a80 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e 
syscall=321 success=yes exit=4 a0=12 a1=7ffe7cf74a60 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe7cf74a50 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe7cf74a80 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe7cf74a60 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe7cf74a80 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffe7cf74a50 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffe7cf74ac0 a2=28 a3=0 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffe7cf74870 a2=50 a3=1 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit: BPF prog-id=47 op=LOAD Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffe7cf74870 a2=94 a3=5 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit: BPF prog-id=47 op=UNLOAD Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffe7cf74920 a2=50 a3=1 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=16 a1=7ffe7cf74a40 a2=4 a3=38 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 
audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.107000 audit[5959]: AVC avc: denied { confidentiality } for pid=5959 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:59.107000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffe7cf74a90 a2=94 a3=6 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.107000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { confidentiality } for pid=5959 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:59.108000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffe7cf74240 a2=94 a3=83 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.108000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { perfmon } for pid=5959 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { confidentiality } for pid=5959 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:34:59.108000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffe7cf74240 a2=94 a3=83 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.108000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffe7cf75c80 a2=10 a3=208 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.108000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffe7cf75b20 a2=10 a3=3 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.108000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffe7cf75ac0 a2=10 a3=3 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.108000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.108000 audit[5959]: AVC avc: denied { bpf } for pid=5959 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:34:59.108000 audit[5959]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7ffe7cf75ac0 a2=10 a3=7 items=0 ppid=5789 pid=5959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.108000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:34:59.114000 audit: BPF prog-id=42 op=UNLOAD Mar 17 18:34:59.156000 audit[6027]: NETFILTER_CFG table=filter:129 family=2 entries=46 op=nft_register_rule pid=6027 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:59.156000 audit[6027]: SYSCALL arch=c000003e syscall=46 success=yes exit=8196 a0=3 a1=7ffc842d3870 a2=0 a3=7ffc842d385c items=0 ppid=5789 pid=6027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.156000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:59.157000 audit[6027]: NETFILTER_CFG table=filter:130 family=2 entries=4 op=nft_unregister_chain pid=6027 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:34:59.157000 audit[6027]: SYSCALL arch=c000003e syscall=46 success=yes exit=592 a0=3 a1=7ffc842d3870 a2=0 a3=55c014626000 items=0 ppid=5789 pid=6027 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:34:59.157000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:34:59.194354 kubelet[2208]: I0317 18:34:59.194309 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/94704619-936c-454e-b51a-0971929c1171-typha-certs\") pod \"94704619-936c-454e-b51a-0971929c1171\" (UID: \"94704619-936c-454e-b51a-0971929c1171\") " Mar 17 18:34:59.194535 kubelet[2208]: I0317 18:34:59.194367 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"kube-api-access-trbpg\" (UniqueName: \"kubernetes.io/projected/94704619-936c-454e-b51a-0971929c1171-kube-api-access-trbpg\") pod \"94704619-936c-454e-b51a-0971929c1171\" (UID: \"94704619-936c-454e-b51a-0971929c1171\") " Mar 17 18:34:59.194535 kubelet[2208]: I0317 18:34:59.194392 2208 reconciler_common.go:161] "operationExecutor.UnmountVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94704619-936c-454e-b51a-0971929c1171-tigera-ca-bundle\") pod \"94704619-936c-454e-b51a-0971929c1171\" (UID: \"94704619-936c-454e-b51a-0971929c1171\") " Mar 17 
18:34:59.204225 systemd[1]: var-lib-kubelet-pods-94704619\x2d936c\x2d454e\x2db51a\x2d0971929c1171-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dtrbpg.mount: Deactivated successfully. Mar 17 18:34:59.204397 systemd[1]: var-lib-kubelet-pods-94704619\x2d936c\x2d454e\x2db51a\x2d0971929c1171-volumes-kubernetes.io\x7esecret-typha\x2dcerts.mount: Deactivated successfully. Mar 17 18:34:59.208199 kubelet[2208]: I0317 18:34:59.208141 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94704619-936c-454e-b51a-0971929c1171-typha-certs" (OuterVolumeSpecName: "typha-certs") pod "94704619-936c-454e-b51a-0971929c1171" (UID: "94704619-936c-454e-b51a-0971929c1171"). InnerVolumeSpecName "typha-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 17 18:34:59.224439 kubelet[2208]: I0317 18:34:59.224377 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94704619-936c-454e-b51a-0971929c1171-kube-api-access-trbpg" (OuterVolumeSpecName: "kube-api-access-trbpg") pod "94704619-936c-454e-b51a-0971929c1171" (UID: "94704619-936c-454e-b51a-0971929c1171"). InnerVolumeSpecName "kube-api-access-trbpg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 17 18:34:59.227701 kubelet[2208]: I0317 18:34:59.227660 2208 operation_generator.go:887] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94704619-936c-454e-b51a-0971929c1171-tigera-ca-bundle" (OuterVolumeSpecName: "tigera-ca-bundle") pod "94704619-936c-454e-b51a-0971929c1171" (UID: "94704619-936c-454e-b51a-0971929c1171"). InnerVolumeSpecName "tigera-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 17 18:34:59.294696 kubelet[2208]: I0317 18:34:59.294639 2208 reconciler_common.go:289] "Volume detached for volume \"kube-api-access-trbpg\" (UniqueName: \"kubernetes.io/projected/94704619-936c-454e-b51a-0971929c1171-kube-api-access-trbpg\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:59.294696 kubelet[2208]: I0317 18:34:59.294679 2208 reconciler_common.go:289] "Volume detached for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94704619-936c-454e-b51a-0971929c1171-tigera-ca-bundle\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:59.294696 kubelet[2208]: I0317 18:34:59.294689 2208 reconciler_common.go:289] "Volume detached for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/94704619-936c-454e-b51a-0971929c1171-typha-certs\") on node \"localhost\" DevicePath \"\"" Mar 17 18:34:59.339879 kubelet[2208]: E0317 18:34:59.339845 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:34:59.340765 kubelet[2208]: I0317 18:34:59.340681 2208 scope.go:117] "RemoveContainer" containerID="e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2" Mar 17 18:34:59.342328 env[1305]: time="2025-03-17T18:34:59.342287069Z" level=info msg="RemoveContainer for \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\"" Mar 17 18:34:59.354880 env[1305]: time="2025-03-17T18:34:59.354816914Z" level=info msg="RemoveContainer for \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\" returns successfully" Mar 17 18:34:59.355370 kubelet[2208]: I0317 18:34:59.355204 2208 scope.go:117] "RemoveContainer" containerID="e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2" Mar 17 18:34:59.356078 env[1305]: 
time="2025-03-17T18:34:59.355854729Z" level=error msg="ContainerStatus for \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\": not found" Mar 17 18:34:59.357434 kubelet[2208]: E0317 18:34:59.357379 2208 remote_runtime.go:432] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\": not found" containerID="e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2" Mar 17 18:34:59.357519 kubelet[2208]: I0317 18:34:59.357431 2208 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2"} err="failed to get container status \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\": rpc error: code = NotFound desc = an error occurred when try to find container \"e388241873734dbd037f2b2f2a2a4eb96eaea52e74b87f6726261b92365d10c2\": not found" Mar 17 18:34:59.942057 systemd[1]: var-lib-kubelet-pods-94704619\x2d936c\x2d454e\x2db51a\x2d0971929c1171-volume\x2dsubpaths-tigera\x2dca\x2dbundle-calico\x2dtypha-1.mount: Deactivated successfully. Mar 17 18:35:00.050474 kubelet[2208]: I0317 18:35:00.050429 2208 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94704619-936c-454e-b51a-0971929c1171" path="/var/lib/kubelet/pods/94704619-936c-454e-b51a-0971929c1171/volumes" Mar 17 18:35:02.258180 systemd[1]: Started sshd@23-10.0.0.12:22-10.0.0.1:47338.service. Mar 17 18:35:02.257000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.0.12:22-10.0.0.1:47338 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:35:02.259338 kernel: kauditd_printk_skb: 578 callbacks suppressed Mar 17 18:35:02.259419 kernel: audit: type=1130 audit(1742236502.257:658): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.0.12:22-10.0.0.1:47338 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:35:02.292000 audit[6058]: USER_ACCT pid=6058 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.297954 kernel: audit: type=1101 audit(1742236502.292:659): pid=6058 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.297994 kernel: audit: type=1103 audit(1742236502.297:660): pid=6058 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.297000 audit[6058]: CRED_ACQ pid=6058 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.298057 sshd[6058]: Accepted publickey for core from 10.0.0.1 port 47338 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:35:02.298126 sshd[6058]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:35:02.301987 systemd-logind[1287]: New session 24 of user core. Mar 17 18:35:02.302729 systemd[1]: Started session-24.scope. Mar 17 18:35:02.304341 kernel: audit: type=1006 audit(1742236502.297:661): pid=6058 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Mar 17 18:35:02.297000 audit[6058]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb5dc3440 a2=3 a3=0 items=0 ppid=1 pid=6058 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:02.308780 kernel: audit: type=1300 audit(1742236502.297:661): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb5dc3440 a2=3 a3=0 items=0 ppid=1 pid=6058 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:02.310441 kernel: audit: type=1327 audit(1742236502.297:661): proctitle=737368643A20636F7265205B707269765D Mar 17 18:35:02.297000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:35:02.306000 audit[6058]: USER_START pid=6058 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.316425 kernel: audit: type=1105 audit(1742236502.306:662): pid=6058 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.316519 kernel: audit: type=1103 audit(1742236502.307:663): pid=6061 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" 
exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.307000 audit[6061]: CRED_ACQ pid=6061 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.429604 sshd[6058]: pam_unix(sshd:session): session closed for user core Mar 17 18:35:02.429000 audit[6058]: USER_END pid=6058 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.431937 systemd[1]: sshd@23-10.0.0.12:22-10.0.0.1:47338.service: Deactivated successfully. Mar 17 18:35:02.433259 systemd[1]: session-24.scope: Deactivated successfully. Mar 17 18:35:02.434117 systemd-logind[1287]: Session 24 logged out. Waiting for processes to exit. Mar 17 18:35:02.435340 systemd-logind[1287]: Removed session 24. Mar 17 18:35:02.429000 audit[6058]: CRED_DISP pid=6058 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.439459 kernel: audit: type=1106 audit(1742236502.429:664): pid=6058 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.439513 kernel: audit: type=1104 audit(1742236502.429:665): pid=6058 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:02.429000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.0.12:22-10.0.0.1:47338 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:35:05.593000 audit[6081]: NETFILTER_CFG table=filter:131 family=2 entries=20 op=nft_register_rule pid=6081 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:35:05.593000 audit[6081]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffc67f11850 a2=0 a3=7ffc67f1183c items=0 ppid=2398 pid=6081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:05.593000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:35:05.600000 audit[6081]: NETFILTER_CFG table=nat:132 family=2 entries=106 op=nft_register_chain pid=6081 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:35:05.600000 audit[6081]: SYSCALL arch=c000003e syscall=46 success=yes exit=49452 a0=3 a1=7ffc67f11850 a2=0 a3=7ffc67f1183c items=0 ppid=2398 pid=6081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:05.600000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:35:07.432391 systemd[1]: Started sshd@24-10.0.0.12:22-10.0.0.1:52514.service. Mar 17 18:35:07.431000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.0.12:22-10.0.0.1:52514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:35:07.433477 kernel: kauditd_printk_skb: 7 callbacks suppressed Mar 17 18:35:07.433527 kernel: audit: type=1130 audit(1742236507.431:669): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.0.12:22-10.0.0.1:52514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:35:07.463000 audit[6083]: USER_ACCT pid=6083 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.464525 sshd[6083]: Accepted publickey for core from 10.0.0.1 port 52514 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:35:07.465438 sshd[6083]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:35:07.464000 audit[6083]: CRED_ACQ pid=6083 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.469332 systemd-logind[1287]: New session 25 of user core. Mar 17 18:35:07.470327 systemd[1]: Started session-25.scope. 
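The proctitle= values in these audit records are hex-encoded: the audit subsystem encodes string fields that contain non-printable bytes, and a process title carries NUL separators between argv entries. A minimal sketch for turning such a value back into a readable command line (the helper name is illustrative; the sample hex is copied verbatim from the iptables-restore records above):

    # Decode an audit PROCTITLE value (hex-encoded argv with NUL separators) into a command line.
    def decode_proctitle(hex_value: str) -> str:
        return bytes.fromhex(hex_value).replace(b"\x00", b" ").decode("utf-8", errors="replace")

    print(decode_proctitle(
        "69707461626C65732D726573746F7265002D770035002D5700313030303030"
        "002D2D6E6F666C757368002D2D636F756E74657273"))
    # prints: iptables-restore -w 5 -W 100000 --noflush --counters

Applied to the bpftool records earlier in the log, the same helper yields, for example, "bpftool prog load /usr/lib/calico/bpf/filter.o /sys/fs/bpf/calico/xdp/prefilter_v1_calico_tmp_A type xdp".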
Mar 17 18:35:07.473463 kernel: audit: type=1101 audit(1742236507.463:670): pid=6083 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.473511 kernel: audit: type=1103 audit(1742236507.464:671): pid=6083 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.476759 kernel: audit: type=1006 audit(1742236507.464:672): pid=6083 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Mar 17 18:35:07.476831 kernel: audit: type=1300 audit(1742236507.464:672): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcee22df70 a2=3 a3=0 items=0 ppid=1 pid=6083 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:07.464000 audit[6083]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcee22df70 a2=3 a3=0 items=0 ppid=1 pid=6083 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:07.464000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:35:07.482747 kernel: audit: type=1327 audit(1742236507.464:672): proctitle=737368643A20636F7265205B707269765D Mar 17 18:35:07.482790 kernel: audit: type=1105 audit(1742236507.474:673): pid=6083 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.474000 audit[6083]: USER_START pid=6083 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.475000 audit[6086]: CRED_ACQ pid=6086 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.490650 kernel: audit: type=1103 audit(1742236507.475:674): pid=6086 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.571763 sshd[6083]: pam_unix(sshd:session): session closed for user core Mar 17 18:35:07.571000 audit[6083]: USER_END pid=6083 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.573970 systemd[1]: sshd@24-10.0.0.12:22-10.0.0.1:52514.service: Deactivated successfully. Mar 17 18:35:07.575238 systemd[1]: session-25.scope: Deactivated successfully. 
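The kernel "audit: type=NNNN audit(&lt;epoch&gt;:&lt;serial&gt;)" lines are kauditd's echo of the userspace audit records interleaved with them; the serial number after the colon identifies the event, and the numeric type corresponds to a named record. The pairings visible in this log are 1101=USER_ACCT, 1103=CRED_ACQ, 1104=CRED_DISP, 1105=USER_START, 1106=USER_END, 1130=SERVICE_START, 1300=SYSCALL, 1327=PROCTITLE, plus 1006=LOGIN. A rough sketch for pulling those identifiers out of a line; the AUDIT_TYPES table and summarize helper are illustrative and only cover the types that appear here:

```python
import re

# Names for the numeric record types that appear in this log; each pairing
# can be read off the interleaved kernel "audit: type=NNNN" lines and the
# corresponding named audit records.
AUDIT_TYPES = {
    1006: "LOGIN",
    1101: "USER_ACCT",
    1103: "CRED_ACQ",
    1104: "CRED_DISP",
    1105: "USER_START",
    1106: "USER_END",
    1130: "SERVICE_START",
    1300: "SYSCALL",
    1327: "PROCTITLE",
}

# The "audit(<epoch>.<ms>:<serial>)" token ties the kernel echo of a record
# to the record itself; lines sharing a serial describe the same event.
AUDIT_ID = re.compile(r"type=(\d+) audit\((\d+\.\d+):(\d+)\)")

def summarize(line: str):
    """Return (type_name, epoch, serial) for a kernel audit line, else None."""
    m = AUDIT_ID.search(line)
    if not m:
        return None
    rec_type, epoch, serial = int(m.group(1)), m.group(2), int(m.group(3))
    return AUDIT_TYPES.get(rec_type, f"type {rec_type}"), epoch, serial

print(summarize("kernel: audit: type=1105 audit(1742236507.474:673): pid=6083 ..."))
# ('USER_START', '1742236507.474', 673)
```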
Mar 17 18:35:07.575284 systemd-logind[1287]: Session 25 logged out. Waiting for processes to exit. Mar 17 18:35:07.576352 systemd-logind[1287]: Removed session 25. Mar 17 18:35:07.577952 kernel: audit: type=1106 audit(1742236507.571:675): pid=6083 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.578006 kernel: audit: type=1104 audit(1742236507.571:676): pid=6083 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.571000 audit[6083]: CRED_DISP pid=6083 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:07.573000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.0.12:22-10.0.0.1:52514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:35:08.053866 kubelet[2208]: E0317 18:35:08.053814 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:35:09.049188 kubelet[2208]: E0317 18:35:09.049116 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:35:12.049271 kubelet[2208]: E0317 18:35:12.049222 2208 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8" Mar 17 18:35:12.574534 systemd[1]: Started sshd@25-10.0.0.12:22-10.0.0.1:52528.service. Mar 17 18:35:12.573000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.0.12:22-10.0.0.1:52528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:35:12.575520 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:35:12.575587 kernel: audit: type=1130 audit(1742236512.573:678): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.0.12:22-10.0.0.1:52528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:35:12.605000 audit[6106]: USER_ACCT pid=6106 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.606472 sshd[6106]: Accepted publickey for core from 10.0.0.1 port 52528 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:35:12.609824 sshd[6106]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:35:12.608000 audit[6106]: CRED_ACQ pid=6106 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.613701 kernel: audit: type=1101 audit(1742236512.605:679): pid=6106 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.613751 kernel: audit: type=1103 audit(1742236512.608:680): pid=6106 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.613769 kernel: audit: type=1006 audit(1742236512.608:681): pid=6106 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Mar 17 18:35:12.613373 systemd-logind[1287]: New session 26 of user core. Mar 17 18:35:12.614108 systemd[1]: Started session-26.scope. 
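The recurring kubelet dns.go "Nameserver limits exceeded" errors above reflect the resolver's three-nameserver limit: when more nameservers are configured than can be applied, only the first three are kept (1.1.1.1 1.0.0.1 8.8.8.8 in these entries) and the rest are omitted. The sketch below is a rough Python approximation of that kind of check, not kubelet's actual (Go) code; the four-nameserver resolv.conf sample is hypothetical:

```python
# Illustrative only: a rough approximation of the check behind the kubelet
# "Nameserver limits exceeded" messages above. The resolver applies at most
# three nameservers, so extra resolv.conf entries are dropped.
MAX_NAMESERVERS = 3  # glibc resolver limit (MAXNS)

def applied_nameservers(resolv_conf_text: str) -> tuple[list[str], bool]:
    servers = [
        line.split()[1]
        for line in resolv_conf_text.splitlines()
        if line.strip().startswith("nameserver") and len(line.split()) > 1
    ]
    return servers[:MAX_NAMESERVERS], len(servers) > MAX_NAMESERVERS

# Hypothetical resolv.conf with four nameservers: the fourth would be omitted
# and a "Nameserver limits exceeded" style warning would be justified.
sample = "nameserver 1.1.1.1\nnameserver 1.0.0.1\nnameserver 8.8.8.8\nnameserver 9.9.9.9\n"
kept, exceeded = applied_nameservers(sample)
print(kept, exceeded)  # ['1.1.1.1', '1.0.0.1', '8.8.8.8'] True
```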
Mar 17 18:35:12.608000 audit[6106]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc13d167a0 a2=3 a3=0 items=0 ppid=1 pid=6106 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:12.620560 kernel: audit: type=1300 audit(1742236512.608:681): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc13d167a0 a2=3 a3=0 items=0 ppid=1 pid=6106 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:12.620609 kernel: audit: type=1327 audit(1742236512.608:681): proctitle=737368643A20636F7265205B707269765D Mar 17 18:35:12.608000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:35:12.621836 kernel: audit: type=1105 audit(1742236512.618:682): pid=6106 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.618000 audit[6106]: USER_START pid=6106 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.626040 kernel: audit: type=1103 audit(1742236512.619:683): pid=6109 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.619000 audit[6109]: CRED_ACQ pid=6109 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.714822 sshd[6106]: pam_unix(sshd:session): session closed for user core Mar 17 18:35:12.714000 audit[6106]: USER_END pid=6106 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.717483 systemd[1]: sshd@25-10.0.0.12:22-10.0.0.1:52528.service: Deactivated successfully. Mar 17 18:35:12.718461 systemd-logind[1287]: Session 26 logged out. Waiting for processes to exit. Mar 17 18:35:12.718527 systemd[1]: session-26.scope: Deactivated successfully. Mar 17 18:35:12.719551 systemd-logind[1287]: Removed session 26. 
Mar 17 18:35:12.715000 audit[6106]: CRED_DISP pid=6106 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.723727 kernel: audit: type=1106 audit(1742236512.714:684): pid=6106 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.723773 kernel: audit: type=1104 audit(1742236512.715:685): pid=6106 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:12.716000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.0.0.12:22-10.0.0.1:52528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:35:17.716000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.0.12:22-10.0.0.1:41988 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:35:17.717669 systemd[1]: Started sshd@26-10.0.0.12:22-10.0.0.1:41988.service. Mar 17 18:35:17.718783 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:35:17.718934 kernel: audit: type=1130 audit(1742236517.716:687): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.0.12:22-10.0.0.1:41988 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:35:17.748000 audit[6120]: USER_ACCT pid=6120 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.749424 sshd[6120]: Accepted publickey for core from 10.0.0.1 port 41988 ssh2: RSA SHA256:DYcGKLA+BUI3KXBOyjzF6/uTec/cV0nLMAEcssN4/64 Mar 17 18:35:17.751286 sshd[6120]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:35:17.750000 audit[6120]: CRED_ACQ pid=6120 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.755019 systemd-logind[1287]: New session 27 of user core. Mar 17 18:35:17.755826 systemd[1]: Started session-27.scope. 
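The "Accepted publickey ... SHA256:DYcGKLA+..." lines identify the client key by its OpenSSH fingerprint, i.e. the SHA-256 digest of the raw public-key blob, base64-encoded with the trailing padding stripped. A small sketch, assuming a standard authorized_keys-style input line; nothing here reproduces the actual key from the log:

```python
import base64
import hashlib

def ssh_fingerprint(authorized_key_line: str) -> str:
    """OpenSSH-style "SHA256:..." fingerprint of a public key line.

    authorized_keys format: "<type> <base64 blob> [comment]".
    """
    blob = base64.b64decode(authorized_key_line.split()[1])
    digest = hashlib.sha256(blob).digest()
    # OpenSSH prints the digest base64-encoded without '=' padding
    return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

# Hypothetical usage (path and key are the reader's, not taken from this log):
# print(ssh_fingerprint(open("authorized_keys").read().splitlines()[0]))
```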
Mar 17 18:35:17.756808 kernel: audit: type=1101 audit(1742236517.748:688): pid=6120 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.756849 kernel: audit: type=1103 audit(1742236517.750:689): pid=6120 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.759147 kernel: audit: type=1006 audit(1742236517.750:690): pid=6120 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Mar 17 18:35:17.750000 audit[6120]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff42e8b7e0 a2=3 a3=0 items=0 ppid=1 pid=6120 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:17.763378 kernel: audit: type=1300 audit(1742236517.750:690): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff42e8b7e0 a2=3 a3=0 items=0 ppid=1 pid=6120 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:35:17.763412 kernel: audit: type=1327 audit(1742236517.750:690): proctitle=737368643A20636F7265205B707269765D Mar 17 18:35:17.750000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:35:17.758000 audit[6120]: USER_START pid=6120 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.768833 kernel: audit: type=1105 audit(1742236517.758:691): pid=6120 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.768866 kernel: audit: type=1103 audit(1742236517.760:692): pid=6123 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.760000 audit[6123]: CRED_ACQ pid=6123 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.857754 sshd[6120]: pam_unix(sshd:session): session closed for user core Mar 17 18:35:17.857000 audit[6120]: USER_END pid=6120 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.860069 systemd[1]: sshd@26-10.0.0.12:22-10.0.0.1:41988.service: Deactivated successfully. Mar 17 18:35:17.861096 systemd[1]: session-27.scope: Deactivated successfully. 
Mar 17 18:35:17.861137 systemd-logind[1287]: Session 27 logged out. Waiting for processes to exit. Mar 17 18:35:17.861939 systemd-logind[1287]: Removed session 27. Mar 17 18:35:17.857000 audit[6120]: CRED_DISP pid=6120 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.866430 kernel: audit: type=1106 audit(1742236517.857:693): pid=6120 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.866484 kernel: audit: type=1104 audit(1742236517.857:694): pid=6120 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.0.0.1 addr=10.0.0.1 terminal=ssh res=success' Mar 17 18:35:17.859000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-10.0.0.12:22-10.0.0.1:41988 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'