Mar 17 18:47:38.022527 kernel: Linux version 5.15.179-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 11.3.1_p20221209 p3) 11.3.1 20221209, GNU ld (Gentoo 2.39 p5) 2.39.0) #1 SMP Mon Mar 17 17:12:34 -00 2025 Mar 17 18:47:38.022559 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=249ccd113f901380672c0d31e18f792e8e0344094c0e39eedc449f039418b31a Mar 17 18:47:38.022574 kernel: BIOS-provided physical RAM map: Mar 17 18:47:38.022585 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable Mar 17 18:47:38.022595 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved Mar 17 18:47:38.022604 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003ff40fff] usable Mar 17 18:47:38.022620 kernel: BIOS-e820: [mem 0x000000003ff41000-0x000000003ffc8fff] reserved Mar 17 18:47:38.022631 kernel: BIOS-e820: [mem 0x000000003ffc9000-0x000000003fffafff] ACPI data Mar 17 18:47:38.022642 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS Mar 17 18:47:38.022653 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable Mar 17 18:47:38.022663 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable Mar 17 18:47:38.022674 kernel: printk: bootconsole [earlyser0] enabled Mar 17 18:47:38.022685 kernel: NX (Execute Disable) protection: active Mar 17 18:47:38.022695 kernel: efi: EFI v2.70 by Microsoft Mar 17 18:47:38.022712 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff85000 SMBIOS 3.0=0x3ff83000 MEMATTR=0x3f5c8a98 RNG=0x3ffd1018 Mar 17 18:47:38.022724 kernel: random: crng init done Mar 17 18:47:38.022735 kernel: SMBIOS 3.1.0 present. 
Mar 17 18:47:38.022747 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 03/08/2024 Mar 17 18:47:38.022759 kernel: Hypervisor detected: Microsoft Hyper-V Mar 17 18:47:38.022771 kernel: Hyper-V: privilege flags low 0x2e7f, high 0x3b8030, hints 0x64e24, misc 0xbed7b2 Mar 17 18:47:38.022782 kernel: Hyper-V Host Build:20348-10.0-1-0.1799 Mar 17 18:47:38.022794 kernel: Hyper-V: Nested features: 0x1e0101 Mar 17 18:47:38.022808 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40 Mar 17 18:47:38.022819 kernel: Hyper-V: Using hypercall for remote TLB flush Mar 17 18:47:38.022830 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns Mar 17 18:47:38.022842 kernel: tsc: Marking TSC unstable due to running on Hyper-V Mar 17 18:47:38.022854 kernel: tsc: Detected 2593.906 MHz processor Mar 17 18:47:38.022867 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved Mar 17 18:47:38.022879 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable Mar 17 18:47:38.022891 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x400000000 Mar 17 18:47:38.022902 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT Mar 17 18:47:38.022914 kernel: e820: update [mem 0x40000000-0xffffffff] usable ==> reserved Mar 17 18:47:38.022928 kernel: last_pfn = 0x40000 max_arch_pfn = 0x400000000 Mar 17 18:47:38.022940 kernel: Using GB pages for direct mapping Mar 17 18:47:38.022952 kernel: Secure boot disabled Mar 17 18:47:38.022964 kernel: ACPI: Early table checksum verification disabled Mar 17 18:47:38.022975 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL) Mar 17 18:47:38.022987 kernel: ACPI: XSDT 0x000000003FFF90E8 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 18:47:38.022999 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 18:47:38.023012 kernel: ACPI: DSDT 0x000000003FFD6000 01E184 (v02 MSFTVM DSDT01 00000001 MSFT 05000000) Mar 17 18:47:38.023031 kernel: ACPI: FACS 0x000000003FFFE000 000040 Mar 17 18:47:38.023044 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 18:47:38.023056 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 18:47:38.023068 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 18:47:38.023082 kernel: ACPI: APIC 0x000000003FFD5000 000058 (v04 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 18:47:38.023095 kernel: ACPI: SRAT 0x000000003FFD4000 0002D0 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 18:47:38.023110 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 18:47:38.023123 kernel: ACPI: FPDT 0x000000003FFD2000 000034 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001) Mar 17 18:47:38.023136 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113] Mar 17 18:47:38.023148 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff4183] Mar 17 18:47:38.023161 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f] Mar 17 18:47:38.023174 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063] Mar 17 18:47:38.023187 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f] Mar 17 18:47:38.023200 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027] Mar 17 18:47:38.023215 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5057] 
Mar 17 18:47:38.023228 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd42cf] Mar 17 18:47:38.023241 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037] Mar 17 18:47:38.023253 kernel: ACPI: Reserving FPDT table memory at [mem 0x3ffd2000-0x3ffd2033] Mar 17 18:47:38.023265 kernel: SRAT: PXM 0 -> APIC 0x00 -> Node 0 Mar 17 18:47:38.023278 kernel: SRAT: PXM 0 -> APIC 0x01 -> Node 0 Mar 17 18:47:38.023291 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff] hotplug Mar 17 18:47:38.023304 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff] hotplug Mar 17 18:47:38.023317 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2c0000000-0xfdfffffff] hotplug Mar 17 18:47:38.023332 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000-0xffffffffff] hotplug Mar 17 18:47:38.023345 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x10000000000-0x1ffffffffff] hotplug Mar 17 18:47:38.023358 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x20000000000-0x3ffffffffff] hotplug Mar 17 18:47:38.023371 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000000-0x7ffffffffff] hotplug Mar 17 18:47:38.023383 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x80000000000-0xfffffffffff] hotplug Mar 17 18:47:38.023396 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000000-0x1fffffffffff] hotplug Mar 17 18:47:38.023409 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x200000000000-0x3fffffffffff] hotplug Mar 17 18:47:38.023422 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x400000000000-0x7fffffffffff] hotplug Mar 17 18:47:38.023434 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x800000000000-0xffffffffffff] hotplug Mar 17 18:47:38.023450 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x1000000000000-0x1ffffffffffff] hotplug Mar 17 18:47:38.023477 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x2000000000000-0x3ffffffffffff] hotplug Mar 17 18:47:38.023498 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x4000000000000-0x7ffffffffffff] hotplug Mar 17 18:47:38.023511 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x8000000000000-0xfffffffffffff] hotplug Mar 17 18:47:38.023524 kernel: NUMA: Node 0 [mem 0x00000000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00000000-0x2bfffffff] Mar 17 18:47:38.023536 kernel: NODE_DATA(0) allocated [mem 0x2bfffa000-0x2bfffffff] Mar 17 18:47:38.023549 kernel: Zone ranges: Mar 17 18:47:38.023561 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] Mar 17 18:47:38.023573 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff] Mar 17 18:47:38.023589 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff] Mar 17 18:47:38.023602 kernel: Movable zone start for each node Mar 17 18:47:38.023614 kernel: Early memory node ranges Mar 17 18:47:38.023626 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff] Mar 17 18:47:38.023638 kernel: node 0: [mem 0x0000000000100000-0x000000003ff40fff] Mar 17 18:47:38.023651 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff] Mar 17 18:47:38.023663 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff] Mar 17 18:47:38.023675 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff] Mar 17 18:47:38.023688 kernel: On node 0, zone DMA: 1 pages in unavailable ranges Mar 17 18:47:38.023701 kernel: On node 0, zone DMA: 96 pages in unavailable ranges Mar 17 18:47:38.023711 kernel: On node 0, zone DMA32: 190 pages in unavailable ranges Mar 17 18:47:38.023723 kernel: ACPI: PM-Timer IO Port: 0x408 Mar 17 18:47:38.023735 kernel: ACPI: LAPIC_NMI (acpi_id[0x01] dfl dfl lint[0x1]) Mar 17 18:47:38.023745 kernel: IOAPIC[0]: apic_id 2, version 17, address 0xfec00000, GSI 0-23 Mar 17 
18:47:38.023763 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) Mar 17 18:47:38.023774 kernel: ACPI: Using ACPI (MADT) for SMP configuration information Mar 17 18:47:38.023786 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200 Mar 17 18:47:38.023797 kernel: smpboot: Allowing 2 CPUs, 0 hotplug CPUs Mar 17 18:47:38.023813 kernel: [mem 0x40000000-0xffffffff] available for PCI devices Mar 17 18:47:38.023825 kernel: Booting paravirtualized kernel on Hyper-V Mar 17 18:47:38.023838 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns Mar 17 18:47:38.023851 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:512 nr_cpu_ids:2 nr_node_ids:1 Mar 17 18:47:38.023863 kernel: percpu: Embedded 56 pages/cpu s188696 r8192 d32488 u1048576 Mar 17 18:47:38.023876 kernel: pcpu-alloc: s188696 r8192 d32488 u1048576 alloc=1*2097152 Mar 17 18:47:38.023888 kernel: pcpu-alloc: [0] 0 1 Mar 17 18:47:38.023901 kernel: Hyper-V: PV spinlocks enabled Mar 17 18:47:38.023914 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) Mar 17 18:47:38.023929 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2062618 Mar 17 18:47:38.023941 kernel: Policy zone: Normal Mar 17 18:47:38.023956 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=249ccd113f901380672c0d31e18f792e8e0344094c0e39eedc449f039418b31a Mar 17 18:47:38.023969 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. Mar 17 18:47:38.023982 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear) Mar 17 18:47:38.023995 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Mar 17 18:47:38.024007 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Mar 17 18:47:38.024021 kernel: Memory: 8079144K/8387460K available (12294K kernel code, 2278K rwdata, 13724K rodata, 47472K init, 4108K bss, 308056K reserved, 0K cma-reserved) Mar 17 18:47:38.024035 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Mar 17 18:47:38.024049 kernel: ftrace: allocating 34580 entries in 136 pages Mar 17 18:47:38.024070 kernel: ftrace: allocated 136 pages with 2 groups Mar 17 18:47:38.024086 kernel: rcu: Hierarchical RCU implementation. Mar 17 18:47:38.024099 kernel: rcu: RCU event tracing is enabled. Mar 17 18:47:38.024112 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Mar 17 18:47:38.024126 kernel: Rude variant of Tasks RCU enabled. Mar 17 18:47:38.024139 kernel: Tracing variant of Tasks RCU enabled. Mar 17 18:47:38.024153 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. 
Mar 17 18:47:38.024166 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Mar 17 18:47:38.024179 kernel: Using NULL legacy PIC Mar 17 18:47:38.024195 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0 Mar 17 18:47:38.024209 kernel: Console: colour dummy device 80x25 Mar 17 18:47:38.024222 kernel: printk: console [tty1] enabled Mar 17 18:47:38.024235 kernel: printk: console [ttyS0] enabled Mar 17 18:47:38.024249 kernel: printk: bootconsole [earlyser0] disabled Mar 17 18:47:38.024264 kernel: ACPI: Core revision 20210730 Mar 17 18:47:38.024278 kernel: Failed to register legacy timer interrupt Mar 17 18:47:38.024291 kernel: APIC: Switch to symmetric I/O mode setup Mar 17 18:47:38.024304 kernel: Hyper-V: Using IPI hypercalls Mar 17 18:47:38.024318 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 5187.81 BogoMIPS (lpj=2593906) Mar 17 18:47:38.024343 kernel: Last level iTLB entries: 4KB 64, 2MB 8, 4MB 8 Mar 17 18:47:38.024356 kernel: Last level dTLB entries: 4KB 64, 2MB 32, 4MB 32, 1GB 4 Mar 17 18:47:38.024369 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization Mar 17 18:47:38.024382 kernel: Spectre V2 : Mitigation: Retpolines Mar 17 18:47:38.024395 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch Mar 17 18:47:38.024410 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT Mar 17 18:47:38.024423 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible! Mar 17 18:47:38.024436 kernel: RETBleed: Vulnerable Mar 17 18:47:38.024449 kernel: Speculative Store Bypass: Vulnerable Mar 17 18:47:38.024472 kernel: TAA: Vulnerable: Clear CPU buffers attempted, no microcode Mar 17 18:47:38.024484 kernel: MMIO Stale Data: Vulnerable: Clear CPU buffers attempted, no microcode Mar 17 18:47:38.024497 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' Mar 17 18:47:38.024510 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' Mar 17 18:47:38.024522 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' Mar 17 18:47:38.024535 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask' Mar 17 18:47:38.024551 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256' Mar 17 18:47:38.024563 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256' Mar 17 18:47:38.024576 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 Mar 17 18:47:38.024589 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64 Mar 17 18:47:38.024602 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512 Mar 17 18:47:38.024614 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024 Mar 17 18:47:38.024627 kernel: x86/fpu: Enabled xstate features 0xe7, context size is 2432 bytes, using 'compacted' format. Mar 17 18:47:38.024640 kernel: Freeing SMP alternatives memory: 32K Mar 17 18:47:38.024652 kernel: pid_max: default: 32768 minimum: 301 Mar 17 18:47:38.024665 kernel: LSM: Security Framework initializing Mar 17 18:47:38.024677 kernel: SELinux: Initializing. 
Mar 17 18:47:38.024690 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) Mar 17 18:47:38.024705 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear) Mar 17 18:47:38.024718 kernel: smpboot: CPU0: Intel(R) Xeon(R) Platinum 8272CL CPU @ 2.60GHz (family: 0x6, model: 0x55, stepping: 0x7) Mar 17 18:47:38.024731 kernel: Performance Events: unsupported p6 CPU model 85 no PMU driver, software events only. Mar 17 18:47:38.024745 kernel: signal: max sigframe size: 3632 Mar 17 18:47:38.024758 kernel: rcu: Hierarchical SRCU implementation. Mar 17 18:47:38.024771 kernel: NMI watchdog: Perf NMI watchdog permanently disabled Mar 17 18:47:38.024784 kernel: smp: Bringing up secondary CPUs ... Mar 17 18:47:38.024796 kernel: x86: Booting SMP configuration: Mar 17 18:47:38.024809 kernel: .... node #0, CPUs: #1 Mar 17 18:47:38.024822 kernel: TAA CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/tsx_async_abort.html for more details. Mar 17 18:47:38.024838 kernel: MMIO Stale Data CPU bug present and SMT on, data leak possible. See https://www.kernel.org/doc/html/latest/admin-guide/hw-vuln/processor_mmio_stale_data.html for more details. Mar 17 18:47:38.024851 kernel: smp: Brought up 1 node, 2 CPUs Mar 17 18:47:38.024864 kernel: smpboot: Max logical packages: 1 Mar 17 18:47:38.024876 kernel: smpboot: Total of 2 processors activated (10375.62 BogoMIPS) Mar 17 18:47:38.024890 kernel: devtmpfs: initialized Mar 17 18:47:38.024903 kernel: x86/mm: Memory block size: 128MB Mar 17 18:47:38.024916 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes) Mar 17 18:47:38.024929 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Mar 17 18:47:38.024944 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Mar 17 18:47:38.024957 kernel: pinctrl core: initialized pinctrl subsystem Mar 17 18:47:38.024970 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Mar 17 18:47:38.024983 kernel: audit: initializing netlink subsys (disabled) Mar 17 18:47:38.024996 kernel: audit: type=2000 audit(1742237256.023:1): state=initialized audit_enabled=0 res=1 Mar 17 18:47:38.025009 kernel: thermal_sys: Registered thermal governor 'step_wise' Mar 17 18:47:38.025022 kernel: thermal_sys: Registered thermal governor 'user_space' Mar 17 18:47:38.025035 kernel: cpuidle: using governor menu Mar 17 18:47:38.025048 kernel: ACPI: bus type PCI registered Mar 17 18:47:38.025063 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Mar 17 18:47:38.025076 kernel: dca service started, version 1.12.1 Mar 17 18:47:38.025089 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
Mar 17 18:47:38.025102 kernel: HugeTLB registered 1.00 GiB page size, pre-allocated 0 pages Mar 17 18:47:38.025115 kernel: HugeTLB registered 2.00 MiB page size, pre-allocated 0 pages Mar 17 18:47:38.025127 kernel: ACPI: Added _OSI(Module Device) Mar 17 18:47:38.025140 kernel: ACPI: Added _OSI(Processor Device) Mar 17 18:47:38.025169 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) Mar 17 18:47:38.025182 kernel: ACPI: Added _OSI(Processor Aggregator Device) Mar 17 18:47:38.025198 kernel: ACPI: Added _OSI(Linux-Dell-Video) Mar 17 18:47:38.025211 kernel: ACPI: Added _OSI(Linux-Lenovo-NV-HDMI-Audio) Mar 17 18:47:38.025224 kernel: ACPI: Added _OSI(Linux-HPI-Hybrid-Graphics) Mar 17 18:47:38.025237 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Mar 17 18:47:38.025251 kernel: ACPI: Interpreter enabled Mar 17 18:47:38.025264 kernel: ACPI: PM: (supports S0 S5) Mar 17 18:47:38.025277 kernel: ACPI: Using IOAPIC for interrupt routing Mar 17 18:47:38.025291 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug Mar 17 18:47:38.025304 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F Mar 17 18:47:38.025319 kernel: iommu: Default domain type: Translated Mar 17 18:47:38.025333 kernel: iommu: DMA domain TLB invalidation policy: lazy mode Mar 17 18:47:38.025345 kernel: vgaarb: loaded Mar 17 18:47:38.025359 kernel: pps_core: LinuxPPS API ver. 1 registered Mar 17 18:47:38.025372 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Mar 17 18:47:38.025386 kernel: PTP clock support registered Mar 17 18:47:38.025399 kernel: Registered efivars operations Mar 17 18:47:38.025412 kernel: PCI: Using ACPI for IRQ routing Mar 17 18:47:38.025426 kernel: PCI: System does not support PCI Mar 17 18:47:38.025441 kernel: clocksource: Switched to clocksource hyperv_clocksource_tsc_page Mar 17 18:47:38.046849 kernel: VFS: Disk quotas dquot_6.6.0 Mar 17 18:47:38.046873 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Mar 17 18:47:38.046888 kernel: pnp: PnP ACPI init Mar 17 18:47:38.046902 kernel: pnp: PnP ACPI: found 3 devices Mar 17 18:47:38.046916 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns Mar 17 18:47:38.046928 kernel: NET: Registered PF_INET protocol family Mar 17 18:47:38.046946 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear) Mar 17 18:47:38.046962 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear) Mar 17 18:47:38.046982 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Mar 17 18:47:38.046996 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear) Mar 17 18:47:38.047010 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear) Mar 17 18:47:38.047023 kernel: TCP: Hash tables configured (established 65536 bind 65536) Mar 17 18:47:38.047037 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear) Mar 17 18:47:38.047055 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear) Mar 17 18:47:38.047069 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Mar 17 18:47:38.047083 kernel: NET: Registered PF_XDP protocol family Mar 17 18:47:38.047095 kernel: PCI: CLS 0 bytes, default 64 Mar 17 18:47:38.047112 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB) Mar 17 18:47:38.047131 kernel: software IO TLB: mapped [mem 0x000000003a8ad000-0x000000003e8ad000] (64MB) Mar 17 18:47:38.047145 kernel: 
RAPL PMU: API unit is 2^-32 Joules, 0 fixed counters, 10737418240 ms ovfl timer Mar 17 18:47:38.047158 kernel: Initialise system trusted keyrings Mar 17 18:47:38.047177 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0 Mar 17 18:47:38.047191 kernel: Key type asymmetric registered Mar 17 18:47:38.047203 kernel: Asymmetric key parser 'x509' registered Mar 17 18:47:38.047216 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Mar 17 18:47:38.047235 kernel: io scheduler mq-deadline registered Mar 17 18:47:38.047251 kernel: io scheduler kyber registered Mar 17 18:47:38.047264 kernel: io scheduler bfq registered Mar 17 18:47:38.047278 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00 Mar 17 18:47:38.047297 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Mar 17 18:47:38.047311 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A Mar 17 18:47:38.047324 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A Mar 17 18:47:38.047344 kernel: i8042: PNP: No PS/2 controller found. Mar 17 18:47:38.050572 kernel: rtc_cmos 00:02: registered as rtc0 Mar 17 18:47:38.050697 kernel: rtc_cmos 00:02: setting system clock to 2025-03-17T18:47:37 UTC (1742237257) Mar 17 18:47:38.050799 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram Mar 17 18:47:38.050815 kernel: intel_pstate: CPU model not supported Mar 17 18:47:38.050829 kernel: efifb: probing for efifb Mar 17 18:47:38.050842 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k Mar 17 18:47:38.050855 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1 Mar 17 18:47:38.050869 kernel: efifb: scrolling: redraw Mar 17 18:47:38.050882 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0 Mar 17 18:47:38.050896 kernel: Console: switching to colour frame buffer device 128x48 Mar 17 18:47:38.050912 kernel: fb0: EFI VGA frame buffer device Mar 17 18:47:38.050925 kernel: pstore: Registered efi as persistent store backend Mar 17 18:47:38.050938 kernel: NET: Registered PF_INET6 protocol family Mar 17 18:47:38.050951 kernel: Segment Routing with IPv6 Mar 17 18:47:38.050963 kernel: In-situ OAM (IOAM) with IPv6 Mar 17 18:47:38.050977 kernel: NET: Registered PF_PACKET protocol family Mar 17 18:47:38.050991 kernel: Key type dns_resolver registered Mar 17 18:47:38.051004 kernel: IPI shorthand broadcast: enabled Mar 17 18:47:38.051017 kernel: sched_clock: Marking stable (708208300, 20225000)->(891663000, -163229700) Mar 17 18:47:38.051032 kernel: registered taskstats version 1 Mar 17 18:47:38.051045 kernel: Loading compiled-in X.509 certificates Mar 17 18:47:38.051058 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 5.15.179-flatcar: d5b956bbabb2d386c0246a969032c0de9eaa8220' Mar 17 18:47:38.051070 kernel: Key type .fscrypt registered Mar 17 18:47:38.051083 kernel: Key type fscrypt-provisioning registered Mar 17 18:47:38.051097 kernel: pstore: Using crash dump compression: deflate Mar 17 18:47:38.051111 kernel: ima: No TPM chip found, activating TPM-bypass! 
Mar 17 18:47:38.051124 kernel: ima: Allocated hash algorithm: sha1 Mar 17 18:47:38.051140 kernel: ima: No architecture policies found Mar 17 18:47:38.051153 kernel: clk: Disabling unused clocks Mar 17 18:47:38.051166 kernel: Freeing unused kernel image (initmem) memory: 47472K Mar 17 18:47:38.051179 kernel: Write protecting the kernel read-only data: 28672k Mar 17 18:47:38.051192 kernel: Freeing unused kernel image (text/rodata gap) memory: 2040K Mar 17 18:47:38.051206 kernel: Freeing unused kernel image (rodata/data gap) memory: 612K Mar 17 18:47:38.051219 kernel: Run /init as init process Mar 17 18:47:38.051232 kernel: with arguments: Mar 17 18:47:38.051245 kernel: /init Mar 17 18:47:38.051260 kernel: with environment: Mar 17 18:47:38.051272 kernel: HOME=/ Mar 17 18:47:38.051285 kernel: TERM=linux Mar 17 18:47:38.051298 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Mar 17 18:47:38.051314 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Mar 17 18:47:38.051331 systemd[1]: Detected virtualization microsoft. Mar 17 18:47:38.051345 systemd[1]: Detected architecture x86-64. Mar 17 18:47:38.051358 systemd[1]: Running in initrd. Mar 17 18:47:38.051374 systemd[1]: No hostname configured, using default hostname. Mar 17 18:47:38.051386 systemd[1]: Hostname set to . Mar 17 18:47:38.051400 systemd[1]: Initializing machine ID from random generator. Mar 17 18:47:38.051413 systemd[1]: Queued start job for default target initrd.target. Mar 17 18:47:38.051426 systemd[1]: Started systemd-ask-password-console.path. Mar 17 18:47:38.051439 systemd[1]: Reached target cryptsetup.target. Mar 17 18:47:38.051496 systemd[1]: Reached target paths.target. Mar 17 18:47:38.051512 systemd[1]: Reached target slices.target. Mar 17 18:47:38.051525 systemd[1]: Reached target swap.target. Mar 17 18:47:38.051542 systemd[1]: Reached target timers.target. Mar 17 18:47:38.051556 systemd[1]: Listening on iscsid.socket. Mar 17 18:47:38.051570 systemd[1]: Listening on iscsiuio.socket. Mar 17 18:47:38.051583 systemd[1]: Listening on systemd-journald-audit.socket. Mar 17 18:47:38.051597 systemd[1]: Listening on systemd-journald-dev-log.socket. Mar 17 18:47:38.051611 systemd[1]: Listening on systemd-journald.socket. Mar 17 18:47:38.051625 systemd[1]: Listening on systemd-networkd.socket. Mar 17 18:47:38.051641 systemd[1]: Listening on systemd-udevd-control.socket. Mar 17 18:47:38.051655 systemd[1]: Listening on systemd-udevd-kernel.socket. Mar 17 18:47:38.051669 systemd[1]: Reached target sockets.target. Mar 17 18:47:38.051682 systemd[1]: Starting kmod-static-nodes.service... Mar 17 18:47:38.051696 systemd[1]: Finished network-cleanup.service. Mar 17 18:47:38.051709 systemd[1]: Starting systemd-fsck-usr.service... Mar 17 18:47:38.051722 systemd[1]: Starting systemd-journald.service... Mar 17 18:47:38.051736 systemd[1]: Starting systemd-modules-load.service... Mar 17 18:47:38.051750 systemd[1]: Starting systemd-resolved.service... Mar 17 18:47:38.051767 systemd[1]: Starting systemd-vconsole-setup.service... Mar 17 18:47:38.051780 systemd[1]: Finished kmod-static-nodes.service. 
Mar 17 18:47:38.051794 kernel: audit: type=1130 audit(1742237258.025:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.051807 systemd[1]: Finished systemd-fsck-usr.service. Mar 17 18:47:38.051826 systemd-journald[183]: Journal started Mar 17 18:47:38.051892 systemd-journald[183]: Runtime Journal (/run/log/journal/c552ce0304a9469791dcc6b83af0737b) is 8.0M, max 159.0M, 151.0M free. Mar 17 18:47:38.025000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.014564 systemd-modules-load[184]: Inserted module 'overlay' Mar 17 18:47:38.059438 systemd-resolved[185]: Positive Trust Anchors: Mar 17 18:47:38.062342 systemd[1]: Started systemd-journald.service. Mar 17 18:47:38.059679 systemd-resolved[185]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 18:47:38.059736 systemd-resolved[185]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Mar 17 18:47:38.115705 kernel: audit: type=1130 audit(1742237258.056:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.115735 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Mar 17 18:47:38.115759 kernel: audit: type=1130 audit(1742237258.090:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.056000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.090000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.062685 systemd-resolved[185]: Defaulting to hostname 'linux'. Mar 17 18:47:38.091017 systemd[1]: Started systemd-resolved.service. Mar 17 18:47:38.140703 kernel: Bridge firewalling registered Mar 17 18:47:38.140736 kernel: audit: type=1130 audit(1742237258.091:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.091000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.099413 systemd[1]: Finished systemd-vconsole-setup.service. Mar 17 18:47:38.119060 systemd[1]: Reached target nss-lookup.target. 
Mar 17 18:47:38.132253 systemd-modules-load[184]: Inserted module 'br_netfilter' Mar 17 18:47:38.135554 systemd[1]: Starting dracut-cmdline-ask.service... Mar 17 18:47:38.150091 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Mar 17 18:47:38.156022 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Mar 17 18:47:38.115000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.184787 kernel: audit: type=1130 audit(1742237258.115:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.184827 kernel: audit: type=1130 audit(1742237258.161:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.192246 kernel: SCSI subsystem initialized Mar 17 18:47:38.185190 systemd[1]: Finished dracut-cmdline-ask.service. Mar 17 18:47:38.190308 systemd[1]: Starting dracut-cmdline.service... Mar 17 18:47:38.188000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.209267 dracut-cmdline[200]: dracut-dracut-053 Mar 17 18:47:38.211313 kernel: audit: type=1130 audit(1742237258.188:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.211342 dracut-cmdline[200]: Using kernel command line parameters: rd.driver.pre=btrfs rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=249ccd113f901380672c0d31e18f792e8e0344094c0e39eedc449f039418b31a Mar 17 18:47:38.237040 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Mar 17 18:47:38.237087 kernel: device-mapper: uevent: version 1.0.3 Mar 17 18:47:38.242219 kernel: device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com Mar 17 18:47:38.246322 systemd-modules-load[184]: Inserted module 'dm_multipath' Mar 17 18:47:38.248201 systemd[1]: Finished systemd-modules-load.service. Mar 17 18:47:38.251412 systemd[1]: Starting systemd-sysctl.service... Mar 17 18:47:38.270118 kernel: audit: type=1130 audit(1742237258.250:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.250000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:47:38.277770 systemd[1]: Finished systemd-sysctl.service. Mar 17 18:47:38.281000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.294469 kernel: audit: type=1130 audit(1742237258.281:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.298471 kernel: Loading iSCSI transport class v2.0-870. Mar 17 18:47:38.317472 kernel: iscsi: registered transport (tcp) Mar 17 18:47:38.343679 kernel: iscsi: registered transport (qla4xxx) Mar 17 18:47:38.343740 kernel: QLogic iSCSI HBA Driver Mar 17 18:47:38.372847 systemd[1]: Finished dracut-cmdline.service. Mar 17 18:47:38.376000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.378014 systemd[1]: Starting dracut-pre-udev.service... Mar 17 18:47:38.427479 kernel: raid6: avx512x4 gen() 18688 MB/s Mar 17 18:47:38.447470 kernel: raid6: avx512x4 xor() 7612 MB/s Mar 17 18:47:38.467466 kernel: raid6: avx512x2 gen() 18900 MB/s Mar 17 18:47:38.487472 kernel: raid6: avx512x2 xor() 29717 MB/s Mar 17 18:47:38.506469 kernel: raid6: avx512x1 gen() 18893 MB/s Mar 17 18:47:38.525464 kernel: raid6: avx512x1 xor() 26868 MB/s Mar 17 18:47:38.545469 kernel: raid6: avx2x4 gen() 18943 MB/s Mar 17 18:47:38.565465 kernel: raid6: avx2x4 xor() 7954 MB/s Mar 17 18:47:38.585464 kernel: raid6: avx2x2 gen() 18852 MB/s Mar 17 18:47:38.605467 kernel: raid6: avx2x2 xor() 21880 MB/s Mar 17 18:47:38.624468 kernel: raid6: avx2x1 gen() 14031 MB/s Mar 17 18:47:38.643464 kernel: raid6: avx2x1 xor() 19271 MB/s Mar 17 18:47:38.663465 kernel: raid6: sse2x4 gen() 11758 MB/s Mar 17 18:47:38.682464 kernel: raid6: sse2x4 xor() 7342 MB/s Mar 17 18:47:38.702464 kernel: raid6: sse2x2 gen() 13022 MB/s Mar 17 18:47:38.722466 kernel: raid6: sse2x2 xor() 7737 MB/s Mar 17 18:47:38.741463 kernel: raid6: sse2x1 gen() 11673 MB/s Mar 17 18:47:38.764086 kernel: raid6: sse2x1 xor() 5972 MB/s Mar 17 18:47:38.764114 kernel: raid6: using algorithm avx2x4 gen() 18943 MB/s Mar 17 18:47:38.764126 kernel: raid6: .... xor() 7954 MB/s, rmw enabled Mar 17 18:47:38.767250 kernel: raid6: using avx512x2 recovery algorithm Mar 17 18:47:38.785473 kernel: xor: automatically using best checksumming function avx Mar 17 18:47:38.881478 kernel: Btrfs loaded, crc32c=crc32c-intel, zoned=no, fsverity=no Mar 17 18:47:38.889052 systemd[1]: Finished dracut-pre-udev.service. Mar 17 18:47:38.890000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:38.892000 audit: BPF prog-id=7 op=LOAD Mar 17 18:47:38.892000 audit: BPF prog-id=8 op=LOAD Mar 17 18:47:38.893641 systemd[1]: Starting systemd-udevd.service... Mar 17 18:47:38.907645 systemd-udevd[384]: Using default interface naming scheme 'v252'. Mar 17 18:47:38.914389 systemd[1]: Started systemd-udevd.service. Mar 17 18:47:38.917488 systemd[1]: Starting dracut-pre-trigger.service... Mar 17 18:47:38.916000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Mar 17 18:47:38.932913 dracut-pre-trigger[396]: rd.md=0: removing MD RAID activation Mar 17 18:47:38.964715 systemd[1]: Finished dracut-pre-trigger.service. Mar 17 18:47:38.967747 systemd[1]: Starting systemd-udev-trigger.service... Mar 17 18:47:38.966000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:39.006000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:39.004562 systemd[1]: Finished systemd-udev-trigger.service. Mar 17 18:47:39.047493 kernel: cryptd: max_cpu_qlen set to 1000 Mar 17 18:47:39.066475 kernel: AVX2 version of gcm_enc/dec engaged. Mar 17 18:47:39.074333 kernel: hv_vmbus: Vmbus version:5.2 Mar 17 18:47:39.074388 kernel: AES CTR mode by8 optimization enabled Mar 17 18:47:39.108475 kernel: hv_vmbus: registering driver hyperv_keyboard Mar 17 18:47:39.116473 kernel: hv_vmbus: registering driver hv_storvsc Mar 17 18:47:39.131693 kernel: scsi host0: storvsc_host_t Mar 17 18:47:39.131768 kernel: scsi host1: storvsc_host_t Mar 17 18:47:39.146852 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/VMBUS:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 Mar 17 18:47:39.146910 kernel: scsi 0:0:0:0: Direct-Access Msft Virtual Disk 1.0 PQ: 0 ANSI: 5 Mar 17 18:47:39.147468 kernel: hid: raw HID events driver (C) Jiri Kosina Mar 17 18:47:39.155750 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 0 Mar 17 18:47:39.161465 kernel: hv_vmbus: registering driver hv_netvsc Mar 17 18:47:39.167469 kernel: hv_vmbus: registering driver hid_hyperv Mar 17 18:47:39.179564 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 Mar 17 18:47:39.179611 kernel: hid 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on Mar 17 18:47:39.204775 kernel: sr 0:0:0:2: [sr0] scsi-1 drive Mar 17 18:47:39.208110 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Mar 17 18:47:39.208136 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 Mar 17 18:47:39.219906 kernel: sd 0:0:0:0: [sda] 63737856 512-byte logical blocks: (32.6 GB/30.4 GiB) Mar 17 18:47:39.236482 kernel: sd 0:0:0:0: [sda] 4096-byte physical blocks Mar 17 18:47:39.236666 kernel: sd 0:0:0:0: [sda] Write Protect is off Mar 17 18:47:39.236820 kernel: sd 0:0:0:0: [sda] Mode Sense: 0f 00 10 00 Mar 17 18:47:39.236975 kernel: sd 0:0:0:0: [sda] Write cache: disabled, read cache: enabled, supports DPO and FUA Mar 17 18:47:39.237125 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 18:47:39.237145 kernel: sd 0:0:0:0: [sda] Attached SCSI disk Mar 17 18:47:39.285296 kernel: hv_netvsc 7c1e5237-d8af-7c1e-5237-d8af7c1e5237 eth0: VF slot 1 added Mar 17 18:47:39.294478 kernel: hv_vmbus: registering driver hv_pci Mar 17 18:47:39.299482 kernel: hv_pci e9cbd427-7e43-4114-ab6e-8aa6e78f2bf2: PCI VMBus probing: Using version 0x10004 Mar 17 18:47:39.378236 kernel: hv_pci e9cbd427-7e43-4114-ab6e-8aa6e78f2bf2: PCI host bridge to bus 7e43:00 Mar 17 18:47:39.378407 kernel: pci_bus 7e43:00: root bus resource [mem 0xfe0000000-0xfe00fffff window] Mar 17 18:47:39.378583 kernel: pci_bus 7e43:00: No busn resource found for root bus, will use [bus 00-ff] Mar 17 18:47:39.378741 kernel: pci 7e43:00:02.0: 
[15b3:1016] type 00 class 0x020000 Mar 17 18:47:39.378922 kernel: pci 7e43:00:02.0: reg 0x10: [mem 0xfe0000000-0xfe00fffff 64bit pref] Mar 17 18:47:39.379079 kernel: pci 7e43:00:02.0: enabling Extended Tags Mar 17 18:47:39.379235 kernel: pci 7e43:00:02.0: 0.000 Gb/s available PCIe bandwidth, limited by Unknown x0 link at 7e43:00:02.0 (capable of 63.008 Gb/s with 8.0 GT/s PCIe x8 link) Mar 17 18:47:39.379389 kernel: pci_bus 7e43:00: busn_res: [bus 00-ff] end is updated to 00 Mar 17 18:47:39.379548 kernel: pci 7e43:00:02.0: BAR 0: assigned [mem 0xfe0000000-0xfe00fffff 64bit pref] Mar 17 18:47:39.472479 kernel: mlx5_core 7e43:00:02.0: firmware version: 14.30.5000 Mar 17 18:47:39.719606 kernel: mlx5_core 7e43:00:02.0: MLX5E: StrdRq(0) RqSz(1024) StrdSz(256) RxCqeCmprss(0) Mar 17 18:47:39.719764 kernel: BTRFS: device label OEM devid 1 transid 12 /dev/sda6 scanned by (udev-worker) (441) Mar 17 18:47:39.719777 kernel: mlx5_core 7e43:00:02.0: Supported tc offload range - chains: 1, prios: 1 Mar 17 18:47:39.719876 kernel: mlx5_core 7e43:00:02.0: mlx5e_tc_post_act_init:40:(pid 16): firmware level support is missing Mar 17 18:47:39.719972 kernel: hv_netvsc 7c1e5237-d8af-7c1e-5237-d8af7c1e5237 eth0: VF registering: eth1 Mar 17 18:47:39.720066 kernel: mlx5_core 7e43:00:02.0 eth1: joined to eth0 Mar 17 18:47:39.596240 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device. Mar 17 18:47:39.648570 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Mar 17 18:47:39.731516 kernel: mlx5_core 7e43:00:02.0 enP32323s1: renamed from eth1 Mar 17 18:47:39.806610 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device. Mar 17 18:47:39.825261 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device. Mar 17 18:47:39.827926 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device. Mar 17 18:47:39.833482 systemd[1]: Starting disk-uuid.service... Mar 17 18:47:39.848476 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 18:47:39.855474 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 18:47:40.863483 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Mar 17 18:47:40.863922 disk-uuid[567]: The operation has completed successfully. Mar 17 18:47:40.932058 systemd[1]: disk-uuid.service: Deactivated successfully. Mar 17 18:47:40.932159 systemd[1]: Finished disk-uuid.service. Mar 17 18:47:40.936000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:40.936000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:40.946857 systemd[1]: Starting verity-setup.service... Mar 17 18:47:40.985478 kernel: device-mapper: verity: sha256 using implementation "sha256-avx2" Mar 17 18:47:41.251090 systemd[1]: Found device dev-mapper-usr.device. Mar 17 18:47:41.256586 systemd[1]: Finished verity-setup.service. Mar 17 18:47:41.261000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=verity-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:41.262388 systemd[1]: Mounting sysusr-usr.mount... Mar 17 18:47:41.337022 kernel: EXT4-fs (dm-0): mounted filesystem without journal. Opts: norecovery. Quota mode: none. Mar 17 18:47:41.336928 systemd[1]: Mounted sysusr-usr.mount. 
Mar 17 18:47:41.338815 systemd[1]: afterburn-network-kargs.service was skipped because no trigger condition checks were met. Mar 17 18:47:41.339585 systemd[1]: Starting ignition-setup.service... Mar 17 18:47:41.350418 systemd[1]: Starting parse-ip-for-networkd.service... Mar 17 18:47:41.367748 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 18:47:41.367778 kernel: BTRFS info (device sda6): using free space tree Mar 17 18:47:41.367798 kernel: BTRFS info (device sda6): has skinny extents Mar 17 18:47:41.422750 systemd[1]: Finished parse-ip-for-networkd.service. Mar 17 18:47:41.427000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:41.427000 audit: BPF prog-id=9 op=LOAD Mar 17 18:47:41.428959 systemd[1]: Starting systemd-networkd.service... Mar 17 18:47:41.451431 systemd-networkd[808]: lo: Link UP Mar 17 18:47:41.455000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:41.451442 systemd-networkd[808]: lo: Gained carrier Mar 17 18:47:41.451985 systemd-networkd[808]: Enumeration completed Mar 17 18:47:41.452058 systemd[1]: Started systemd-networkd.service. Mar 17 18:47:41.455228 systemd-networkd[808]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 18:47:41.455978 systemd[1]: Reached target network.target. Mar 17 18:47:41.461966 systemd[1]: Starting iscsiuio.service... Mar 17 18:47:41.476096 systemd[1]: Started iscsiuio.service. Mar 17 18:47:41.483000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsiuio comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:41.485538 systemd[1]: mnt-oem.mount: Deactivated successfully. Mar 17 18:47:41.488563 systemd[1]: Starting iscsid.service... Mar 17 18:47:41.495637 iscsid[817]: iscsid: can't open InitiatorName configuration file /etc/iscsi/initiatorname.iscsi Mar 17 18:47:41.495637 iscsid[817]: iscsid: Warning: InitiatorName file /etc/iscsi/initiatorname.iscsi does not exist or does not contain a properly formatted InitiatorName. If using software iscsi (iscsi_tcp or ib_iser) or partial offload (bnx2i or cxgbi iscsi), you may not be able to log Mar 17 18:47:41.495637 iscsid[817]: into or discover targets. Please create a file /etc/iscsi/initiatorname.iscsi that contains a sting with the format: InitiatorName=iqn.yyyy-mm.[:identifier]. Mar 17 18:47:41.495637 iscsid[817]: Example: InitiatorName=iqn.2001-04.com.redhat:fc6. Mar 17 18:47:41.495637 iscsid[817]: If using hardware iscsi like qla4xxx this message can be ignored. Mar 17 18:47:41.495637 iscsid[817]: iscsid: can't open InitiatorAlias configuration file /etc/iscsi/initiatorname.iscsi Mar 17 18:47:41.495637 iscsid[817]: iscsid: can't open iscsid.safe_logout configuration file /etc/iscsi/iscsid.conf Mar 17 18:47:41.500000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=iscsid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:41.495698 systemd[1]: Started iscsid.service. Mar 17 18:47:41.528561 kernel: mlx5_core 7e43:00:02.0 enP32323s1: Link up Mar 17 18:47:41.528428 systemd[1]: Starting dracut-initqueue.service... 
Mar 17 18:47:41.539397 systemd[1]: Finished dracut-initqueue.service. Mar 17 18:47:41.543000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:41.543398 systemd[1]: Reached target remote-fs-pre.target. Mar 17 18:47:41.547259 systemd[1]: Reached target remote-cryptsetup.target. Mar 17 18:47:41.551468 systemd[1]: Reached target remote-fs.target. Mar 17 18:47:41.555776 systemd[1]: Starting dracut-pre-mount.service... Mar 17 18:47:41.564086 systemd[1]: Finished dracut-pre-mount.service. Mar 17 18:47:41.572055 kernel: hv_netvsc 7c1e5237-d8af-7c1e-5237-d8af7c1e5237 eth0: Data path switched to VF: enP32323s1 Mar 17 18:47:41.572254 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 18:47:41.573081 systemd-networkd[808]: enP32323s1: Link UP Mar 17 18:47:41.574000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:41.573290 systemd-networkd[808]: eth0: Link UP Mar 17 18:47:41.573730 systemd-networkd[808]: eth0: Gained carrier Mar 17 18:47:41.580285 systemd-networkd[808]: enP32323s1: Gained carrier Mar 17 18:47:41.602518 systemd-networkd[808]: eth0: DHCPv4 address 10.200.8.36/24, gateway 10.200.8.1 acquired from 168.63.129.16 Mar 17 18:47:41.636447 systemd[1]: Finished ignition-setup.service. Mar 17 18:47:41.638000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:41.639763 systemd[1]: Starting ignition-fetch-offline.service... Mar 17 18:47:43.507688 systemd-networkd[808]: eth0: Gained IPv6LL Mar 17 18:47:45.018728 ignition[832]: Ignition 2.14.0 Mar 17 18:47:45.018746 ignition[832]: Stage: fetch-offline Mar 17 18:47:45.018833 ignition[832]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:47:45.018883 ignition[832]: parsing config with SHA512: 4824fd4a4e57848da530dc2b56e2d3e9f5f19634d1c84ef29f8fc49255520728d0377a861a375d7c8cb5301ed861ff4ede4b440b074b1d6a86e23be9cefc2f63 Mar 17 18:47:45.109544 ignition[832]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:47:45.114000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.111008 systemd[1]: Finished ignition-fetch-offline.service. Mar 17 18:47:45.134595 kernel: kauditd_printk_skb: 18 callbacks suppressed Mar 17 18:47:45.134630 kernel: audit: type=1130 audit(1742237265.114:29): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.109725 ignition[832]: parsed url from cmdline: "" Mar 17 18:47:45.116548 systemd[1]: Starting ignition-fetch.service... 
Mar 17 18:47:45.109729 ignition[832]: no config URL provided Mar 17 18:47:45.109735 ignition[832]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 18:47:45.109743 ignition[832]: no config at "/usr/lib/ignition/user.ign" Mar 17 18:47:45.109749 ignition[832]: failed to fetch config: resource requires networking Mar 17 18:47:45.109864 ignition[832]: Ignition finished successfully Mar 17 18:47:45.124750 ignition[838]: Ignition 2.14.0 Mar 17 18:47:45.124756 ignition[838]: Stage: fetch Mar 17 18:47:45.124864 ignition[838]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:47:45.124890 ignition[838]: parsing config with SHA512: 4824fd4a4e57848da530dc2b56e2d3e9f5f19634d1c84ef29f8fc49255520728d0377a861a375d7c8cb5301ed861ff4ede4b440b074b1d6a86e23be9cefc2f63 Mar 17 18:47:45.130568 ignition[838]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:47:45.130882 ignition[838]: parsed url from cmdline: "" Mar 17 18:47:45.130886 ignition[838]: no config URL provided Mar 17 18:47:45.130891 ignition[838]: reading system config file "/usr/lib/ignition/user.ign" Mar 17 18:47:45.130900 ignition[838]: no config at "/usr/lib/ignition/user.ign" Mar 17 18:47:45.130931 ignition[838]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 Mar 17 18:47:45.213049 ignition[838]: GET result: OK Mar 17 18:47:45.213166 ignition[838]: config has been read from IMDS userdata Mar 17 18:47:45.213198 ignition[838]: parsing config with SHA512: 0cc7d02fb5aa5a1e08c8c403a33f91e18be392e2377bc3b88ef7125423db3636afaa59109849c7650f2e056dcacf95ac3f14fd189d941e6ecf5d2d3170d1b89c Mar 17 18:47:45.219623 unknown[838]: fetched base config from "system" Mar 17 18:47:45.219637 unknown[838]: fetched base config from "system" Mar 17 18:47:45.219647 unknown[838]: fetched user config from "azure" Mar 17 18:47:45.225752 ignition[838]: fetch: fetch complete Mar 17 18:47:45.225762 ignition[838]: fetch: fetch passed Mar 17 18:47:45.225816 ignition[838]: Ignition finished successfully Mar 17 18:47:45.231646 systemd[1]: Finished ignition-fetch.service. Mar 17 18:47:45.233000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.234445 systemd[1]: Starting ignition-kargs.service... Mar 17 18:47:45.255257 kernel: audit: type=1130 audit(1742237265.233:30): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.257396 ignition[844]: Ignition 2.14.0 Mar 17 18:47:45.257406 ignition[844]: Stage: kargs Mar 17 18:47:45.257579 ignition[844]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:47:45.257611 ignition[844]: parsing config with SHA512: 4824fd4a4e57848da530dc2b56e2d3e9f5f19634d1c84ef29f8fc49255520728d0377a861a375d7c8cb5301ed861ff4ede4b440b074b1d6a86e23be9cefc2f63 Mar 17 18:47:45.262386 ignition[844]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:47:45.264313 ignition[844]: kargs: kargs passed Mar 17 18:47:45.267000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.266632 systemd[1]: Finished ignition-kargs.service. 
Mar 17 18:47:45.284336 kernel: audit: type=1130 audit(1742237265.267:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.264357 ignition[844]: Ignition finished successfully Mar 17 18:47:45.269266 systemd[1]: Starting ignition-disks.service... Mar 17 18:47:45.293642 ignition[850]: Ignition 2.14.0 Mar 17 18:47:45.293653 ignition[850]: Stage: disks Mar 17 18:47:45.293788 ignition[850]: reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:47:45.293820 ignition[850]: parsing config with SHA512: 4824fd4a4e57848da530dc2b56e2d3e9f5f19634d1c84ef29f8fc49255520728d0377a861a375d7c8cb5301ed861ff4ede4b440b074b1d6a86e23be9cefc2f63 Mar 17 18:47:45.303057 ignition[850]: no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:47:45.304510 ignition[850]: disks: disks passed Mar 17 18:47:45.304554 ignition[850]: Ignition finished successfully Mar 17 18:47:45.309272 systemd[1]: Finished ignition-disks.service. Mar 17 18:47:45.310000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.311164 systemd[1]: Reached target initrd-root-device.target. Mar 17 18:47:45.335688 kernel: audit: type=1130 audit(1742237265.310:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.311241 systemd[1]: Reached target local-fs-pre.target. Mar 17 18:47:45.311604 systemd[1]: Reached target local-fs.target. Mar 17 18:47:45.311959 systemd[1]: Reached target sysinit.target. Mar 17 18:47:45.312313 systemd[1]: Reached target basic.target. Mar 17 18:47:45.313559 systemd[1]: Starting systemd-fsck-root.service... Mar 17 18:47:45.386537 systemd-fsck[858]: ROOT: clean, 623/7326000 files, 481078/7359488 blocks Mar 17 18:47:45.395038 systemd[1]: Finished systemd-fsck-root.service. Mar 17 18:47:45.419264 kernel: audit: type=1130 audit(1742237265.399:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.399000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:45.401080 systemd[1]: Mounting sysroot.mount... Mar 17 18:47:45.431204 kernel: EXT4-fs (sda9): mounted filesystem with ordered data mode. Opts: (null). Quota mode: none. Mar 17 18:47:45.427693 systemd[1]: Mounted sysroot.mount. Mar 17 18:47:45.431273 systemd[1]: Reached target initrd-root-fs.target. Mar 17 18:47:45.473290 systemd[1]: Mounting sysroot-usr.mount... Mar 17 18:47:45.479665 systemd[1]: Starting flatcar-metadata-hostname.service... Mar 17 18:47:45.484886 systemd[1]: ignition-remount-sysroot.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Mar 17 18:47:45.485003 systemd[1]: Reached target ignition-diskful.target. Mar 17 18:47:45.493980 systemd[1]: Mounted sysroot-usr.mount. Mar 17 18:47:45.558601 systemd[1]: Mounting sysroot-usr-share-oem.mount... Mar 17 18:47:45.564796 systemd[1]: Starting initrd-setup-root.service... 
Mar 17 18:47:45.571477 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/sda6 scanned by mount (869) Mar 17 18:47:45.581181 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 18:47:45.581222 kernel: BTRFS info (device sda6): using free space tree Mar 17 18:47:45.581243 kernel: BTRFS info (device sda6): has skinny extents Mar 17 18:47:45.588249 systemd[1]: Mounted sysroot-usr-share-oem.mount. Mar 17 18:47:45.596908 initrd-setup-root[874]: cut: /sysroot/etc/passwd: No such file or directory Mar 17 18:47:45.619699 initrd-setup-root[900]: cut: /sysroot/etc/group: No such file or directory Mar 17 18:47:45.639367 initrd-setup-root[908]: cut: /sysroot/etc/shadow: No such file or directory Mar 17 18:47:45.645767 initrd-setup-root[916]: cut: /sysroot/etc/gshadow: No such file or directory Mar 17 18:47:46.094277 systemd[1]: Finished initrd-setup-root.service. Mar 17 18:47:46.096000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:46.109287 systemd[1]: Starting ignition-mount.service... Mar 17 18:47:46.117793 kernel: audit: type=1130 audit(1742237266.096:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:46.113988 systemd[1]: Starting sysroot-boot.service... Mar 17 18:47:46.122784 systemd[1]: sysusr-usr-share-oem.mount: Deactivated successfully. Mar 17 18:47:46.122924 systemd[1]: sysroot-usr-share-oem.mount: Deactivated successfully. Mar 17 18:47:46.141091 ignition[935]: INFO : Ignition 2.14.0 Mar 17 18:47:46.141091 ignition[935]: INFO : Stage: mount Mar 17 18:47:46.145496 ignition[935]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:47:46.145496 ignition[935]: DEBUG : parsing config with SHA512: 4824fd4a4e57848da530dc2b56e2d3e9f5f19634d1c84ef29f8fc49255520728d0377a861a375d7c8cb5301ed861ff4ede4b440b074b1d6a86e23be9cefc2f63 Mar 17 18:47:46.157882 systemd[1]: Finished sysroot-boot.service. Mar 17 18:47:46.159000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:46.173485 kernel: audit: type=1130 audit(1742237266.159:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:46.176476 ignition[935]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:47:46.179898 ignition[935]: INFO : mount: mount passed Mar 17 18:47:46.179898 ignition[935]: INFO : Ignition finished successfully Mar 17 18:47:46.183676 systemd[1]: Finished ignition-mount.service. Mar 17 18:47:46.185000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:46.197478 kernel: audit: type=1130 audit(1742237266.185:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:47:46.934624 coreos-metadata[868]: Mar 17 18:47:46.934 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 Mar 17 18:47:46.951479 coreos-metadata[868]: Mar 17 18:47:46.951 INFO Fetch successful Mar 17 18:47:46.986853 coreos-metadata[868]: Mar 17 18:47:46.986 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 Mar 17 18:47:47.003000 coreos-metadata[868]: Mar 17 18:47:47.002 INFO Fetch successful Mar 17 18:47:47.023614 coreos-metadata[868]: Mar 17 18:47:47.023 INFO wrote hostname ci-3510.3.7-a-b312ad98ee to /sysroot/etc/hostname Mar 17 18:47:47.025914 systemd[1]: Finished flatcar-metadata-hostname.service. Mar 17 18:47:47.029000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:47.030987 systemd[1]: Starting ignition-files.service... Mar 17 18:47:47.047583 kernel: audit: type=1130 audit(1742237267.029:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:47.053100 systemd[1]: Mounting sysroot-usr-share-oem.mount... Mar 17 18:47:47.070175 kernel: BTRFS: device label OEM devid 1 transid 15 /dev/sda6 scanned by mount (947) Mar 17 18:47:47.070221 kernel: BTRFS info (device sda6): using crc32c (crc32c-intel) checksum algorithm Mar 17 18:47:47.070237 kernel: BTRFS info (device sda6): using free space tree Mar 17 18:47:47.076813 kernel: BTRFS info (device sda6): has skinny extents Mar 17 18:47:47.081155 systemd[1]: Mounted sysroot-usr-share-oem.mount. Mar 17 18:47:47.093098 ignition[966]: INFO : Ignition 2.14.0 Mar 17 18:47:47.093098 ignition[966]: INFO : Stage: files Mar 17 18:47:47.096484 ignition[966]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:47:47.096484 ignition[966]: DEBUG : parsing config with SHA512: 4824fd4a4e57848da530dc2b56e2d3e9f5f19634d1c84ef29f8fc49255520728d0377a861a375d7c8cb5301ed861ff4ede4b440b074b1d6a86e23be9cefc2f63 Mar 17 18:47:47.109189 ignition[966]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:47:47.125804 ignition[966]: DEBUG : files: compiled without relabeling support, skipping Mar 17 18:47:47.128952 ignition[966]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Mar 17 18:47:47.128952 ignition[966]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Mar 17 18:47:47.173188 ignition[966]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Mar 17 18:47:47.176867 ignition[966]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Mar 17 18:47:47.189031 unknown[966]: wrote ssh authorized keys file for user: core Mar 17 18:47:47.191526 ignition[966]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Mar 17 18:47:47.195210 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/etc/flatcar-cgroupv1" Mar 17 18:47:47.199157 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/etc/flatcar-cgroupv1" Mar 17 18:47:47.203168 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 17 
18:47:47.207526 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET https://get.helm.sh/helm-v3.13.2-linux-amd64.tar.gz: attempt #1 Mar 17 18:47:47.292130 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(4): GET result: OK Mar 17 18:47:47.378904 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-amd64.tar.gz" Mar 17 18:47:47.383837 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/install.sh" Mar 17 18:47:47.387918 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/install.sh" Mar 17 18:47:47.387918 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nginx.yaml" Mar 17 18:47:47.395992 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nginx.yaml" Mar 17 18:47:47.400396 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 18:47:47.404429 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Mar 17 18:47:47.408345 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 18:47:47.412496 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Mar 17 18:47:47.416753 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 18:47:47.421028 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing file "/sysroot/etc/flatcar/update.conf" Mar 17 18:47:47.425317 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 18:47:47.431372 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 18:47:47.447016 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(b): [started] writing file "/sysroot/etc/systemd/system/waagent.service" Mar 17 18:47:47.451332 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(b): oem config not found in "/usr/share/oem", looking on oem partition Mar 17 18:47:47.460607 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(b): op(c): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2651776284" Mar 17 18:47:47.465023 ignition[966]: CRITICAL : files: createFilesystemsFiles: createFiles: op(b): op(c): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2651776284": device or resource busy Mar 17 18:47:47.465023 ignition[966]: ERROR : files: createFilesystemsFiles: createFiles: op(b): failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem2651776284", trying btrfs: device or resource busy Mar 17 18:47:47.465023 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(b): op(d): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2651776284" Mar 17 18:47:47.484473 ignition[966]: INFO : files: 
createFilesystemsFiles: createFiles: op(b): op(d): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem2651776284" Mar 17 18:47:47.488926 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(b): op(e): [started] unmounting "/mnt/oem2651776284" Mar 17 18:47:47.486463 systemd[1]: mnt-oem2651776284.mount: Deactivated successfully. Mar 17 18:47:47.495055 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(b): op(e): [finished] unmounting "/mnt/oem2651776284" Mar 17 18:47:47.495055 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(b): [finished] writing file "/sysroot/etc/systemd/system/waagent.service" Mar 17 18:47:47.495055 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(f): [started] writing file "/sysroot/etc/systemd/system/nvidia.service" Mar 17 18:47:47.495055 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(f): oem config not found in "/usr/share/oem", looking on oem partition Mar 17 18:47:47.495055 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(10): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3263360470" Mar 17 18:47:47.515983 ignition[966]: CRITICAL : files: createFilesystemsFiles: createFiles: op(f): op(10): [failed] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3263360470": device or resource busy Mar 17 18:47:47.515983 ignition[966]: ERROR : files: createFilesystemsFiles: createFiles: op(f): failed to mount ext4 device "/dev/disk/by-label/OEM" at "/mnt/oem3263360470", trying btrfs: device or resource busy Mar 17 18:47:47.515983 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(11): [started] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3263360470" Mar 17 18:47:47.515983 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(11): [finished] mounting "/dev/disk/by-label/OEM" at "/mnt/oem3263360470" Mar 17 18:47:47.515983 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(12): [started] unmounting "/mnt/oem3263360470" Mar 17 18:47:47.515983 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(f): op(12): [finished] unmounting "/mnt/oem3263360470" Mar 17 18:47:47.515983 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(f): [finished] writing file "/sysroot/etc/systemd/system/nvidia.service" Mar 17 18:47:47.515983 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(13): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 18:47:47.515983 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(13): GET https://github.com/flatcar/sysext-bakery/releases/download/latest/kubernetes-v1.30.1-x86-64.raw: attempt #1 Mar 17 18:47:47.509177 systemd[1]: mnt-oem3263360470.mount: Deactivated successfully. 
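Both waagent.service and nvidia.service come from the OEM partition, so the files stage mounts /dev/disk/by-label/OEM into a temporary directory, tries ext4 first, and falls back to btrfs when the first attempt fails with "device or resource busy", exactly as the CRITICAL/ERROR/retry lines above show. A rough equivalent of that fallback, sketched with Python around mount(8); the mountpoint naming is illustrative (Ignition's own temp dirs look like /mnt/oem2651776284):

    # Sketch: mount the OEM partition the way the files stage above does --
    # attempt ext4, fall back to btrfs, and unmount when done.
    import subprocess
    import tempfile

    DEVICE = "/dev/disk/by-label/OEM"

    def mount_oem():
        mountpoint = tempfile.mkdtemp(prefix="oem")
        for fstype in ("ext4", "btrfs"):
            result = subprocess.run(
                ["mount", "-t", fstype, DEVICE, mountpoint],
                capture_output=True, text=True)
            if result.returncode == 0:
                print(f'[finished] mounting "{DEVICE}" at "{mountpoint}" as {fstype}')
                return mountpoint
            print(f'[failed] mounting as {fstype}: {result.stderr.strip()}')
        raise RuntimeError(f"could not mount {DEVICE}")

    def unmount(mountpoint):
        subprocess.run(["umount", mountpoint], check=True)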
Mar 17 18:47:48.067470 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(13): GET result: OK Mar 17 18:47:48.509921 ignition[966]: INFO : files: createFilesystemsFiles: createFiles: op(13): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.30.1-x86-64.raw" Mar 17 18:47:48.509921 ignition[966]: INFO : files: op(14): [started] processing unit "waagent.service" Mar 17 18:47:48.509921 ignition[966]: INFO : files: op(14): [finished] processing unit "waagent.service" Mar 17 18:47:48.509921 ignition[966]: INFO : files: op(15): [started] processing unit "nvidia.service" Mar 17 18:47:48.509921 ignition[966]: INFO : files: op(15): [finished] processing unit "nvidia.service" Mar 17 18:47:48.509921 ignition[966]: INFO : files: op(16): [started] processing unit "containerd.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(16): op(17): [started] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(16): op(17): [finished] writing systemd drop-in "10-use-cgroupfs.conf" at "/sysroot/etc/systemd/system/containerd.service.d/10-use-cgroupfs.conf" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(16): [finished] processing unit "containerd.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(18): [started] processing unit "prepare-helm.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(18): op(19): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(18): op(19): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(18): [finished] processing unit "prepare-helm.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(1a): [started] setting preset to enabled for "waagent.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(1a): [finished] setting preset to enabled for "waagent.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(1b): [started] setting preset to enabled for "nvidia.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(1b): [finished] setting preset to enabled for "nvidia.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(1c): [started] setting preset to enabled for "prepare-helm.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: op(1c): [finished] setting preset to enabled for "prepare-helm.service" Mar 17 18:47:48.529393 ignition[966]: INFO : files: createResultFile: createFiles: op(1d): [started] writing file "/sysroot/etc/.ignition-result.json" Mar 17 18:47:48.594750 kernel: audit: type=1130 audit(1742237268.569:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.569000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.564685 systemd[1]: Finished ignition-files.service. Mar 17 18:47:48.600000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:47:48.600000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.600636 ignition[966]: INFO : files: createResultFile: createFiles: op(1d): [finished] writing file "/sysroot/etc/.ignition-result.json" Mar 17 18:47:48.600636 ignition[966]: INFO : files: files passed Mar 17 18:47:48.600636 ignition[966]: INFO : Ignition finished successfully Mar 17 18:47:48.581965 systemd[1]: Starting initrd-setup-root-after-ignition.service... Mar 17 18:47:48.590854 systemd[1]: torcx-profile-populate.service was skipped because of an unmet condition check (ConditionPathExists=/sysroot/etc/torcx/next-profile). Mar 17 18:47:48.591673 systemd[1]: Starting ignition-quench.service... Mar 17 18:47:48.596852 systemd[1]: ignition-quench.service: Deactivated successfully. Mar 17 18:47:48.596946 systemd[1]: Finished ignition-quench.service. Mar 17 18:47:48.620596 initrd-setup-root-after-ignition[991]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Mar 17 18:47:48.619351 systemd[1]: Finished initrd-setup-root-after-ignition.service. Mar 17 18:47:48.629000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.629908 systemd[1]: Reached target ignition-complete.target. Mar 17 18:47:48.634621 systemd[1]: Starting initrd-parse-etc.service... Mar 17 18:47:48.648036 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Mar 17 18:47:48.651000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.651000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.648144 systemd[1]: Finished initrd-parse-etc.service. Mar 17 18:47:48.652336 systemd[1]: Reached target initrd-fs.target. Mar 17 18:47:48.654154 systemd[1]: Reached target initrd.target. Mar 17 18:47:48.655851 systemd[1]: dracut-mount.service was skipped because no trigger condition checks were met. Mar 17 18:47:48.656607 systemd[1]: Starting dracut-pre-pivot.service... Mar 17 18:47:48.672719 systemd[1]: Finished dracut-pre-pivot.service. Mar 17 18:47:48.676000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.677237 systemd[1]: Starting initrd-cleanup.service... Mar 17 18:47:48.686662 systemd[1]: Stopped target nss-lookup.target. Mar 17 18:47:48.690740 systemd[1]: Stopped target remote-cryptsetup.target. Mar 17 18:47:48.692971 systemd[1]: Stopped target timers.target. Mar 17 18:47:48.696860 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Mar 17 18:47:48.700000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.696997 systemd[1]: Stopped dracut-pre-pivot.service. Mar 17 18:47:48.700691 systemd[1]: Stopped target initrd.target. 
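Besides plain files, the files stage above materialises unit configuration: it writes the 10-use-cgroupfs.conf drop-in under containerd.service.d, writes prepare-helm.service, and marks waagent, nvidia and prepare-helm as preset-enabled. The sketch below shows what that amounts to on disk under /sysroot; the drop-in body and the preset filename are assumptions, since the log records only the paths and the "setting preset to enabled" steps.

    # Sketch: write a systemd drop-in and a preset file the way the files stage
    # above reports. Only the directory paths come from the log.
    from pathlib import Path

    SYSROOT = Path("/sysroot")

    def write_dropin():
        d = SYSROOT / "etc/systemd/system/containerd.service.d"
        d.mkdir(parents=True, exist_ok=True)
        # The log names the drop-in but not its contents; this body is a placeholder.
        (d / "10-use-cgroupfs.conf").write_text("[Service]\n# cgroupfs-specific overrides here\n")

    def enable_presets(units=("waagent.service", "nvidia.service", "prepare-helm.service")):
        d = SYSROOT / "etc/systemd/system-preset"
        d.mkdir(parents=True, exist_ok=True)
        # Preset filename is assumed; the log only shows presets being set to enabled.
        (d / "20-ignition.preset").write_text("".join(f"enable {u}\n" for u in units))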
Mar 17 18:47:48.704501 systemd[1]: Stopped target basic.target. Mar 17 18:47:48.708128 systemd[1]: Stopped target ignition-complete.target. Mar 17 18:47:48.711944 systemd[1]: Stopped target ignition-diskful.target. Mar 17 18:47:48.715608 systemd[1]: Stopped target initrd-root-device.target. Mar 17 18:47:48.719812 systemd[1]: Stopped target remote-fs.target. Mar 17 18:47:48.723582 systemd[1]: Stopped target remote-fs-pre.target. Mar 17 18:47:48.727824 systemd[1]: Stopped target sysinit.target. Mar 17 18:47:48.731510 systemd[1]: Stopped target local-fs.target. Mar 17 18:47:48.735205 systemd[1]: Stopped target local-fs-pre.target. Mar 17 18:47:48.738793 systemd[1]: Stopped target swap.target. Mar 17 18:47:48.746000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.742535 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Mar 17 18:47:48.742683 systemd[1]: Stopped dracut-pre-mount.service. Mar 17 18:47:48.753000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.746331 systemd[1]: Stopped target cryptsetup.target. Mar 17 18:47:48.759000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.750044 systemd[1]: dracut-initqueue.service: Deactivated successfully. Mar 17 18:47:48.762000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.750187 systemd[1]: Stopped dracut-initqueue.service. Mar 17 18:47:48.767000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.754415 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Mar 17 18:47:48.784468 ignition[1004]: INFO : Ignition 2.14.0 Mar 17 18:47:48.784468 ignition[1004]: INFO : Stage: umount Mar 17 18:47:48.784468 ignition[1004]: INFO : reading system config file "/usr/lib/ignition/base.d/base.ign" Mar 17 18:47:48.784468 ignition[1004]: DEBUG : parsing config with SHA512: 4824fd4a4e57848da530dc2b56e2d3e9f5f19634d1c84ef29f8fc49255520728d0377a861a375d7c8cb5301ed861ff4ede4b440b074b1d6a86e23be9cefc2f63 Mar 17 18:47:48.784000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.754559 systemd[1]: Stopped initrd-setup-root-after-ignition.service. Mar 17 18:47:48.798095 ignition[1004]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" Mar 17 18:47:48.798095 ignition[1004]: INFO : umount: umount passed Mar 17 18:47:48.798095 ignition[1004]: INFO : Ignition finished successfully Mar 17 18:47:48.759491 systemd[1]: ignition-files.service: Deactivated successfully. Mar 17 18:47:48.759622 systemd[1]: Stopped ignition-files.service. Mar 17 18:47:48.763142 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. 
Mar 17 18:47:48.763275 systemd[1]: Stopped flatcar-metadata-hostname.service. Mar 17 18:47:48.768388 systemd[1]: Stopping ignition-mount.service... Mar 17 18:47:48.779593 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Mar 17 18:47:48.779822 systemd[1]: Stopped kmod-static-nodes.service. Mar 17 18:47:48.813073 systemd[1]: Stopping sysroot-boot.service... Mar 17 18:47:48.822379 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Mar 17 18:47:48.824000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.826000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.822580 systemd[1]: Stopped systemd-udev-trigger.service. Mar 17 18:47:48.824978 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Mar 17 18:47:48.825130 systemd[1]: Stopped dracut-pre-trigger.service. Mar 17 18:47:48.828726 systemd[1]: ignition-mount.service: Deactivated successfully. Mar 17 18:47:48.829023 systemd[1]: Stopped ignition-mount.service. Mar 17 18:47:48.841000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.842974 systemd[1]: initrd-cleanup.service: Deactivated successfully. Mar 17 18:47:48.843177 systemd[1]: Finished initrd-cleanup.service. Mar 17 18:47:48.848000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.848000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.850064 systemd[1]: ignition-disks.service: Deactivated successfully. Mar 17 18:47:48.850209 systemd[1]: Stopped ignition-disks.service. Mar 17 18:47:48.855000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.856171 systemd[1]: ignition-kargs.service: Deactivated successfully. Mar 17 18:47:48.856293 systemd[1]: Stopped ignition-kargs.service. Mar 17 18:47:48.861000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.861931 systemd[1]: ignition-fetch.service: Deactivated successfully. Mar 17 18:47:48.862069 systemd[1]: Stopped ignition-fetch.service. Mar 17 18:47:48.867000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.867630 systemd[1]: Stopped target network.target. Mar 17 18:47:48.871423 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Mar 17 18:47:48.871493 systemd[1]: Stopped ignition-fetch-offline.service. 
Mar 17 18:47:48.877000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.878209 systemd[1]: Stopped target paths.target. Mar 17 18:47:48.881954 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Mar 17 18:47:48.886526 systemd[1]: Stopped systemd-ask-password-console.path. Mar 17 18:47:48.888745 systemd[1]: Stopped target slices.target. Mar 17 18:47:48.894683 systemd[1]: Stopped target sockets.target. Mar 17 18:47:48.898143 systemd[1]: iscsid.socket: Deactivated successfully. Mar 17 18:47:48.898185 systemd[1]: Closed iscsid.socket. Mar 17 18:47:48.903260 systemd[1]: iscsiuio.socket: Deactivated successfully. Mar 17 18:47:48.903296 systemd[1]: Closed iscsiuio.socket. Mar 17 18:47:48.908132 systemd[1]: ignition-setup.service: Deactivated successfully. Mar 17 18:47:48.908898 systemd[1]: Stopped ignition-setup.service. Mar 17 18:47:48.912000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.912913 systemd[1]: Stopping systemd-networkd.service... Mar 17 18:47:48.915963 systemd[1]: Stopping systemd-resolved.service... Mar 17 18:47:48.919691 systemd-networkd[808]: eth0: DHCPv6 lease lost Mar 17 18:47:48.922794 systemd[1]: sysroot-boot.mount: Deactivated successfully. Mar 17 18:47:48.923284 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 18:47:48.929000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.923392 systemd[1]: Stopped systemd-networkd.service. Mar 17 18:47:48.932482 systemd[1]: systemd-resolved.service: Deactivated successfully. Mar 17 18:47:48.935000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.932582 systemd[1]: Stopped systemd-resolved.service. Mar 17 18:47:48.937000 audit: BPF prog-id=9 op=UNLOAD Mar 17 18:47:48.939000 audit: BPF prog-id=6 op=UNLOAD Mar 17 18:47:48.938393 systemd[1]: systemd-networkd.socket: Deactivated successfully. Mar 17 18:47:48.938432 systemd[1]: Closed systemd-networkd.socket. Mar 17 18:47:48.950000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.943091 systemd[1]: Stopping network-cleanup.service... Mar 17 18:47:48.946646 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Mar 17 18:47:48.954000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.946707 systemd[1]: Stopped parse-ip-for-networkd.service. Mar 17 18:47:48.961000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.950651 systemd[1]: systemd-sysctl.service: Deactivated successfully. 
Mar 17 18:47:48.950705 systemd[1]: Stopped systemd-sysctl.service. Mar 17 18:47:48.957547 systemd[1]: systemd-modules-load.service: Deactivated successfully. Mar 17 18:47:48.971000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.957588 systemd[1]: Stopped systemd-modules-load.service. Mar 17 18:47:48.961644 systemd[1]: Stopping systemd-udevd.service... Mar 17 18:47:48.968197 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Mar 17 18:47:48.968723 systemd[1]: systemd-udevd.service: Deactivated successfully. Mar 17 18:47:48.968853 systemd[1]: Stopped systemd-udevd.service. Mar 17 18:47:48.986000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.989000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.973768 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Mar 17 18:47:48.991000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.973816 systemd[1]: Closed systemd-udevd-control.socket. Mar 17 18:47:48.998000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.980126 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Mar 17 18:47:48.980173 systemd[1]: Closed systemd-udevd-kernel.socket. Mar 17 18:47:49.005000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:49.005000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:48.984447 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Mar 17 18:47:48.984526 systemd[1]: Stopped dracut-pre-udev.service. Mar 17 18:47:48.986280 systemd[1]: dracut-cmdline.service: Deactivated successfully. Mar 17 18:47:48.986318 systemd[1]: Stopped dracut-cmdline.service. Mar 17 18:47:48.989802 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Mar 17 18:47:48.989852 systemd[1]: Stopped dracut-cmdline-ask.service. Mar 17 18:47:49.026583 kernel: hv_netvsc 7c1e5237-d8af-7c1e-5237-d8af7c1e5237 eth0: Data path switched from VF: enP32323s1 Mar 17 18:47:48.992530 systemd[1]: Starting initrd-udevadm-cleanup-db.service... Mar 17 18:47:48.996347 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Mar 17 18:47:48.996407 systemd[1]: Stopped systemd-vconsole-setup.service. Mar 17 18:47:49.002688 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Mar 17 18:47:49.002776 systemd[1]: Finished initrd-udevadm-cleanup-db.service. Mar 17 18:47:49.041958 systemd[1]: network-cleanup.service: Deactivated successfully. 
Mar 17 18:47:49.046000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:49.042089 systemd[1]: Stopped network-cleanup.service. Mar 17 18:47:49.394149 systemd[1]: sysroot-boot.service: Deactivated successfully. Mar 17 18:47:49.394264 systemd[1]: Stopped sysroot-boot.service. Mar 17 18:47:49.400000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:49.400279 systemd[1]: Reached target initrd-switch-root.target. Mar 17 18:47:49.404755 systemd[1]: initrd-setup-root.service: Deactivated successfully. Mar 17 18:47:49.404823 systemd[1]: Stopped initrd-setup-root.service. Mar 17 18:47:49.410000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:47:49.411712 systemd[1]: Starting initrd-switch-root.service... Mar 17 18:47:49.532000 audit: BPF prog-id=5 op=UNLOAD Mar 17 18:47:49.532000 audit: BPF prog-id=4 op=UNLOAD Mar 17 18:47:49.532000 audit: BPF prog-id=3 op=UNLOAD Mar 17 18:47:49.532508 systemd[1]: Switching root. Mar 17 18:47:49.538000 audit: BPF prog-id=8 op=UNLOAD Mar 17 18:47:49.538000 audit: BPF prog-id=7 op=UNLOAD Mar 17 18:47:49.562634 systemd-journald[183]: Received SIGTERM from PID 1 (systemd). Mar 17 18:47:49.562708 iscsid[817]: iscsid shutting down. Mar 17 18:47:49.564521 systemd-journald[183]: Journal stopped Mar 17 18:48:04.607821 kernel: SELinux: Class mctp_socket not defined in policy. Mar 17 18:48:04.607847 kernel: SELinux: Class anon_inode not defined in policy. Mar 17 18:48:04.607858 kernel: SELinux: the above unknown classes and permissions will be allowed Mar 17 18:48:04.607866 kernel: SELinux: policy capability network_peer_controls=1 Mar 17 18:48:04.607874 kernel: SELinux: policy capability open_perms=1 Mar 17 18:48:04.607885 kernel: SELinux: policy capability extended_socket_class=1 Mar 17 18:48:04.607895 kernel: SELinux: policy capability always_check_network=0 Mar 17 18:48:04.607906 kernel: SELinux: policy capability cgroup_seclabel=1 Mar 17 18:48:04.607915 kernel: SELinux: policy capability nnp_nosuid_transition=1 Mar 17 18:48:04.607923 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Mar 17 18:48:04.607931 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Mar 17 18:48:04.607942 kernel: kauditd_printk_skb: 46 callbacks suppressed Mar 17 18:48:04.607951 kernel: audit: type=1403 audit(1742237272.806:85): auid=4294967295 ses=4294967295 lsm=selinux res=1 Mar 17 18:48:04.607964 systemd[1]: Successfully loaded SELinux policy in 306.080ms. Mar 17 18:48:04.607980 systemd[1]: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 24.877ms. Mar 17 18:48:04.607993 systemd[1]: systemd 252 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified) Mar 17 18:48:04.608003 systemd[1]: Detected virtualization microsoft. Mar 17 18:48:04.608014 systemd[1]: Detected architecture x86-64. Mar 17 18:48:04.608026 systemd[1]: Detected first boot. 
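After switch-root the journal restarts and systemd 252 prints its compile-time feature string (+PAM +AUDIT +SELINUX -APPARMOR ...). Each token is a feature prefixed with + (built in) or - (omitted), which splits mechanically; a small sketch using the string copied from the line above:

    # Sketch: split the systemd feature string logged above into enabled and
    # disabled sets. Tokens without a +/- prefix (default-hierarchy=...) are ignored.
    FEATURES = ("+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT "
                "-GNUTLS +OPENSSL -ACL +BLKID +CURL -ELFUTILS -FIDO2 +IDN2 -IDN "
                "+IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY -P11KIT "
                "-QRENCODE -TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK "
                "-XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified")

    def split_features(text):
        enabled = {t[1:] for t in text.split() if t.startswith("+")}
        disabled = {t[1:] for t in text.split() if t.startswith("-")}
        return enabled, disabled

    enabled, disabled = split_features(FEATURES)
    assert "SELINUX" in enabled and "APPARMOR" in disabled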
Mar 17 18:48:04.608038 systemd[1]: Hostname set to . Mar 17 18:48:04.608051 systemd[1]: Initializing machine ID from random generator. Mar 17 18:48:04.608063 kernel: audit: type=1400 audit(1742237273.512:86): avc: denied { integrity } for pid=1 comm="systemd" lockdown_reason="/dev/mem,kmem,port" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Mar 17 18:48:04.608072 kernel: SELinux: Context system_u:object_r:container_file_t:s0:c1022,c1023 is not valid (left unmapped). Mar 17 18:48:04.608084 kernel: audit: type=1400 audit(1742237274.892:87): avc: denied { associate } for pid=1056 comm="torcx-generator" name="docker" dev="tmpfs" ino=2 scontext=system_u:object_r:unlabeled_t:s0 tcontext=system_u:object_r:tmpfs_t:s0 tclass=filesystem permissive=1 srawcon="system_u:object_r:container_file_t:s0:c1022,c1023" Mar 17 18:48:04.608097 kernel: audit: type=1300 audit(1742237274.892:87): arch=c000003e syscall=188 success=yes exit=0 a0=c00018e5bc a1=c00018a7b0 a2=c00019c680 a3=32 items=0 ppid=1039 pid=1056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="torcx-generator" exe="/usr/lib/systemd/system-generators/torcx-generator" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:48:04.608108 kernel: audit: type=1327 audit(1742237274.892:87): proctitle=2F7573722F6C69622F73797374656D642F73797374656D2D67656E657261746F72732F746F7263782D67656E657261746F72002F72756E2F73797374656D642F67656E657261746F72002F72756E2F73797374656D642F67656E657261746F722E6561726C79002F72756E2F73797374656D642F67656E657261746F722E6C61 Mar 17 18:48:04.608120 kernel: audit: type=1400 audit(1742237274.899:88): avc: denied { associate } for pid=1056 comm="torcx-generator" name="lib" scontext=system_u:object_r:unlabeled_t:s0 tcontext=system_u:object_r:tmpfs_t:s0 tclass=filesystem permissive=1 Mar 17 18:48:04.608133 kernel: audit: type=1300 audit(1742237274.899:88): arch=c000003e syscall=258 success=yes exit=0 a0=ffffffffffffff9c a1=c00018e695 a2=1ed a3=0 items=2 ppid=1039 pid=1056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="torcx-generator" exe="/usr/lib/systemd/system-generators/torcx-generator" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:48:04.608142 kernel: audit: type=1307 audit(1742237274.899:88): cwd="/" Mar 17 18:48:04.608154 kernel: audit: type=1302 audit(1742237274.899:88): item=0 name=(null) inode=2 dev=00:2a mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:unlabeled_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:04.608165 kernel: audit: type=1302 audit(1742237274.899:88): item=1 name=(null) inode=3 dev=00:2a mode=040755 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:unlabeled_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:04.608176 systemd[1]: Populated /etc with preset unit settings. Mar 17 18:48:04.608189 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:48:04.608200 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. 
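The torcx-generator audit record above encodes its proctitle as hex (2F7573722F6C6962...), and the SYSCALL records carry raw numbers on arch=c000003e, i.e. x86-64. Both decode mechanically; a sketch, with the syscall names taken from the kernel's x86-64 syscall table rather than from the log:

    # Sketch: decode the hex-encoded proctitle from an audit PROCTITLE record.
    # Arguments are NUL-separated in the raw value, so join them with spaces.
    def decode_proctitle(hexstr):
        return bytes.fromhex(hexstr).replace(b"\x00", b" ").decode()

    # First argument of the proctitle logged above (the full value is truncated in the log).
    sample = ("2F7573722F6C69622F73797374656D642F73797374656D2D67656E657261746F72"
              "732F746F7263782D67656E657261746F72")
    print(decode_proctitle(sample))
    # -> /usr/lib/systemd/system-generators/torcx-generator

    # x86-64 syscall numbers appearing in audit records in this log
    # (names looked up from the syscall table, not from the log itself):
    SYSCALLS_X86_64 = {46: "sendmsg", 175: "init_module", 188: "setxattr", 258: "mkdirat"}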
Mar 17 18:48:04.608212 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:48:04.608224 systemd[1]: Queued start job for default target multi-user.target. Mar 17 18:48:04.608235 systemd[1]: Unnecessary job was removed for dev-sda6.device. Mar 17 18:48:04.608248 systemd[1]: Created slice system-addon\x2dconfig.slice. Mar 17 18:48:04.608260 systemd[1]: Created slice system-addon\x2drun.slice. Mar 17 18:48:04.608273 systemd[1]: Created slice system-getty.slice. Mar 17 18:48:04.608285 systemd[1]: Created slice system-modprobe.slice. Mar 17 18:48:04.608297 systemd[1]: Created slice system-serial\x2dgetty.slice. Mar 17 18:48:04.608310 systemd[1]: Created slice system-system\x2dcloudinit.slice. Mar 17 18:48:04.608320 systemd[1]: Created slice system-systemd\x2dfsck.slice. Mar 17 18:48:04.608332 systemd[1]: Created slice user.slice. Mar 17 18:48:04.608345 systemd[1]: Started systemd-ask-password-console.path. Mar 17 18:48:04.608357 systemd[1]: Started systemd-ask-password-wall.path. Mar 17 18:48:04.608368 systemd[1]: Set up automount boot.automount. Mar 17 18:48:04.608381 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount. Mar 17 18:48:04.608391 systemd[1]: Reached target integritysetup.target. Mar 17 18:48:04.608403 systemd[1]: Reached target remote-cryptsetup.target. Mar 17 18:48:04.608416 systemd[1]: Reached target remote-fs.target. Mar 17 18:48:04.608426 systemd[1]: Reached target slices.target. Mar 17 18:48:04.608438 systemd[1]: Reached target swap.target. Mar 17 18:48:04.608513 systemd[1]: Reached target torcx.target. Mar 17 18:48:04.608530 systemd[1]: Reached target veritysetup.target. Mar 17 18:48:04.608541 systemd[1]: Listening on systemd-coredump.socket. Mar 17 18:48:04.608552 systemd[1]: Listening on systemd-initctl.socket. Mar 17 18:48:04.608564 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:48:04.608575 kernel: audit: type=1400 audit(1742237284.277:89): avc: denied { audit_read } for pid=1 comm="systemd" capability=37 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Mar 17 18:48:04.608586 systemd[1]: Listening on systemd-journald-audit.socket. Mar 17 18:48:04.608598 kernel: audit: type=1335 audit(1742237284.277:90): pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Mar 17 18:48:04.608612 systemd[1]: Listening on systemd-journald-dev-log.socket. Mar 17 18:48:04.608623 systemd[1]: Listening on systemd-journald.socket. Mar 17 18:48:04.608634 systemd[1]: Listening on systemd-networkd.socket. Mar 17 18:48:04.608645 systemd[1]: Listening on systemd-udevd-control.socket. Mar 17 18:48:04.608657 systemd[1]: Listening on systemd-udevd-kernel.socket. Mar 17 18:48:04.608679 systemd[1]: Listening on systemd-userdbd.socket. Mar 17 18:48:04.608692 systemd[1]: Mounting dev-hugepages.mount... Mar 17 18:48:04.608704 systemd[1]: Mounting dev-mqueue.mount... Mar 17 18:48:04.608714 systemd[1]: Mounting media.mount... Mar 17 18:48:04.608727 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:48:04.608739 systemd[1]: Mounting sys-kernel-debug.mount... Mar 17 18:48:04.608749 systemd[1]: Mounting sys-kernel-tracing.mount... Mar 17 18:48:04.608762 systemd[1]: Mounting tmp.mount... 
Mar 17 18:48:04.608773 systemd[1]: Starting flatcar-tmpfiles.service... Mar 17 18:48:04.608786 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:48:04.608799 systemd[1]: Starting kmod-static-nodes.service... Mar 17 18:48:04.608812 systemd[1]: Starting modprobe@configfs.service... Mar 17 18:48:04.608822 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:48:04.608835 systemd[1]: Starting modprobe@drm.service... Mar 17 18:48:04.608849 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:48:04.608858 systemd[1]: Starting modprobe@fuse.service... Mar 17 18:48:04.608871 systemd[1]: Starting modprobe@loop.service... Mar 17 18:48:04.608884 systemd[1]: setup-nsswitch.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Mar 17 18:48:04.608897 systemd[1]: systemd-journald.service: unit configures an IP firewall, but the local system does not support BPF/cgroup firewalling. Mar 17 18:48:04.608909 systemd[1]: (This warning is only shown for the first unit using IP firewalling.) Mar 17 18:48:04.608922 systemd[1]: Starting systemd-journald.service... Mar 17 18:48:04.608932 kernel: loop: module loaded Mar 17 18:48:04.608944 systemd[1]: Starting systemd-modules-load.service... Mar 17 18:48:04.608957 systemd[1]: Starting systemd-network-generator.service... Mar 17 18:48:04.608967 systemd[1]: Starting systemd-remount-fs.service... Mar 17 18:48:04.608980 systemd[1]: Starting systemd-udev-trigger.service... Mar 17 18:48:04.608994 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:48:04.609005 systemd[1]: Mounted dev-hugepages.mount. Mar 17 18:48:04.609017 systemd[1]: Mounted dev-mqueue.mount. Mar 17 18:48:04.609029 systemd[1]: Mounted media.mount. Mar 17 18:48:04.609040 systemd[1]: Mounted sys-kernel-debug.mount. Mar 17 18:48:04.609051 systemd[1]: Mounted sys-kernel-tracing.mount. Mar 17 18:48:04.609064 kernel: fuse: init (API version 7.34) Mar 17 18:48:04.609075 systemd[1]: Mounted tmp.mount. Mar 17 18:48:04.609086 systemd[1]: Finished flatcar-tmpfiles.service. Mar 17 18:48:04.609101 systemd[1]: Finished kmod-static-nodes.service. Mar 17 18:48:04.609113 kernel: audit: type=1130 audit(1742237284.602:91): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.609128 systemd-journald[1169]: Journal started Mar 17 18:48:04.609175 systemd-journald[1169]: Runtime Journal (/run/log/journal/7219972b8f0947a99ace88d218068f9e) is 8.0M, max 159.0M, 151.0M free. Mar 17 18:48:04.277000 audit[1]: AVC avc: denied { audit_read } for pid=1 comm="systemd" capability=37 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=1 Mar 17 18:48:04.277000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1 Mar 17 18:48:04.602000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:48:04.602000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Mar 17 18:48:04.602000 audit[1169]: SYSCALL arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffd1641f030 a2=4000 a3=7ffd1641f0cc items=0 ppid=1 pid=1169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:48:04.602000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Mar 17 18:48:04.625363 kernel: audit: type=1305 audit(1742237284.602:92): op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Mar 17 18:48:04.625403 kernel: audit: type=1300 audit(1742237284.602:92): arch=c000003e syscall=46 success=yes exit=60 a0=3 a1=7ffd1641f030 a2=4000 a3=7ffd1641f0cc items=0 ppid=1 pid=1169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:48:04.625418 kernel: audit: type=1327 audit(1742237284.602:92): proctitle="/usr/lib/systemd/systemd-journald" Mar 17 18:48:04.649000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.661485 kernel: audit: type=1130 audit(1742237284.649:93): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.661519 systemd[1]: Started systemd-journald.service. Mar 17 18:48:04.668047 systemd[1]: modprobe@configfs.service: Deactivated successfully. Mar 17 18:48:04.666000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.674642 systemd[1]: Finished modprobe@configfs.service. Mar 17 18:48:04.681474 kernel: audit: type=1130 audit(1742237284.666:94): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.682000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.683203 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:48:04.696463 kernel: audit: type=1130 audit(1742237284.682:95): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.683362 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:48:04.682000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.699053 systemd[1]: modprobe@drm.service: Deactivated successfully. 
Mar 17 18:48:04.699268 systemd[1]: Finished modprobe@drm.service. Mar 17 18:48:04.717006 kernel: audit: type=1131 audit(1742237284.682:96): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.698000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.698000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.713000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.713000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.716000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.716000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.714384 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:48:04.714589 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:48:04.716933 systemd[1]: modprobe@fuse.service: Deactivated successfully. Mar 17 18:48:04.719000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.719000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.723000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.723000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.726000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.717742 systemd[1]: Finished modprobe@fuse.service. Mar 17 18:48:04.720260 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:48:04.721993 systemd[1]: Finished modprobe@loop.service. Mar 17 18:48:04.724353 systemd[1]: Finished systemd-modules-load.service. 
Mar 17 18:48:04.726944 systemd[1]: Finished systemd-network-generator.service. Mar 17 18:48:04.728000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.731000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.729829 systemd[1]: Finished systemd-remount-fs.service. Mar 17 18:48:04.732292 systemd[1]: Finished systemd-udev-trigger.service. Mar 17 18:48:04.733000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.734851 systemd[1]: Reached target network-pre.target. Mar 17 18:48:04.738019 systemd[1]: Mounting sys-fs-fuse-connections.mount... Mar 17 18:48:04.741640 systemd[1]: Mounting sys-kernel-config.mount... Mar 17 18:48:04.743659 systemd[1]: remount-root.service was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Mar 17 18:48:04.763077 systemd[1]: Starting systemd-hwdb-update.service... Mar 17 18:48:04.766306 systemd[1]: Starting systemd-journal-flush.service... Mar 17 18:48:04.768411 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:48:04.769478 systemd[1]: Starting systemd-random-seed.service... Mar 17 18:48:04.771398 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:48:04.772491 systemd[1]: Starting systemd-sysctl.service... Mar 17 18:48:04.775574 systemd[1]: Starting systemd-sysusers.service... Mar 17 18:48:04.779072 systemd[1]: Starting systemd-udev-settle.service... Mar 17 18:48:04.785772 systemd[1]: Mounted sys-fs-fuse-connections.mount. Mar 17 18:48:04.788345 systemd[1]: Mounted sys-kernel-config.mount. Mar 17 18:48:04.793448 udevadm[1208]: systemd-udev-settle.service is deprecated. Please fix lvm2-activation.service, lvm2-activation-early.service not to pull it in. Mar 17 18:48:04.824000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.822669 systemd[1]: Finished systemd-random-seed.service. Mar 17 18:48:04.825589 systemd[1]: Reached target first-boot-complete.target. Mar 17 18:48:04.843946 systemd-journald[1169]: Time spent on flushing to /var/log/journal/7219972b8f0947a99ace88d218068f9e is 28.894ms for 1091 entries. Mar 17 18:48:04.843946 systemd-journald[1169]: System Journal (/var/log/journal/7219972b8f0947a99ace88d218068f9e) is 8.0M, max 2.6G, 2.6G free. Mar 17 18:48:04.915441 systemd-journald[1169]: Received client request to flush runtime journal. Mar 17 18:48:04.901000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.899679 systemd[1]: Finished systemd-sysctl.service. 
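systemd-journald reports above that flushing the runtime journal to /var/log/journal took 28.894 ms for 1091 entries, which works out to roughly 26 microseconds per entry. A tiny check that pulls the numbers out of a status line in this exact format:

    # Sketch: per-entry cost of the journal flush reported above.
    import re

    line = ("systemd-journald[1169]: Time spent on flushing to "
            "/var/log/journal/7219972b8f0947a99ace88d218068f9e is 28.894ms for 1091 entries.")
    m = re.search(r"is ([\d.]+)ms for (\d+) entries", line)
    ms, entries = float(m.group(1)), int(m.group(2))
    print(f"{ms / entries * 1000:.1f} us per entry")   # ~26.5 us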
Mar 17 18:48:04.918000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:04.916490 systemd[1]: Finished systemd-journal-flush.service. Mar 17 18:48:05.356546 systemd[1]: Finished systemd-sysusers.service. Mar 17 18:48:05.359000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:05.361162 systemd[1]: Starting systemd-tmpfiles-setup-dev.service... Mar 17 18:48:05.709000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:05.707181 systemd[1]: Finished systemd-tmpfiles-setup-dev.service. Mar 17 18:48:05.962022 systemd[1]: Finished systemd-hwdb-update.service. Mar 17 18:48:05.964000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:05.966292 systemd[1]: Starting systemd-udevd.service... Mar 17 18:48:05.986679 systemd-udevd[1219]: Using default interface naming scheme 'v252'. Mar 17 18:48:06.262298 systemd[1]: Started systemd-udevd.service. Mar 17 18:48:06.264000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:06.267329 systemd[1]: Starting systemd-networkd.service... Mar 17 18:48:06.301832 systemd[1]: Found device dev-ttyS0.device. Mar 17 18:48:06.355474 kernel: mousedev: PS/2 mouse device common for all mice Mar 17 18:48:06.360399 systemd[1]: Starting systemd-userdbd.service... 
Mar 17 18:48:06.360000 audit[1229]: AVC avc: denied { confidentiality } for pid=1229 comm="(udev-worker)" lockdown_reason="use of tracefs" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=1 Mar 17 18:48:06.376478 kernel: hv_vmbus: registering driver hv_balloon Mar 17 18:48:06.394706 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0 Mar 17 18:48:06.360000 audit[1229]: SYSCALL arch=c000003e syscall=175 success=yes exit=0 a0=564f7c71f3d0 a1=f884 a2=7f1dea67dbc5 a3=5 items=12 ppid=1219 pid=1229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="(udev-worker)" exe="/usr/bin/udevadm" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:48:06.406780 kernel: hv_vmbus: registering driver hyperv_fb Mar 17 18:48:06.360000 audit: CWD cwd="/" Mar 17 18:48:06.360000 audit: PATH item=0 name=(null) inode=235 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=1 name=(null) inode=14828 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=2 name=(null) inode=14828 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.417026 kernel: hyperv_fb: Synthvid Version major 3, minor 5 Mar 17 18:48:06.417078 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608 Mar 17 18:48:06.360000 audit: PATH item=3 name=(null) inode=14829 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=4 name=(null) inode=14828 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=5 name=(null) inode=14830 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=6 name=(null) inode=14828 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=7 name=(null) inode=14831 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=8 name=(null) inode=14828 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=9 name=(null) inode=14832 dev=00:0b mode=0100640 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=10 name=(null) inode=14828 dev=00:0b mode=040750 ouid=0 ogid=0 rdev=00:00 obj=system_u:object_r:tracefs_t:s0 nametype=PARENT cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PATH item=11 name=(null) inode=14833 dev=00:0b mode=0100440 ouid=0 ogid=0 rdev=00:00 
obj=system_u:object_r:tracefs_t:s0 nametype=CREATE cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:48:06.360000 audit: PROCTITLE proctitle="(udev-worker)" Mar 17 18:48:06.424159 kernel: Console: switching to colour dummy device 80x25 Mar 17 18:48:06.428517 kernel: hv_utils: Registering HyperV Utility Driver Mar 17 18:48:06.739135 kernel: hv_vmbus: registering driver hv_utils Mar 17 18:48:06.739170 kernel: Console: switching to colour frame buffer device 128x48 Mar 17 18:48:06.739193 kernel: hv_utils: Shutdown IC version 3.2 Mar 17 18:48:06.739214 kernel: hv_utils: Heartbeat IC version 3.0 Mar 17 18:48:06.739235 kernel: hv_utils: TimeSync IC version 4.0 Mar 17 18:48:06.745970 systemd[1]: Started systemd-userdbd.service. Mar 17 18:48:06.746000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:07.009008 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device. Mar 17 18:48:07.055887 kernel: KVM: vmx: using Hyper-V Enlightened VMCS Mar 17 18:48:07.093870 systemd-networkd[1226]: lo: Link UP Mar 17 18:48:07.094160 systemd-networkd[1226]: lo: Gained carrier Mar 17 18:48:07.094654 systemd-networkd[1226]: Enumeration completed Mar 17 18:48:07.094851 systemd[1]: Started systemd-networkd.service. Mar 17 18:48:07.095000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:07.098780 systemd[1]: Starting systemd-networkd-wait-online.service... Mar 17 18:48:07.116181 systemd[1]: Finished systemd-udev-settle.service. Mar 17 18:48:07.117000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-settle comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:07.119946 systemd[1]: Starting lvm2-activation-early.service... Mar 17 18:48:07.124994 systemd-networkd[1226]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 18:48:07.176686 kernel: mlx5_core 7e43:00:02.0 enP32323s1: Link up Mar 17 18:48:07.197691 kernel: hv_netvsc 7c1e5237-d8af-7c1e-5237-d8af7c1e5237 eth0: Data path switched to VF: enP32323s1 Mar 17 18:48:07.198450 systemd-networkd[1226]: enP32323s1: Link UP Mar 17 18:48:07.198771 systemd-networkd[1226]: eth0: Link UP Mar 17 18:48:07.198889 systemd-networkd[1226]: eth0: Gained carrier Mar 17 18:48:07.203945 systemd-networkd[1226]: enP32323s1: Gained carrier Mar 17 18:48:07.231826 systemd-networkd[1226]: eth0: DHCPv4 address 10.200.8.36/24, gateway 10.200.8.1 acquired from 168.63.129.16 Mar 17 18:48:07.481239 lvm[1298]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. Mar 17 18:48:07.511187 systemd[1]: Finished lvm2-activation-early.service. Mar 17 18:48:07.512000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:07.514036 systemd[1]: Reached target cryptsetup.target. Mar 17 18:48:07.517897 systemd[1]: Starting lvm2-activation.service... Mar 17 18:48:07.523273 lvm[1300]: WARNING: Failed to connect to lvmetad. Falling back to device scanning. 
Mar 17 18:48:07.542000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=lvm2-activation comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:07.541827 systemd[1]: Finished lvm2-activation.service. Mar 17 18:48:07.544423 systemd[1]: Reached target local-fs-pre.target. Mar 17 18:48:07.546717 systemd[1]: var-lib-machines.mount was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Mar 17 18:48:07.546752 systemd[1]: Reached target local-fs.target. Mar 17 18:48:07.548889 systemd[1]: Reached target machines.target. Mar 17 18:48:07.552280 systemd[1]: Starting ldconfig.service... Mar 17 18:48:07.554773 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:48:07.554889 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:48:07.556077 systemd[1]: Starting systemd-boot-update.service... Mar 17 18:48:07.559223 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service... Mar 17 18:48:07.562812 systemd[1]: Starting systemd-machine-id-commit.service... Mar 17 18:48:07.566276 systemd[1]: Starting systemd-sysext.service... Mar 17 18:48:07.597593 systemd[1]: boot.automount: Got automount request for /boot, triggered by 1303 (bootctl) Mar 17 18:48:07.599163 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service... Mar 17 18:48:08.020324 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service. Mar 17 18:48:08.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.025488 systemd[1]: Unmounting usr-share-oem.mount... Mar 17 18:48:08.030939 systemd[1]: usr-share-oem.mount: Deactivated successfully. Mar 17 18:48:08.031265 systemd[1]: Unmounted usr-share-oem.mount. Mar 17 18:48:08.101688 kernel: loop0: detected capacity change from 0 to 210664 Mar 17 18:48:08.134689 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Mar 17 18:48:08.149691 kernel: loop1: detected capacity change from 0 to 210664 Mar 17 18:48:08.153577 (sd-sysext)[1318]: Using extensions 'kubernetes'. Mar 17 18:48:08.155072 (sd-sysext)[1318]: Merged extensions into '/usr'. Mar 17 18:48:08.173341 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:48:08.175262 systemd[1]: Mounting usr-share-oem.mount... Mar 17 18:48:08.177693 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:48:08.179513 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:48:08.183024 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:48:08.186653 systemd[1]: Starting modprobe@loop.service... Mar 17 18:48:08.188949 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:48:08.189166 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). 
Mar 17 18:48:08.189377 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:48:08.190780 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:48:08.194000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.194000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.190978 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:48:08.201886 systemd[1]: Mounted usr-share-oem.mount. Mar 17 18:48:08.207317 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:48:08.207520 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:48:08.208000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.208000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.210287 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:48:08.210493 systemd[1]: Finished modprobe@loop.service. Mar 17 18:48:08.211000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.211000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.214218 systemd[1]: Finished systemd-sysext.service. Mar 17 18:48:08.215000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.219844 systemd[1]: Starting ensure-sysext.service... Mar 17 18:48:08.222052 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:48:08.222249 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:48:08.223835 systemd[1]: Starting systemd-tmpfiles-setup.service... Mar 17 18:48:08.236095 systemd[1]: Reloading. Mar 17 18:48:08.264625 systemd-tmpfiles[1332]: /usr/lib/tmpfiles.d/legacy.conf:13: Duplicate line for path "/run/lock", ignoring. Mar 17 18:48:08.280598 systemd-tmpfiles[1332]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. 
Mar 17 18:48:08.282110 /usr/lib/systemd/system-generators/torcx-generator[1353]: time="2025-03-17T18:48:08Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 18:48:08.282565 /usr/lib/systemd/system-generators/torcx-generator[1353]: time="2025-03-17T18:48:08Z" level=info msg="torcx already run" Mar 17 18:48:08.307232 systemd-tmpfiles[1332]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Mar 17 18:48:08.324087 systemd-networkd[1226]: eth0: Gained IPv6LL Mar 17 18:48:08.407611 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:48:08.407632 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:48:08.424710 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:48:08.486881 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Mar 17 18:48:08.497537 systemd[1]: Finished systemd-networkd-wait-online.service. Mar 17 18:48:08.499000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-networkd-wait-online comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.501129 systemd[1]: Finished systemd-machine-id-commit.service. Mar 17 18:48:08.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.513143 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:48:08.513478 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:48:08.514875 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:48:08.518640 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:48:08.522657 systemd[1]: Starting modprobe@loop.service... Mar 17 18:48:08.524833 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:48:08.525021 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:48:08.525210 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:48:08.526545 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:48:08.526757 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:48:08.527000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:48:08.527000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.530114 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:48:08.530404 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:48:08.531000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.531000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.533831 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:48:08.534134 systemd[1]: Finished modprobe@loop.service. Mar 17 18:48:08.535000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.535000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.542446 systemd[1]: proc-xen.mount was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:48:08.542809 systemd[1]: ignition-delete-config.service was skipped because no trigger condition checks were met. Mar 17 18:48:08.544101 systemd[1]: Starting modprobe@dm_mod.service... Mar 17 18:48:08.547562 systemd[1]: Starting modprobe@drm.service... Mar 17 18:48:08.550994 systemd[1]: Starting modprobe@efi_pstore.service... Mar 17 18:48:08.554567 systemd[1]: Starting modprobe@loop.service... Mar 17 18:48:08.556758 systemd[1]: systemd-binfmt.service was skipped because no trigger condition checks were met. Mar 17 18:48:08.557016 systemd[1]: systemd-boot-system-token.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/LoaderFeatures-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:48:08.557351 systemd[1]: xenserver-pv-version.service was skipped because of an unmet condition check (ConditionVirtualization=xen). Mar 17 18:48:08.559355 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Mar 17 18:48:08.559569 systemd[1]: Finished modprobe@dm_mod.service. Mar 17 18:48:08.560000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.560000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.562429 systemd[1]: modprobe@drm.service: Deactivated successfully. Mar 17 18:48:08.562611 systemd[1]: Finished modprobe@drm.service. Mar 17 18:48:08.563000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:48:08.563000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.565601 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Mar 17 18:48:08.565815 systemd[1]: Finished modprobe@efi_pstore.service. Mar 17 18:48:08.566000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.566000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.569149 systemd[1]: modprobe@loop.service: Deactivated successfully. Mar 17 18:48:08.569453 systemd[1]: Finished modprobe@loop.service. Mar 17 18:48:08.570000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.570000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.573800 systemd[1]: Finished ensure-sysext.service. Mar 17 18:48:08.574000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.577568 systemd[1]: systemd-pstore.service was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Mar 17 18:48:08.577750 systemd[1]: systemd-repart.service was skipped because no trigger condition checks were met. Mar 17 18:48:08.856385 systemd-fsck[1315]: fsck.fat 4.2 (2021-01-31) Mar 17 18:48:08.856385 systemd-fsck[1315]: /dev/sda1: 789 files, 119299/258078 clusters Mar 17 18:48:08.858694 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM.service. Mar 17 18:48:08.863275 systemd[1]: Mounting boot.mount... Mar 17 18:48:08.860000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-EFI\x2dSYSTEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:08.905326 systemd[1]: Mounted boot.mount. Mar 17 18:48:08.921484 systemd[1]: Finished systemd-boot-update.service. Mar 17 18:48:08.922000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-boot-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:09.040276 systemd[1]: Finished systemd-tmpfiles-setup.service. Mar 17 18:48:09.041000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:09.045114 systemd[1]: Starting audit-rules.service... Mar 17 18:48:09.048827 systemd[1]: Starting clean-ca-certificates.service... 
Mar 17 18:48:09.052556 systemd[1]: Starting systemd-journal-catalog-update.service... Mar 17 18:48:09.057005 systemd[1]: Starting systemd-resolved.service... Mar 17 18:48:09.061629 systemd[1]: Starting systemd-timesyncd.service... Mar 17 18:48:09.067462 systemd[1]: Starting systemd-update-utmp.service... Mar 17 18:48:09.070657 systemd[1]: Finished clean-ca-certificates.service. Mar 17 18:48:09.071000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:09.073641 systemd[1]: update-ca-certificates.service was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Mar 17 18:48:09.093000 audit[1456]: SYSTEM_BOOT pid=1456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Mar 17 18:48:09.098702 systemd[1]: Finished systemd-update-utmp.service. Mar 17 18:48:09.099000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:09.201992 systemd[1]: Started systemd-timesyncd.service. Mar 17 18:48:09.202000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-timesyncd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:09.204798 systemd[1]: Reached target time-set.target. Mar 17 18:48:09.239859 systemd[1]: Finished systemd-journal-catalog-update.service. Mar 17 18:48:09.241000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:09.274642 systemd-resolved[1454]: Positive Trust Anchors: Mar 17 18:48:09.274678 systemd-resolved[1454]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Mar 17 18:48:09.274727 systemd-resolved[1454]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa corp home internal intranet lan local private test Mar 17 18:48:09.324042 systemd-resolved[1454]: Using system hostname 'ci-3510.3.7-a-b312ad98ee'. Mar 17 18:48:09.325645 systemd[1]: Started systemd-resolved.service. Mar 17 18:48:09.326000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:48:09.327994 systemd[1]: Reached target network.target. 
Mar 17 18:48:09.329465 augenrules[1474]: No rules Mar 17 18:48:09.327000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Mar 17 18:48:09.327000 audit[1474]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7fff165262b0 a2=420 a3=0 items=0 ppid=1450 pid=1474 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:48:09.327000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Mar 17 18:48:09.330166 systemd[1]: Reached target network-online.target. Mar 17 18:48:09.332485 systemd[1]: Reached target nss-lookup.target. Mar 17 18:48:09.335163 systemd[1]: Finished audit-rules.service. Mar 17 18:48:09.475814 systemd-timesyncd[1455]: Contacted time server 193.1.8.106:123 (0.flatcar.pool.ntp.org). Mar 17 18:48:09.476030 systemd-timesyncd[1455]: Initial clock synchronization to Mon 2025-03-17 18:48:09.475393 UTC. Mar 17 18:48:14.968003 ldconfig[1302]: /sbin/ldconfig: /lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Mar 17 18:48:14.976682 systemd[1]: Finished ldconfig.service. Mar 17 18:48:14.980792 systemd[1]: Starting systemd-update-done.service... Mar 17 18:48:14.988759 systemd[1]: Finished systemd-update-done.service. Mar 17 18:48:14.991388 systemd[1]: Reached target sysinit.target. Mar 17 18:48:14.993561 systemd[1]: Started motdgen.path. Mar 17 18:48:14.995358 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path. Mar 17 18:48:14.998556 systemd[1]: Started logrotate.timer. Mar 17 18:48:15.000628 systemd[1]: Started mdadm.timer. Mar 17 18:48:15.002362 systemd[1]: Started systemd-tmpfiles-clean.timer. Mar 17 18:48:15.004645 systemd[1]: update-engine-stub.timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Mar 17 18:48:15.004709 systemd[1]: Reached target paths.target. Mar 17 18:48:15.006742 systemd[1]: Reached target timers.target. Mar 17 18:48:15.008772 systemd[1]: Listening on dbus.socket. Mar 17 18:48:15.011475 systemd[1]: Starting docker.socket... Mar 17 18:48:15.032391 systemd[1]: Listening on sshd.socket. Mar 17 18:48:15.034632 systemd[1]: systemd-pcrphase-sysinit.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:48:15.035207 systemd[1]: Listening on docker.socket. Mar 17 18:48:15.037259 systemd[1]: Reached target sockets.target. Mar 17 18:48:15.039346 systemd[1]: Reached target basic.target. Mar 17 18:48:15.041677 systemd[1]: System is tainted: cgroupsv1 Mar 17 18:48:15.041742 systemd[1]: addon-config@usr-share-oem.service was skipped because no trigger condition checks were met. Mar 17 18:48:15.041785 systemd[1]: addon-run@usr-share-oem.service was skipped because no trigger condition checks were met. Mar 17 18:48:15.042936 systemd[1]: Starting containerd.service... Mar 17 18:48:15.046413 systemd[1]: Starting dbus.service... Mar 17 18:48:15.049554 systemd[1]: Starting enable-oem-cloudinit.service... Mar 17 18:48:15.053705 systemd[1]: Starting extend-filesystems.service... Mar 17 18:48:15.055907 systemd[1]: flatcar-setup-environment.service was skipped because of an unmet condition check (ConditionPathExists=/usr/share/oem/bin/flatcar-setup-environment). 
Mar 17 18:48:15.057815 systemd[1]: Starting kubelet.service... Mar 17 18:48:15.061362 systemd[1]: Starting motdgen.service... Mar 17 18:48:15.065203 systemd[1]: Started nvidia.service. Mar 17 18:48:15.068965 systemd[1]: Starting prepare-helm.service... Mar 17 18:48:15.072396 systemd[1]: Starting ssh-key-proc-cmdline.service... Mar 17 18:48:15.076249 systemd[1]: Starting sshd-keygen.service... Mar 17 18:48:15.085104 systemd[1]: Starting systemd-logind.service... Mar 17 18:48:15.086871 systemd[1]: systemd-pcrphase.service was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f). Mar 17 18:48:15.086977 systemd[1]: tcsd.service was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Mar 17 18:48:15.089116 systemd[1]: Starting update-engine.service... Mar 17 18:48:15.092391 systemd[1]: Starting update-ssh-keys-after-ignition.service... Mar 17 18:48:15.100408 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Mar 17 18:48:15.100831 systemd[1]: Finished ssh-key-proc-cmdline.service. Mar 17 18:48:15.129879 jq[1488]: false Mar 17 18:48:15.130369 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Mar 17 18:48:15.132030 jq[1503]: true Mar 17 18:48:15.130706 systemd[1]: Condition check resulted in enable-oem-cloudinit.service being skipped. Mar 17 18:48:15.181219 jq[1516]: true Mar 17 18:48:15.210709 extend-filesystems[1489]: Found loop1 Mar 17 18:48:15.210709 extend-filesystems[1489]: Found sda Mar 17 18:48:15.210709 extend-filesystems[1489]: Found sda1 Mar 17 18:48:15.210709 extend-filesystems[1489]: Found sda2 Mar 17 18:48:15.210709 extend-filesystems[1489]: Found sda3 Mar 17 18:48:15.210709 extend-filesystems[1489]: Found usr Mar 17 18:48:15.210709 extend-filesystems[1489]: Found sda4 Mar 17 18:48:15.210709 extend-filesystems[1489]: Found sda6 Mar 17 18:48:15.210709 extend-filesystems[1489]: Found sda7 Mar 17 18:48:15.210709 extend-filesystems[1489]: Found sda9 Mar 17 18:48:15.210709 extend-filesystems[1489]: Checking size of /dev/sda9 Mar 17 18:48:15.216054 systemd[1]: motdgen.service: Deactivated successfully. Mar 17 18:48:15.271903 tar[1509]: linux-amd64/helm Mar 17 18:48:15.216364 systemd[1]: Finished motdgen.service. Mar 17 18:48:15.288467 extend-filesystems[1489]: Old size kept for /dev/sda9 Mar 17 18:48:15.290956 extend-filesystems[1489]: Found sr0 Mar 17 18:48:15.289328 systemd[1]: extend-filesystems.service: Deactivated successfully. Mar 17 18:48:15.289568 systemd[1]: Finished extend-filesystems.service. Mar 17 18:48:15.300042 systemd-logind[1500]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard) Mar 17 18:48:15.301955 systemd-logind[1500]: New seat seat0. Mar 17 18:48:15.315298 env[1524]: time="2025-03-17T18:48:15.315249420Z" level=info msg="starting containerd" revision=92b3a9d6f1b3bcc6dc74875cfdea653fe39f09c2 version=1.6.16 Mar 17 18:48:15.390802 dbus-daemon[1487]: [system] SELinux support is enabled Mar 17 18:48:15.391041 systemd[1]: Started dbus.service. Mar 17 18:48:15.397223 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Mar 17 18:48:15.397269 systemd[1]: Reached target system-config.target. Mar 17 18:48:15.399556 systemd[1]: user-cloudinit-proc-cmdline.service was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). 
Mar 17 18:48:15.399594 systemd[1]: Reached target user-config.target. Mar 17 18:48:15.403110 systemd[1]: Started systemd-logind.service. Mar 17 18:48:15.404570 dbus-daemon[1487]: [system] Successfully activated service 'org.freedesktop.systemd1' Mar 17 18:48:15.465135 systemd[1]: nvidia.service: Deactivated successfully. Mar 17 18:48:15.480922 env[1524]: time="2025-03-17T18:48:15.480879923Z" level=info msg="loading plugin \"io.containerd.content.v1.content\"..." type=io.containerd.content.v1 Mar 17 18:48:15.484703 bash[1550]: Updated "/home/core/.ssh/authorized_keys" Mar 17 18:48:15.485647 systemd[1]: Finished update-ssh-keys-after-ignition.service. Mar 17 18:48:15.493887 env[1524]: time="2025-03-17T18:48:15.493848544Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.aufs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:48:15.497015 env[1524]: time="2025-03-17T18:48:15.496961100Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.aufs\"..." error="aufs is not supported (modprobe aufs failed: exit status 1 \"modprobe: FATAL: Module aufs not found in directory /lib/modules/5.15.179-flatcar\\n\"): skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:48:15.497099 env[1524]: time="2025-03-17T18:48:15.497014500Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:48:15.497388 env[1524]: time="2025-03-17T18:48:15.497359395Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.btrfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:48:15.497458 env[1524]: time="2025-03-17T18:48:15.497391595Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.devmapper\"..." type=io.containerd.snapshotter.v1 Mar 17 18:48:15.497458 env[1524]: time="2025-03-17T18:48:15.497414294Z" level=warning msg="failed to load plugin io.containerd.snapshotter.v1.devmapper" error="devmapper not configured" Mar 17 18:48:15.497458 env[1524]: time="2025-03-17T18:48:15.497428794Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.native\"..." type=io.containerd.snapshotter.v1 Mar 17 18:48:15.497570 env[1524]: time="2025-03-17T18:48:15.497522493Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.overlayfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:48:15.499252 env[1524]: time="2025-03-17T18:48:15.497809689Z" level=info msg="loading plugin \"io.containerd.snapshotter.v1.zfs\"..." type=io.containerd.snapshotter.v1 Mar 17 18:48:15.499458 env[1524]: time="2025-03-17T18:48:15.499426066Z" level=info msg="skip loading plugin \"io.containerd.snapshotter.v1.zfs\"..." error="path /var/lib/containerd/io.containerd.snapshotter.v1.zfs must be a zfs filesystem to be used with the zfs snapshotter: skip plugin" type=io.containerd.snapshotter.v1 Mar 17 18:48:15.499522 env[1524]: time="2025-03-17T18:48:15.499462366Z" level=info msg="loading plugin \"io.containerd.metadata.v1.bolt\"..." 
type=io.containerd.metadata.v1 Mar 17 18:48:15.499568 env[1524]: time="2025-03-17T18:48:15.499534565Z" level=warning msg="could not use snapshotter devmapper in metadata plugin" error="devmapper not configured" Mar 17 18:48:15.499568 env[1524]: time="2025-03-17T18:48:15.499551265Z" level=info msg="metadata content store policy set" policy=shared Mar 17 18:48:15.521537 env[1524]: time="2025-03-17T18:48:15.521490860Z" level=info msg="loading plugin \"io.containerd.differ.v1.walking\"..." type=io.containerd.differ.v1 Mar 17 18:48:15.521692 env[1524]: time="2025-03-17T18:48:15.521545660Z" level=info msg="loading plugin \"io.containerd.event.v1.exchange\"..." type=io.containerd.event.v1 Mar 17 18:48:15.521692 env[1524]: time="2025-03-17T18:48:15.521573959Z" level=info msg="loading plugin \"io.containerd.gc.v1.scheduler\"..." type=io.containerd.gc.v1 Mar 17 18:48:15.521692 env[1524]: time="2025-03-17T18:48:15.521629958Z" level=info msg="loading plugin \"io.containerd.service.v1.introspection-service\"..." type=io.containerd.service.v1 Mar 17 18:48:15.521692 env[1524]: time="2025-03-17T18:48:15.521650058Z" level=info msg="loading plugin \"io.containerd.service.v1.containers-service\"..." type=io.containerd.service.v1 Mar 17 18:48:15.521886 env[1524]: time="2025-03-17T18:48:15.521724657Z" level=info msg="loading plugin \"io.containerd.service.v1.content-service\"..." type=io.containerd.service.v1 Mar 17 18:48:15.521886 env[1524]: time="2025-03-17T18:48:15.521746657Z" level=info msg="loading plugin \"io.containerd.service.v1.diff-service\"..." type=io.containerd.service.v1 Mar 17 18:48:15.521886 env[1524]: time="2025-03-17T18:48:15.521765557Z" level=info msg="loading plugin \"io.containerd.service.v1.images-service\"..." type=io.containerd.service.v1 Mar 17 18:48:15.521886 env[1524]: time="2025-03-17T18:48:15.521783356Z" level=info msg="loading plugin \"io.containerd.service.v1.leases-service\"..." type=io.containerd.service.v1 Mar 17 18:48:15.521886 env[1524]: time="2025-03-17T18:48:15.521804956Z" level=info msg="loading plugin \"io.containerd.service.v1.namespaces-service\"..." type=io.containerd.service.v1 Mar 17 18:48:15.521886 env[1524]: time="2025-03-17T18:48:15.521823956Z" level=info msg="loading plugin \"io.containerd.service.v1.snapshots-service\"..." type=io.containerd.service.v1 Mar 17 18:48:15.521886 env[1524]: time="2025-03-17T18:48:15.521841755Z" level=info msg="loading plugin \"io.containerd.runtime.v1.linux\"..." type=io.containerd.runtime.v1 Mar 17 18:48:15.522137 env[1524]: time="2025-03-17T18:48:15.521994053Z" level=info msg="loading plugin \"io.containerd.runtime.v2.task\"..." type=io.containerd.runtime.v2 Mar 17 18:48:15.522137 env[1524]: time="2025-03-17T18:48:15.522096752Z" level=info msg="loading plugin \"io.containerd.monitor.v1.cgroups\"..." type=io.containerd.monitor.v1 Mar 17 18:48:15.522630 env[1524]: time="2025-03-17T18:48:15.522593145Z" level=info msg="loading plugin \"io.containerd.service.v1.tasks-service\"..." type=io.containerd.service.v1 Mar 17 18:48:15.522726 env[1524]: time="2025-03-17T18:48:15.522648244Z" level=info msg="loading plugin \"io.containerd.grpc.v1.introspection\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.522726 env[1524]: time="2025-03-17T18:48:15.522682244Z" level=info msg="loading plugin \"io.containerd.internal.v1.restart\"..." type=io.containerd.internal.v1 Mar 17 18:48:15.522810 env[1524]: time="2025-03-17T18:48:15.522742643Z" level=info msg="loading plugin \"io.containerd.grpc.v1.containers\"..." 
type=io.containerd.grpc.v1 Mar 17 18:48:15.522810 env[1524]: time="2025-03-17T18:48:15.522763543Z" level=info msg="loading plugin \"io.containerd.grpc.v1.content\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.522810 env[1524]: time="2025-03-17T18:48:15.522782042Z" level=info msg="loading plugin \"io.containerd.grpc.v1.diff\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.522810 env[1524]: time="2025-03-17T18:48:15.522801442Z" level=info msg="loading plugin \"io.containerd.grpc.v1.events\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.522941 env[1524]: time="2025-03-17T18:48:15.522819942Z" level=info msg="loading plugin \"io.containerd.grpc.v1.healthcheck\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.522941 env[1524]: time="2025-03-17T18:48:15.522838642Z" level=info msg="loading plugin \"io.containerd.grpc.v1.images\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.522941 env[1524]: time="2025-03-17T18:48:15.522857241Z" level=info msg="loading plugin \"io.containerd.grpc.v1.leases\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.522941 env[1524]: time="2025-03-17T18:48:15.522874841Z" level=info msg="loading plugin \"io.containerd.grpc.v1.namespaces\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.522941 env[1524]: time="2025-03-17T18:48:15.522893141Z" level=info msg="loading plugin \"io.containerd.internal.v1.opt\"..." type=io.containerd.internal.v1 Mar 17 18:48:15.523126 env[1524]: time="2025-03-17T18:48:15.523093138Z" level=info msg="loading plugin \"io.containerd.grpc.v1.snapshots\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.523126 env[1524]: time="2025-03-17T18:48:15.523116838Z" level=info msg="loading plugin \"io.containerd.grpc.v1.tasks\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.523202 env[1524]: time="2025-03-17T18:48:15.523150637Z" level=info msg="loading plugin \"io.containerd.grpc.v1.version\"..." type=io.containerd.grpc.v1 Mar 17 18:48:15.523202 env[1524]: time="2025-03-17T18:48:15.523169337Z" level=info msg="loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." type=io.containerd.tracing.processor.v1 Mar 17 18:48:15.523202 env[1524]: time="2025-03-17T18:48:15.523190537Z" level=info msg="skip loading plugin \"io.containerd.tracing.processor.v1.otlp\"..." error="no OpenTelemetry endpoint: skip plugin" type=io.containerd.tracing.processor.v1 Mar 17 18:48:15.523305 env[1524]: time="2025-03-17T18:48:15.523209236Z" level=info msg="loading plugin \"io.containerd.internal.v1.tracing\"..." type=io.containerd.internal.v1 Mar 17 18:48:15.523305 env[1524]: time="2025-03-17T18:48:15.523245736Z" level=error msg="failed to initialize a tracing processor \"otlp\"" error="no OpenTelemetry endpoint: skip plugin" Mar 17 18:48:15.523305 env[1524]: time="2025-03-17T18:48:15.523287835Z" level=info msg="loading plugin \"io.containerd.grpc.v1.cri\"..." 
type=io.containerd.grpc.v1 Mar 17 18:48:15.523706 env[1524]: time="2025-03-17T18:48:15.523603831Z" level=info msg="Start cri plugin with config {PluginConfig:{ContainerdConfig:{Snapshotter:overlayfs DefaultRuntimeName:runc DefaultRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} UntrustedWorkloadRuntime:{Type: Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0} Runtimes:map[runc:{Type:io.containerd.runc.v2 Path: Engine: PodAnnotations:[] ContainerAnnotations:[] Root: Options:map[SystemdCgroup:false] PrivilegedWithoutHostDevices:false BaseRuntimeSpec: NetworkPluginConfDir: NetworkPluginMaxConfNum:0}] NoPivot:false DisableSnapshotAnnotations:true DiscardUnpackedLayers:false IgnoreRdtNotEnabledErrors:false} CniConfig:{NetworkPluginBinDir:/opt/cni/bin NetworkPluginConfDir:/etc/cni/net.d NetworkPluginMaxConfNum:1 NetworkPluginConfTemplate: IPPreference:} Registry:{ConfigPath: Mirrors:map[] Configs:map[] Auths:map[] Headers:map[]} ImageDecryption:{KeyModel:node} DisableTCPService:true StreamServerAddress:127.0.0.1 StreamServerPort:0 StreamIdleTimeout:4h0m0s EnableSelinux:false SelinuxCategoryRange:1024 SandboxImage:registry.k8s.io/pause:3.6 StatsCollectPeriod:10 SystemdCgroup:false EnableTLSStreaming:false X509KeyPairStreaming:{TLSCertFile: TLSKeyFile:} MaxContainerLogLineSize:16384 DisableCgroup:false DisableApparmor:false RestrictOOMScoreAdj:false MaxConcurrentDownloads:3 DisableProcMount:false UnsetSeccompProfile: TolerateMissingHugetlbController:true DisableHugetlbController:true DeviceOwnershipFromSecurityContext:false IgnoreImageDefinedVolumes:false NetNSMountsUnderStateDir:false EnableUnprivilegedPorts:false EnableUnprivilegedICMP:false} ContainerdRootDir:/var/lib/containerd ContainerdEndpoint:/run/containerd/containerd.sock RootDir:/var/lib/containerd/io.containerd.grpc.v1.cri StateDir:/run/containerd/io.containerd.grpc.v1.cri}" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.523721229Z" level=info msg="Connect containerd service" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.523784029Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\"" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.524540818Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.524857314Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.524907213Z" level=info msg=serving... 
address=/run/containerd/containerd.sock Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.527722874Z" level=info msg="Start subscribing containerd event" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.527777873Z" level=info msg="Start recovering state" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.527850372Z" level=info msg="Start event monitor" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.527865672Z" level=info msg="Start snapshots syncer" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.527878272Z" level=info msg="Start cni network conf syncer for default" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.527887872Z" level=info msg="Start streaming server" Mar 17 18:48:15.560264 env[1524]: time="2025-03-17T18:48:15.528105069Z" level=info msg="containerd successfully booted in 0.217057s" Mar 17 18:48:15.525068 systemd[1]: Started containerd.service. Mar 17 18:48:16.054637 update_engine[1502]: I0317 18:48:16.040513 1502 main.cc:92] Flatcar Update Engine starting Mar 17 18:48:16.146656 systemd[1]: Started update-engine.service. Mar 17 18:48:16.154856 update_engine[1502]: I0317 18:48:16.146739 1502 update_check_scheduler.cc:74] Next update check in 7m20s Mar 17 18:48:16.151877 systemd[1]: Started locksmithd.service. Mar 17 18:48:16.212773 tar[1509]: linux-amd64/LICENSE Mar 17 18:48:16.213014 tar[1509]: linux-amd64/README.md Mar 17 18:48:16.218918 systemd[1]: Finished prepare-helm.service. Mar 17 18:48:16.596911 systemd[1]: Started kubelet.service. Mar 17 18:48:17.387512 kubelet[1605]: E0317 18:48:17.387458 1605 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:48:17.388934 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:48:17.389129 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:48:17.719149 locksmithd[1597]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Mar 17 18:48:18.125133 sshd_keygen[1510]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Mar 17 18:48:18.145502 systemd[1]: Finished sshd-keygen.service. Mar 17 18:48:18.150221 systemd[1]: Starting issuegen.service... Mar 17 18:48:18.154001 systemd[1]: Started waagent.service. Mar 17 18:48:18.157887 systemd[1]: issuegen.service: Deactivated successfully. Mar 17 18:48:18.158203 systemd[1]: Finished issuegen.service. Mar 17 18:48:18.163204 systemd[1]: Starting systemd-user-sessions.service... Mar 17 18:48:18.185510 systemd[1]: Finished systemd-user-sessions.service. Mar 17 18:48:18.190350 systemd[1]: Started getty@tty1.service. Mar 17 18:48:18.194787 systemd[1]: Started serial-getty@ttyS0.service. Mar 17 18:48:18.197610 systemd[1]: Reached target getty.target. Mar 17 18:48:18.199841 systemd[1]: Reached target multi-user.target. Mar 17 18:48:18.203563 systemd[1]: Starting systemd-update-utmp-runlevel.service... Mar 17 18:48:18.211826 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully. Mar 17 18:48:18.212101 systemd[1]: Finished systemd-update-utmp-runlevel.service. Mar 17 18:48:18.215633 systemd[1]: Startup finished in 777ms (firmware) + 29.203s (loader) + 15.415s (kernel) + 25.655s (userspace) = 1min 11.050s. 
Mar 17 18:48:18.631325 login[1635]: pam_lastlog(login:session): file /var/log/lastlog is locked/write Mar 17 18:48:18.632175 login[1634]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 17 18:48:18.687417 systemd[1]: Created slice user-500.slice. Mar 17 18:48:18.689183 systemd[1]: Starting user-runtime-dir@500.service... Mar 17 18:48:18.692513 systemd-logind[1500]: New session 2 of user core. Mar 17 18:48:18.702976 systemd[1]: Finished user-runtime-dir@500.service. Mar 17 18:48:18.705441 systemd[1]: Starting user@500.service... Mar 17 18:48:18.760969 (systemd)[1641]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:48:18.949906 systemd[1641]: Queued start job for default target default.target. Mar 17 18:48:18.950525 systemd[1641]: Reached target paths.target. Mar 17 18:48:18.950701 systemd[1641]: Reached target sockets.target. Mar 17 18:48:18.950850 systemd[1641]: Reached target timers.target. Mar 17 18:48:18.950879 systemd[1641]: Reached target basic.target. Mar 17 18:48:18.951066 systemd[1]: Started user@500.service. Mar 17 18:48:18.951198 systemd[1641]: Reached target default.target. Mar 17 18:48:18.951356 systemd[1641]: Startup finished in 183ms. Mar 17 18:48:18.952720 systemd[1]: Started session-2.scope. Mar 17 18:48:19.633719 login[1635]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) Mar 17 18:48:19.637647 systemd-logind[1500]: New session 1 of user core. Mar 17 18:48:19.638873 systemd[1]: Started session-1.scope. Mar 17 18:48:25.054577 waagent[1627]: 2025-03-17T18:48:25.054459Z INFO Daemon Daemon Azure Linux Agent Version:2.6.0.2 Mar 17 18:48:25.066369 waagent[1627]: 2025-03-17T18:48:25.056640Z INFO Daemon Daemon OS: flatcar 3510.3.7 Mar 17 18:48:25.066369 waagent[1627]: 2025-03-17T18:48:25.057367Z INFO Daemon Daemon Python: 3.9.16 Mar 17 18:48:25.066369 waagent[1627]: 2025-03-17T18:48:25.058378Z INFO Daemon Daemon Run daemon Mar 17 18:48:25.066369 waagent[1627]: 2025-03-17T18:48:25.059520Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='3510.3.7' Mar 17 18:48:25.070915 waagent[1627]: 2025-03-17T18:48:25.070796Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 1. 
Mar 17 18:48:25.078403 waagent[1627]: 2025-03-17T18:48:25.078292Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Mar 17 18:48:25.083042 waagent[1627]: 2025-03-17T18:48:25.082975Z INFO Daemon Daemon cloud-init is enabled: False Mar 17 18:48:25.092119 waagent[1627]: 2025-03-17T18:48:25.083978Z INFO Daemon Daemon Using waagent for provisioning Mar 17 18:48:25.092119 waagent[1627]: 2025-03-17T18:48:25.085280Z INFO Daemon Daemon Activate resource disk Mar 17 18:48:25.092119 waagent[1627]: 2025-03-17T18:48:25.086027Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb Mar 17 18:48:25.093692 waagent[1627]: 2025-03-17T18:48:25.093617Z INFO Daemon Daemon Found device: None Mar 17 18:48:25.107655 waagent[1627]: 2025-03-17T18:48:25.095259Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology Mar 17 18:48:25.107655 waagent[1627]: 2025-03-17T18:48:25.096641Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 Mar 17 18:48:25.107655 waagent[1627]: 2025-03-17T18:48:25.098384Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 17 18:48:25.107655 waagent[1627]: 2025-03-17T18:48:25.099103Z INFO Daemon Daemon Running default provisioning handler Mar 17 18:48:25.108612 waagent[1627]: 2025-03-17T18:48:25.108496Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 1. Mar 17 18:48:25.115958 waagent[1627]: 2025-03-17T18:48:25.115856Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' Mar 17 18:48:25.120191 waagent[1627]: 2025-03-17T18:48:25.120129Z INFO Daemon Daemon cloud-init is enabled: False Mar 17 18:48:25.122567 waagent[1627]: 2025-03-17T18:48:25.122507Z INFO Daemon Daemon Copying ovf-env.xml Mar 17 18:48:25.269225 waagent[1627]: 2025-03-17T18:48:25.268226Z INFO Daemon Daemon Successfully mounted dvd Mar 17 18:48:25.384917 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. Mar 17 18:48:25.405710 waagent[1627]: 2025-03-17T18:48:25.405548Z INFO Daemon Daemon Detect protocol endpoint Mar 17 18:48:25.420122 waagent[1627]: 2025-03-17T18:48:25.407258Z INFO Daemon Daemon Clean protocol and wireserver endpoint Mar 17 18:48:25.420122 waagent[1627]: 2025-03-17T18:48:25.408255Z INFO Daemon Daemon WireServer endpoint is not found. 
Rerun dhcp handler Mar 17 18:48:25.420122 waagent[1627]: 2025-03-17T18:48:25.409162Z INFO Daemon Daemon Test for route to 168.63.129.16 Mar 17 18:48:25.420122 waagent[1627]: 2025-03-17T18:48:25.410405Z INFO Daemon Daemon Route to 168.63.129.16 exists Mar 17 18:48:25.420122 waagent[1627]: 2025-03-17T18:48:25.410749Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 Mar 17 18:48:25.529071 waagent[1627]: 2025-03-17T18:48:25.528999Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 Mar 17 18:48:25.536243 waagent[1627]: 2025-03-17T18:48:25.530843Z INFO Daemon Daemon Wire protocol version:2012-11-30 Mar 17 18:48:25.536243 waagent[1627]: 2025-03-17T18:48:25.531494Z INFO Daemon Daemon Server preferred version:2015-04-05 Mar 17 18:48:25.885364 waagent[1627]: 2025-03-17T18:48:25.885210Z INFO Daemon Daemon Initializing goal state during protocol detection Mar 17 18:48:25.895645 waagent[1627]: 2025-03-17T18:48:25.895560Z INFO Daemon Daemon Forcing an update of the goal state.. Mar 17 18:48:25.900626 waagent[1627]: 2025-03-17T18:48:25.896829Z INFO Daemon Daemon Fetching goal state [incarnation 1] Mar 17 18:48:25.987824 waagent[1627]: 2025-03-17T18:48:25.987657Z INFO Daemon Daemon Found private key matching thumbprint 21941C2D807910C8B6B07F8B0E9CB0880C5FCDFF Mar 17 18:48:25.997872 waagent[1627]: 2025-03-17T18:48:25.989388Z INFO Daemon Daemon Certificate with thumbprint 4016A80D0561C251388267BC8A44866F24A05316 has no matching private key. Mar 17 18:48:25.997872 waagent[1627]: 2025-03-17T18:48:25.990494Z INFO Daemon Daemon Fetch goal state completed Mar 17 18:48:26.038935 waagent[1627]: 2025-03-17T18:48:26.038847Z INFO Daemon Daemon Fetched new vmSettings [correlation ID: 654ada4e-ca17-4736-9062-5ca74368e015 New eTag: 13548318903217794960] Mar 17 18:48:26.042954 waagent[1627]: 2025-03-17T18:48:26.042865Z INFO Daemon Daemon Status Blob type 'None' is not valid, assuming BlockBlob Mar 17 18:48:26.054320 waagent[1627]: 2025-03-17T18:48:26.054249Z INFO Daemon Daemon Starting provisioning Mar 17 18:48:26.060584 waagent[1627]: 2025-03-17T18:48:26.055588Z INFO Daemon Daemon Handle ovf-env.xml. Mar 17 18:48:26.060584 waagent[1627]: 2025-03-17T18:48:26.056414Z INFO Daemon Daemon Set hostname [ci-3510.3.7-a-b312ad98ee] Mar 17 18:48:26.098629 waagent[1627]: 2025-03-17T18:48:26.098470Z INFO Daemon Daemon Publish hostname [ci-3510.3.7-a-b312ad98ee] Mar 17 18:48:26.105861 waagent[1627]: 2025-03-17T18:48:26.100299Z INFO Daemon Daemon Examine /proc/net/route for primary interface Mar 17 18:48:26.105861 waagent[1627]: 2025-03-17T18:48:26.101501Z INFO Daemon Daemon Primary interface is [eth0] Mar 17 18:48:26.115701 systemd[1]: systemd-networkd-wait-online.service: Deactivated successfully. Mar 17 18:48:26.116018 systemd[1]: Stopped systemd-networkd-wait-online.service. Mar 17 18:48:26.116090 systemd[1]: Stopping systemd-networkd-wait-online.service... Mar 17 18:48:26.116382 systemd[1]: Stopping systemd-networkd.service... Mar 17 18:48:26.120714 systemd-networkd[1226]: eth0: DHCPv6 lease lost Mar 17 18:48:26.122234 systemd[1]: systemd-networkd.service: Deactivated successfully. Mar 17 18:48:26.122428 systemd[1]: Stopped systemd-networkd.service. Mar 17 18:48:26.125088 systemd[1]: Starting systemd-networkd.service... 
Mar 17 18:48:26.163079 systemd-networkd[1687]: enP32323s1: Link UP Mar 17 18:48:26.163092 systemd-networkd[1687]: enP32323s1: Gained carrier Mar 17 18:48:26.164438 systemd-networkd[1687]: eth0: Link UP Mar 17 18:48:26.164448 systemd-networkd[1687]: eth0: Gained carrier Mar 17 18:48:26.164891 systemd-networkd[1687]: lo: Link UP Mar 17 18:48:26.164899 systemd-networkd[1687]: lo: Gained carrier Mar 17 18:48:26.165215 systemd-networkd[1687]: eth0: Gained IPv6LL Mar 17 18:48:26.165494 systemd-networkd[1687]: Enumeration completed Mar 17 18:48:26.165649 systemd[1]: Started systemd-networkd.service. Mar 17 18:48:26.168235 systemd[1]: Starting systemd-networkd-wait-online.service... Mar 17 18:48:26.170867 waagent[1627]: 2025-03-17T18:48:26.170695Z INFO Daemon Daemon Create user account if not exists Mar 17 18:48:26.171814 systemd-networkd[1687]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Mar 17 18:48:26.175629 waagent[1627]: 2025-03-17T18:48:26.175531Z INFO Daemon Daemon User core already exists, skip useradd Mar 17 18:48:26.178575 waagent[1627]: 2025-03-17T18:48:26.178489Z INFO Daemon Daemon Configure sudoer Mar 17 18:48:26.181379 waagent[1627]: 2025-03-17T18:48:26.181298Z INFO Daemon Daemon Configure sshd Mar 17 18:48:26.183583 waagent[1627]: 2025-03-17T18:48:26.183518Z INFO Daemon Daemon Deploy ssh public key. Mar 17 18:48:26.209740 systemd-networkd[1687]: eth0: DHCPv4 address 10.200.8.36/24, gateway 10.200.8.1 acquired from 168.63.129.16 Mar 17 18:48:26.212897 systemd[1]: Finished systemd-networkd-wait-online.service. Mar 17 18:48:27.330414 waagent[1627]: 2025-03-17T18:48:27.330314Z INFO Daemon Daemon Provisioning complete Mar 17 18:48:27.345928 waagent[1627]: 2025-03-17T18:48:27.345852Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping Mar 17 18:48:27.348778 waagent[1627]: 2025-03-17T18:48:27.348712Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. Mar 17 18:48:27.353611 waagent[1627]: 2025-03-17T18:48:27.353543Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.6.0.2 is the most current agent Mar 17 18:48:27.495215 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Mar 17 18:48:27.495478 systemd[1]: Stopped kubelet.service. Mar 17 18:48:27.497495 systemd[1]: Starting kubelet.service... Mar 17 18:48:27.622739 systemd[1]: Started kubelet.service. Mar 17 18:48:27.673967 waagent[1697]: 2025-03-17T18:48:27.673862Z INFO ExtHandler ExtHandler Agent WALinuxAgent-2.6.0.2 is running as the goal state agent Mar 17 18:48:27.674766 waagent[1697]: 2025-03-17T18:48:27.674699Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 17 18:48:27.674918 waagent[1697]: 2025-03-17T18:48:27.674868Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 17 18:48:27.686160 waagent[1697]: 2025-03-17T18:48:27.686093Z INFO ExtHandler ExtHandler Forcing an update of the goal state.. Mar 17 18:48:27.686328 waagent[1697]: 2025-03-17T18:48:27.686277Z INFO ExtHandler ExtHandler Fetching goal state [incarnation 1] Mar 17 18:48:27.747821 waagent[1697]: 2025-03-17T18:48:27.747690Z INFO ExtHandler ExtHandler Found private key matching thumbprint 21941C2D807910C8B6B07F8B0E9CB0880C5FCDFF Mar 17 18:48:27.748057 waagent[1697]: 2025-03-17T18:48:27.747994Z INFO ExtHandler ExtHandler Certificate with thumbprint 4016A80D0561C251388267BC8A44866F24A05316 has no matching private key. 
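The "Found private key matching thumbprint ..." / "has no matching private key" lines come from comparing certificate thumbprints against the keys delivered with the goal state. A thumbprint of the form shown (40 hex characters) is the SHA-1 of the DER-encoded certificate; a small stdlib-only sketch, with an illustrative file path:

    import hashlib
    import ssl

    def cert_thumbprint(pem_path):
        """SHA-1 over the DER form of a single PEM certificate, upper-case hex (40 chars)."""
        with open(pem_path) as f:
            der = ssl.PEM_cert_to_DER_cert(f.read())
        return hashlib.sha1(der).hexdigest().upper()

    # compare against the thumbprints logged above
    print(cert_thumbprint("/tmp/example-cert.pem"))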
Mar 17 18:48:27.748290 waagent[1697]: 2025-03-17T18:48:27.748240Z INFO ExtHandler ExtHandler Fetch goal state completed Mar 17 18:48:27.762565 waagent[1697]: 2025-03-17T18:48:27.762501Z INFO ExtHandler ExtHandler Fetched new vmSettings [correlation ID: 35772c33-677a-44ec-8000-ceaa13cfe55c New eTag: 13548318903217794960] Mar 17 18:48:27.763116 waagent[1697]: 2025-03-17T18:48:27.763056Z INFO ExtHandler ExtHandler Status Blob type 'None' is not valid, assuming BlockBlob Mar 17 18:48:28.215772 kubelet[1707]: E0317 18:48:28.215727 1707 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:48:28.218978 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:48:28.219161 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:48:28.233030 waagent[1697]: 2025-03-17T18:48:28.232886Z INFO ExtHandler ExtHandler Distro: flatcar-3510.3.7; OSUtil: CoreOSUtil; AgentService: waagent; Python: 3.9.16; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1; Mar 17 18:48:28.241908 waagent[1697]: 2025-03-17T18:48:28.241827Z INFO ExtHandler ExtHandler WALinuxAgent-2.6.0.2 running as process 1697 Mar 17 18:48:28.245332 waagent[1697]: 2025-03-17T18:48:28.245258Z INFO ExtHandler ExtHandler Cgroup monitoring is not supported on ['flatcar', '3510.3.7', '', 'Flatcar Container Linux by Kinvolk'] Mar 17 18:48:28.246600 waagent[1697]: 2025-03-17T18:48:28.246539Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Mar 17 18:48:28.344098 waagent[1697]: 2025-03-17T18:48:28.344032Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Mar 17 18:48:28.344506 waagent[1697]: 2025-03-17T18:48:28.344441Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Mar 17 18:48:28.352615 waagent[1697]: 2025-03-17T18:48:28.352557Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Mar 17 18:48:28.353127 waagent[1697]: 2025-03-17T18:48:28.353065Z ERROR ExtHandler ExtHandler Unable to setup the persistent firewall rules: [Errno 30] Read-only file system: '/lib/systemd/system/waagent-network-setup.service' Mar 17 18:48:28.354215 waagent[1697]: 2025-03-17T18:48:28.354148Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: configuration enabled [False], cgroups enabled [False], python supported: [True] Mar 17 18:48:28.355503 waagent[1697]: 2025-03-17T18:48:28.355444Z INFO ExtHandler ExtHandler Starting env monitor service. Mar 17 18:48:28.355977 waagent[1697]: 2025-03-17T18:48:28.355920Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 17 18:48:28.356133 waagent[1697]: 2025-03-17T18:48:28.356085Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 17 18:48:28.356646 waagent[1697]: 2025-03-17T18:48:28.356589Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. 
Mar 17 18:48:28.357029 waagent[1697]: 2025-03-17T18:48:28.356969Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Mar 17 18:48:28.357029 waagent[1697]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Mar 17 18:48:28.357029 waagent[1697]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 Mar 17 18:48:28.357029 waagent[1697]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Mar 17 18:48:28.357029 waagent[1697]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Mar 17 18:48:28.357029 waagent[1697]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 17 18:48:28.357029 waagent[1697]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 17 18:48:28.359818 waagent[1697]: 2025-03-17T18:48:28.359608Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 17 18:48:28.360030 waagent[1697]: 2025-03-17T18:48:28.359966Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Mar 17 18:48:28.360834 waagent[1697]: 2025-03-17T18:48:28.360770Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Mar 17 18:48:28.361064 waagent[1697]: 2025-03-17T18:48:28.361010Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 17 18:48:28.361909 waagent[1697]: 2025-03-17T18:48:28.361851Z INFO EnvHandler ExtHandler Configure routes Mar 17 18:48:28.362094 waagent[1697]: 2025-03-17T18:48:28.362040Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Mar 17 18:48:28.362281 waagent[1697]: 2025-03-17T18:48:28.362229Z INFO EnvHandler ExtHandler Gateway:None Mar 17 18:48:28.362534 waagent[1697]: 2025-03-17T18:48:28.362485Z INFO EnvHandler ExtHandler Routes:None Mar 17 18:48:28.363377 waagent[1697]: 2025-03-17T18:48:28.363315Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Mar 17 18:48:28.363753 waagent[1697]: 2025-03-17T18:48:28.363662Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. Mar 17 18:48:28.366524 waagent[1697]: 2025-03-17T18:48:28.366471Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Mar 17 18:48:28.376572 waagent[1697]: 2025-03-17T18:48:28.376516Z INFO ExtHandler ExtHandler Checking for agent updates (family: Prod) Mar 17 18:48:28.377142 waagent[1697]: 2025-03-17T18:48:28.377094Z WARNING ExtHandler ExtHandler Fetch failed: [HttpError] HTTPS is unavailable and required Mar 17 18:48:28.377954 waagent[1697]: 2025-03-17T18:48:28.377896Z INFO ExtHandler ExtHandler [PERIODIC] Request failed using the direct channel. Error: 'NoneType' object has no attribute 'getheaders' Mar 17 18:48:28.403001 waagent[1697]: 2025-03-17T18:48:28.402879Z ERROR EnvHandler ExtHandler Failed to get the PID of the DHCP client: invalid literal for int() with base 10: 'MainPID=1687' Mar 17 18:48:28.420387 waagent[1697]: 2025-03-17T18:48:28.420309Z INFO ExtHandler ExtHandler Default channel changed to HostGA channel. 
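The routing-table dump above is the raw /proc/net/route format: Destination, Gateway, and Mask are little-endian IPv4 addresses in hex, so 0108C80A is 10.200.8.1 and 10813FA8 is 168.63.129.16, the Azure wire server the daemon tested a route for earlier. A short sketch that decodes those columns, assuming the same file layout:

    import socket
    import struct

    def parse_proc_net_route(path="/proc/net/route"):
        """Yield (iface, destination, gateway) with the hex fields decoded to dotted quads."""
        with open(path) as f:
            next(f)  # skip the header line
            for line in f:
                fields = line.split()
                iface, dest_hex, gw_hex = fields[0], fields[1], fields[2]
                # values are little-endian 32-bit addresses, e.g. 10813FA8 -> 168.63.129.16
                dest = socket.inet_ntoa(struct.pack("<I", int(dest_hex, 16)))
                gw = socket.inet_ntoa(struct.pack("<I", int(gw_hex, 16)))
                yield iface, dest, gw

    if __name__ == "__main__":
        for iface, dest, gw in parse_proc_net_route():
            print(f"{iface:6} {dest:15} via {gw}")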
Mar 17 18:48:28.502877 waagent[1697]: 2025-03-17T18:48:28.502763Z INFO MonitorHandler ExtHandler Network interfaces: Mar 17 18:48:28.502877 waagent[1697]: Executing ['ip', '-a', '-o', 'link']: Mar 17 18:48:28.502877 waagent[1697]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Mar 17 18:48:28.502877 waagent[1697]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:37:d8:af brd ff:ff:ff:ff:ff:ff Mar 17 18:48:28.502877 waagent[1697]: 3: enP32323s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:37:d8:af brd ff:ff:ff:ff:ff:ff\ altname enP32323p0s2 Mar 17 18:48:28.502877 waagent[1697]: Executing ['ip', '-4', '-a', '-o', 'address']: Mar 17 18:48:28.502877 waagent[1697]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Mar 17 18:48:28.502877 waagent[1697]: 2: eth0 inet 10.200.8.36/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever Mar 17 18:48:28.502877 waagent[1697]: Executing ['ip', '-6', '-a', '-o', 'address']: Mar 17 18:48:28.502877 waagent[1697]: 1: lo inet6 ::1/128 scope host \ valid_lft forever preferred_lft forever Mar 17 18:48:28.502877 waagent[1697]: 2: eth0 inet6 fe80::7e1e:52ff:fe37:d8af/64 scope link \ valid_lft forever preferred_lft forever Mar 17 18:48:28.593464 waagent[1697]: 2025-03-17T18:48:28.593393Z INFO ExtHandler ExtHandler Agent WALinuxAgent-2.6.0.2 discovered update WALinuxAgent-2.12.0.2 -- exiting Mar 17 18:48:29.358523 waagent[1627]: 2025-03-17T18:48:29.358195Z INFO Daemon Daemon Agent WALinuxAgent-2.6.0.2 launched with command '/usr/share/oem/python/bin/python -u /usr/share/oem/bin/waagent -run-exthandlers' is successfully running Mar 17 18:48:29.362858 waagent[1627]: 2025-03-17T18:48:29.362792Z INFO Daemon Daemon Determined Agent WALinuxAgent-2.12.0.2 to be the latest agent Mar 17 18:48:30.433106 waagent[1747]: 2025-03-17T18:48:30.432995Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.2) Mar 17 18:48:30.433862 waagent[1747]: 2025-03-17T18:48:30.433794Z INFO ExtHandler ExtHandler OS: flatcar 3510.3.7 Mar 17 18:48:30.434009 waagent[1747]: 2025-03-17T18:48:30.433956Z INFO ExtHandler ExtHandler Python: 3.9.16 Mar 17 18:48:30.434153 waagent[1747]: 2025-03-17T18:48:30.434107Z INFO ExtHandler ExtHandler CPU Arch: x86_64 Mar 17 18:48:30.443798 waagent[1747]: 2025-03-17T18:48:30.443682Z INFO ExtHandler ExtHandler Distro: flatcar-3510.3.7; OSUtil: CoreOSUtil; AgentService: waagent; Python: 3.9.16; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.20.1; Mar 17 18:48:30.444205 waagent[1747]: 2025-03-17T18:48:30.444143Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 17 18:48:30.444369 waagent[1747]: 2025-03-17T18:48:30.444320Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 17 18:48:30.456488 waagent[1747]: 2025-03-17T18:48:30.456407Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 1] Mar 17 18:48:30.465910 waagent[1747]: 2025-03-17T18:48:30.465844Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164 Mar 17 18:48:30.466889 waagent[1747]: 2025-03-17T18:48:30.466831Z INFO ExtHandler Mar 17 18:48:30.467042 waagent[1747]: 2025-03-17T18:48:30.466990Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: ca825419-83ca-40ce-ab5a-a2bc5a9f095b eTag: 13548318903217794960 source: 
Fabric] Mar 17 18:48:30.467748 waagent[1747]: 2025-03-17T18:48:30.467693Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. Mar 17 18:48:30.468832 waagent[1747]: 2025-03-17T18:48:30.468775Z INFO ExtHandler Mar 17 18:48:30.468971 waagent[1747]: 2025-03-17T18:48:30.468921Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] Mar 17 18:48:30.475834 waagent[1747]: 2025-03-17T18:48:30.475781Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 17 18:48:30.476287 waagent[1747]: 2025-03-17T18:48:30.476239Z WARNING ExtHandler ExtHandler Fetch failed: [HttpError] HTTPS is unavailable and required Mar 17 18:48:30.495340 waagent[1747]: 2025-03-17T18:48:30.495270Z INFO ExtHandler ExtHandler Default channel changed to HostGAPlugin channel. Mar 17 18:48:30.562697 waagent[1747]: 2025-03-17T18:48:30.562543Z INFO ExtHandler Downloaded certificate {'thumbprint': '4016A80D0561C251388267BC8A44866F24A05316', 'hasPrivateKey': False} Mar 17 18:48:30.563728 waagent[1747]: 2025-03-17T18:48:30.563645Z INFO ExtHandler Downloaded certificate {'thumbprint': '21941C2D807910C8B6B07F8B0E9CB0880C5FCDFF', 'hasPrivateKey': True} Mar 17 18:48:30.564691 waagent[1747]: 2025-03-17T18:48:30.564628Z INFO ExtHandler Fetch goal state completed Mar 17 18:48:30.585489 waagent[1747]: 2025-03-17T18:48:30.585379Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.0.15 3 Sep 2024 (Library: OpenSSL 3.0.15 3 Sep 2024) Mar 17 18:48:30.597244 waagent[1747]: 2025-03-17T18:48:30.597144Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.2 running as process 1747 Mar 17 18:48:30.600248 waagent[1747]: 2025-03-17T18:48:30.600178Z INFO ExtHandler ExtHandler [CGI] Cgroup monitoring is not supported on ['flatcar', '3510.3.7', '', 'Flatcar Container Linux by Kinvolk'] Mar 17 18:48:30.601259 waagent[1747]: 2025-03-17T18:48:30.601197Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '3510.3.7', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported Mar 17 18:48:30.601581 waagent[1747]: 2025-03-17T18:48:30.601501Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False Mar 17 18:48:30.603450 waagent[1747]: 2025-03-17T18:48:30.603393Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules Mar 17 18:48:30.608442 waagent[1747]: 2025-03-17T18:48:30.608389Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service Mar 17 18:48:30.608813 waagent[1747]: 2025-03-17T18:48:30.608758Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup Mar 17 18:48:30.616843 waagent[1747]: 2025-03-17T18:48:30.616791Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now Mar 17 18:48:30.617284 waagent[1747]: 2025-03-17T18:48:30.617229Z ERROR ExtHandler ExtHandler Unable to setup the persistent firewall rules: [Errno 30] Read-only file system: '/lib/systemd/system/waagent-network-setup.service' Mar 17 18:48:30.623033 waagent[1747]: 2025-03-17T18:48:30.622944Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. Mar 17 18:48:30.623987 waagent[1747]: 2025-03-17T18:48:30.623924Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. 
cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] Mar 17 18:48:30.625367 waagent[1747]: 2025-03-17T18:48:30.625309Z INFO ExtHandler ExtHandler Starting env monitor service. Mar 17 18:48:30.626208 waagent[1747]: 2025-03-17T18:48:30.626152Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 17 18:48:30.626373 waagent[1747]: 2025-03-17T18:48:30.626326Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 17 18:48:30.626923 waagent[1747]: 2025-03-17T18:48:30.626867Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. Mar 17 18:48:30.627284 waagent[1747]: 2025-03-17T18:48:30.627230Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. Mar 17 18:48:30.627887 waagent[1747]: 2025-03-17T18:48:30.627836Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file Mar 17 18:48:30.628396 waagent[1747]: 2025-03-17T18:48:30.628343Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: Mar 17 18:48:30.628396 waagent[1747]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT Mar 17 18:48:30.628396 waagent[1747]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 Mar 17 18:48:30.628396 waagent[1747]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 Mar 17 18:48:30.628396 waagent[1747]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 Mar 17 18:48:30.628396 waagent[1747]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 17 18:48:30.628396 waagent[1747]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 Mar 17 18:48:30.628704 waagent[1747]: 2025-03-17T18:48:30.628492Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 Mar 17 18:48:30.628967 waagent[1747]: 2025-03-17T18:48:30.628911Z INFO EnvHandler ExtHandler Configure routes Mar 17 18:48:30.629174 waagent[1747]: 2025-03-17T18:48:30.629125Z INFO EnvHandler ExtHandler Gateway:None Mar 17 18:48:30.631226 waagent[1747]: 2025-03-17T18:48:30.631117Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread Mar 17 18:48:30.631953 waagent[1747]: 2025-03-17T18:48:30.631871Z INFO ExtHandler ExtHandler Start Extension Telemetry service. Mar 17 18:48:30.632165 waagent[1747]: 2025-03-17T18:48:30.632111Z INFO EnvHandler ExtHandler Routes:None Mar 17 18:48:30.638044 waagent[1747]: 2025-03-17T18:48:30.637780Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True Mar 17 18:48:30.638287 waagent[1747]: 2025-03-17T18:48:30.638185Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. 
Mar 17 18:48:30.639328 waagent[1747]: 2025-03-17T18:48:30.639256Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread Mar 17 18:48:30.655769 waagent[1747]: 2025-03-17T18:48:30.655700Z INFO MonitorHandler ExtHandler Network interfaces: Mar 17 18:48:30.655769 waagent[1747]: Executing ['ip', '-a', '-o', 'link']: Mar 17 18:48:30.655769 waagent[1747]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 Mar 17 18:48:30.655769 waagent[1747]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:37:d8:af brd ff:ff:ff:ff:ff:ff Mar 17 18:48:30.655769 waagent[1747]: 3: enP32323s1: mtu 1500 qdisc mq master eth0 state UP mode DEFAULT group default qlen 1000\ link/ether 7c:1e:52:37:d8:af brd ff:ff:ff:ff:ff:ff\ altname enP32323p0s2 Mar 17 18:48:30.655769 waagent[1747]: Executing ['ip', '-4', '-a', '-o', 'address']: Mar 17 18:48:30.655769 waagent[1747]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever Mar 17 18:48:30.655769 waagent[1747]: 2: eth0 inet 10.200.8.36/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever Mar 17 18:48:30.655769 waagent[1747]: Executing ['ip', '-6', '-a', '-o', 'address']: Mar 17 18:48:30.655769 waagent[1747]: 1: lo inet6 ::1/128 scope host \ valid_lft forever preferred_lft forever Mar 17 18:48:30.655769 waagent[1747]: 2: eth0 inet6 fe80::7e1e:52ff:fe37:d8af/64 scope link \ valid_lft forever preferred_lft forever Mar 17 18:48:30.657018 waagent[1747]: 2025-03-17T18:48:30.656962Z INFO ExtHandler ExtHandler Downloading agent manifest Mar 17 18:48:30.680591 waagent[1747]: 2025-03-17T18:48:30.680520Z INFO ExtHandler ExtHandler Mar 17 18:48:30.682090 waagent[1747]: 2025-03-17T18:48:30.682023Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: 0792215f-0ed4-4737-8121-42bb4828fcc8 correlation 4c90e8bf-51cc-4eb8-bebf-4f00b2b45bc7 created: 2025-03-17T18:46:57.747330Z] Mar 17 18:48:30.693210 waagent[1747]: 2025-03-17T18:48:30.693109Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. Mar 17 18:48:30.697523 waagent[1747]: 2025-03-17T18:48:30.697469Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 16 ms] Mar 17 18:48:30.727980 waagent[1747]: 2025-03-17T18:48:30.727908Z INFO ExtHandler ExtHandler Looking for existing remote access users. 
Mar 17 18:48:30.741458 waagent[1747]: 2025-03-17T18:48:30.741389Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.2 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 00789480-5985-477C-A676-02D1C0EF296D;DroppedPackets: 0;UpdateGSErrors: 0;AutoUpdate: 1;UpdateMode: SelfUpdate;] Mar 17 18:48:30.807077 waagent[1747]: 2025-03-17T18:48:30.806961Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric: Mar 17 18:48:30.807077 waagent[1747]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:48:30.807077 waagent[1747]: pkts bytes target prot opt in out source destination Mar 17 18:48:30.807077 waagent[1747]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:48:30.807077 waagent[1747]: pkts bytes target prot opt in out source destination Mar 17 18:48:30.807077 waagent[1747]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:48:30.807077 waagent[1747]: pkts bytes target prot opt in out source destination Mar 17 18:48:30.807077 waagent[1747]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 17 18:48:30.807077 waagent[1747]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 17 18:48:30.807077 waagent[1747]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 17 18:48:30.814105 waagent[1747]: 2025-03-17T18:48:30.814005Z INFO EnvHandler ExtHandler Current Firewall rules: Mar 17 18:48:30.814105 waagent[1747]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:48:30.814105 waagent[1747]: pkts bytes target prot opt in out source destination Mar 17 18:48:30.814105 waagent[1747]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:48:30.814105 waagent[1747]: pkts bytes target prot opt in out source destination Mar 17 18:48:30.814105 waagent[1747]: Chain OUTPUT (policy ACCEPT 0 packets, 0 bytes) Mar 17 18:48:30.814105 waagent[1747]: pkts bytes target prot opt in out source destination Mar 17 18:48:30.814105 waagent[1747]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53 Mar 17 18:48:30.814105 waagent[1747]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0 Mar 17 18:48:30.814105 waagent[1747]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW Mar 17 18:48:30.814641 waagent[1747]: 2025-03-17T18:48:30.814586Z INFO EnvHandler ExtHandler Set block dev timeout: sda with timeout: 300 Mar 17 18:48:38.245118 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Mar 17 18:48:38.245407 systemd[1]: Stopped kubelet.service. Mar 17 18:48:38.247551 systemd[1]: Starting kubelet.service... Mar 17 18:48:38.333998 systemd[1]: Started kubelet.service. Mar 17 18:48:38.973631 kubelet[1805]: E0317 18:48:38.973589 1805 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:48:38.975583 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:48:38.975806 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:48:48.995191 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Mar 17 18:48:48.995490 systemd[1]: Stopped kubelet.service. Mar 17 18:48:48.997644 systemd[1]: Starting kubelet.service... Mar 17 18:48:49.083836 systemd[1]: Started kubelet.service. 
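The "Created firewall rules for the Azure Fabric" table above shows three OUTPUT rules toward 168.63.129.16: accept TCP to port 53, accept traffic owned by UID 0 (the agent itself), and drop other new or invalid connections. A rough Python/iptables sketch of equivalent rules (the exact matches and ordering are inferred from the counters printed above, not taken from the agent's code):

    import subprocess

    WIRESERVER = "168.63.129.16"

    RULES = [
        # allow TCP to port 53 on the wire server
        ["-A", "OUTPUT", "-d", WIRESERVER, "-p", "tcp", "--dport", "53", "-j", "ACCEPT"],
        # allow root-owned traffic (the agent runs as UID 0)
        ["-A", "OUTPUT", "-d", WIRESERVER, "-p", "tcp",
         "-m", "owner", "--uid-owner", "0", "-j", "ACCEPT"],
        # drop any other new or invalid connection attempts
        ["-A", "OUTPUT", "-d", WIRESERVER, "-p", "tcp",
         "-m", "conntrack", "--ctstate", "INVALID,NEW", "-j", "DROP"],
    ]

    for rule in RULES:
        subprocess.run(["iptables", "-w"] + rule, check=True)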
Mar 17 18:48:49.699573 kubelet[1821]: E0317 18:48:49.699523 1821 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:48:49.701050 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:48:49.701257 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:48:54.834250 kernel: hv_balloon: Max. dynamic memory size: 8192 MB Mar 17 18:48:59.745217 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Mar 17 18:48:59.745541 systemd[1]: Stopped kubelet.service. Mar 17 18:48:59.747714 systemd[1]: Starting kubelet.service... Mar 17 18:48:59.834615 systemd[1]: Started kubelet.service. Mar 17 18:49:00.476304 kubelet[1836]: E0317 18:49:00.476257 1836 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:49:00.477932 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:49:00.478067 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:49:01.175954 update_engine[1502]: I0317 18:49:01.175891 1502 update_attempter.cc:509] Updating boot flags... Mar 17 18:49:04.708913 systemd[1]: Created slice system-sshd.slice. Mar 17 18:49:04.710850 systemd[1]: Started sshd@0-10.200.8.36:22-10.200.16.10:39946.service. Mar 17 18:49:05.595632 sshd[1882]: Accepted publickey for core from 10.200.16.10 port 39946 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:49:05.597361 sshd[1882]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:49:05.601828 systemd-logind[1500]: New session 3 of user core. Mar 17 18:49:05.603311 systemd[1]: Started session-3.scope. Mar 17 18:49:06.142641 systemd[1]: Started sshd@1-10.200.8.36:22-10.200.16.10:39954.service. Mar 17 18:49:06.762548 sshd[1887]: Accepted publickey for core from 10.200.16.10 port 39954 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:49:06.763956 sshd[1887]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:49:06.768572 systemd[1]: Started session-4.scope. Mar 17 18:49:06.768872 systemd-logind[1500]: New session 4 of user core. Mar 17 18:49:07.214391 sshd[1887]: pam_unix(sshd:session): session closed for user core Mar 17 18:49:07.218160 systemd[1]: sshd@1-10.200.8.36:22-10.200.16.10:39954.service: Deactivated successfully. Mar 17 18:49:07.219728 systemd[1]: session-4.scope: Deactivated successfully. Mar 17 18:49:07.219753 systemd-logind[1500]: Session 4 logged out. Waiting for processes to exit. Mar 17 18:49:07.221136 systemd-logind[1500]: Removed session 4. Mar 17 18:49:07.317551 systemd[1]: Started sshd@2-10.200.8.36:22-10.200.16.10:39970.service. Mar 17 18:49:07.938739 sshd[1897]: Accepted publickey for core from 10.200.16.10 port 39970 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:49:07.940376 sshd[1897]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:49:07.946191 systemd[1]: Started session-5.scope. 
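The "Accepted publickey ... RSA SHA256:Id7f..." entries carry OpenSSH-style fingerprints: the SHA-256 of the raw key blob, base64-encoded with the trailing padding stripped. A small sketch that reproduces that form from an authorized_keys-style line (the file path is illustrative):

    import base64
    import hashlib

    def ssh_fingerprint(pubkey_line):
        """Return an OpenSSH-style 'SHA256:...' fingerprint for a public key line."""
        blob = base64.b64decode(pubkey_line.split()[1])  # fields: type, blob, optional comment
        digest = hashlib.sha256(blob).digest()
        return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

    with open("/home/core/.ssh/authorized_keys") as f:
        for line in f:
            if line.strip() and not line.startswith("#"):
                print(ssh_fingerprint(line))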
Mar 17 18:49:07.946503 systemd-logind[1500]: New session 5 of user core. Mar 17 18:49:08.391490 sshd[1897]: pam_unix(sshd:session): session closed for user core Mar 17 18:49:08.394921 systemd[1]: sshd@2-10.200.8.36:22-10.200.16.10:39970.service: Deactivated successfully. Mar 17 18:49:08.396448 systemd[1]: session-5.scope: Deactivated successfully. Mar 17 18:49:08.396485 systemd-logind[1500]: Session 5 logged out. Waiting for processes to exit. Mar 17 18:49:08.397952 systemd-logind[1500]: Removed session 5. Mar 17 18:49:08.494707 systemd[1]: Started sshd@3-10.200.8.36:22-10.200.16.10:50900.service. Mar 17 18:49:09.116179 sshd[1904]: Accepted publickey for core from 10.200.16.10 port 50900 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:49:09.117834 sshd[1904]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:49:09.123641 systemd[1]: Started session-6.scope. Mar 17 18:49:09.123977 systemd-logind[1500]: New session 6 of user core. Mar 17 18:49:09.568471 sshd[1904]: pam_unix(sshd:session): session closed for user core Mar 17 18:49:09.571694 systemd[1]: sshd@3-10.200.8.36:22-10.200.16.10:50900.service: Deactivated successfully. Mar 17 18:49:09.573081 systemd-logind[1500]: Session 6 logged out. Waiting for processes to exit. Mar 17 18:49:09.573213 systemd[1]: session-6.scope: Deactivated successfully. Mar 17 18:49:09.574726 systemd-logind[1500]: Removed session 6. Mar 17 18:49:09.671785 systemd[1]: Started sshd@4-10.200.8.36:22-10.200.16.10:50912.service. Mar 17 18:49:10.294941 sshd[1911]: Accepted publickey for core from 10.200.16.10 port 50912 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:49:10.296641 sshd[1911]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:49:10.302366 systemd[1]: Started session-7.scope. Mar 17 18:49:10.302790 systemd-logind[1500]: New session 7 of user core. Mar 17 18:49:10.495190 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Mar 17 18:49:10.495476 systemd[1]: Stopped kubelet.service. Mar 17 18:49:10.497603 systemd[1]: Starting kubelet.service... Mar 17 18:49:10.584918 systemd[1]: Started kubelet.service. Mar 17 18:49:11.178123 kubelet[1923]: E0317 18:49:11.178073 1923 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:49:11.179830 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:49:11.180033 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:49:11.429988 sudo[1929]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Mar 17 18:49:11.430598 sudo[1929]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:49:11.454910 dbus-daemon[1487]: \xd0\xfd\u0006ԶU: received setenforce notice (enforcing=-1470225392) Mar 17 18:49:11.457116 sudo[1929]: pam_unix(sudo:session): session closed for user root Mar 17 18:49:11.574798 sshd[1911]: pam_unix(sshd:session): session closed for user core Mar 17 18:49:11.578629 systemd[1]: sshd@4-10.200.8.36:22-10.200.16.10:50912.service: Deactivated successfully. Mar 17 18:49:11.579896 systemd[1]: session-7.scope: Deactivated successfully. Mar 17 18:49:11.581891 systemd-logind[1500]: Session 7 logged out. 
Waiting for processes to exit. Mar 17 18:49:11.583495 systemd-logind[1500]: Removed session 7. Mar 17 18:49:11.677797 systemd[1]: Started sshd@5-10.200.8.36:22-10.200.16.10:50928.service. Mar 17 18:49:12.300598 sshd[1934]: Accepted publickey for core from 10.200.16.10 port 50928 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:49:12.302358 sshd[1934]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:49:12.308067 systemd[1]: Started session-8.scope. Mar 17 18:49:12.308318 systemd-logind[1500]: New session 8 of user core. Mar 17 18:49:12.645651 sudo[1939]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Mar 17 18:49:12.645955 sudo[1939]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:49:12.648894 sudo[1939]: pam_unix(sudo:session): session closed for user root Mar 17 18:49:12.653459 sudo[1938]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/bin/systemctl restart audit-rules Mar 17 18:49:12.653756 sudo[1938]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:49:12.662541 systemd[1]: Stopping audit-rules.service... Mar 17 18:49:12.662000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Mar 17 18:49:12.666900 kernel: kauditd_printk_skb: 79 callbacks suppressed Mar 17 18:49:12.666957 kernel: audit: type=1305 audit(1742237352.662:159): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Mar 17 18:49:12.667130 auditctl[1942]: No rules Mar 17 18:49:12.667552 systemd[1]: audit-rules.service: Deactivated successfully. Mar 17 18:49:12.667787 systemd[1]: Stopped audit-rules.service. Mar 17 18:49:12.669418 systemd[1]: Starting audit-rules.service... Mar 17 18:49:12.662000 audit[1942]: SYSCALL arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd44f57f10 a2=420 a3=0 items=0 ppid=1 pid=1942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:12.687604 kernel: audit: type=1300 audit(1742237352.662:159): arch=c000003e syscall=44 success=yes exit=1056 a0=3 a1=7ffd44f57f10 a2=420 a3=0 items=0 ppid=1 pid=1942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/sbin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:12.662000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D44 Mar 17 18:49:12.666000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:12.697203 augenrules[1960]: No rules Mar 17 18:49:12.698141 systemd[1]: Finished audit-rules.service. Mar 17 18:49:12.699148 sudo[1938]: pam_unix(sudo:session): session closed for user root Mar 17 18:49:12.701571 kernel: audit: type=1327 audit(1742237352.662:159): proctitle=2F7362696E2F617564697463746C002D44 Mar 17 18:49:12.701634 kernel: audit: type=1131 audit(1742237352.666:160): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:49:12.701663 kernel: audit: type=1130 audit(1742237352.696:161): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:12.696000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:12.696000 audit[1938]: USER_END pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:12.721793 kernel: audit: type=1106 audit(1742237352.696:162): pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:12.722693 kernel: audit: type=1104 audit(1742237352.696:163): pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:12.696000 audit[1938]: CRED_DISP pid=1938 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:12.805302 sshd[1934]: pam_unix(sshd:session): session closed for user core Mar 17 18:49:12.805000 audit[1934]: USER_END pid=1934 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:12.808787 systemd[1]: sshd@5-10.200.8.36:22-10.200.16.10:50928.service: Deactivated successfully. Mar 17 18:49:12.809612 systemd[1]: session-8.scope: Deactivated successfully. Mar 17 18:49:12.811008 systemd-logind[1500]: Session 8 logged out. Waiting for processes to exit. Mar 17 18:49:12.812087 systemd-logind[1500]: Removed session 8. 
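The audit records above and the netfilter entries that follow encode proctitle= as hex bytes with NUL separators between argv elements: 2F7362696E2F617564697463746C002D44 is "/sbin/auditctl -D" and 737368643A20636F7265205B707269765D is "sshd: core [priv]". A tiny decoder:

    def decode_proctitle(hex_value):
        """Turn an audit proctitle= hex blob into the space-joined command line."""
        return " ".join(bytes.fromhex(hex_value).decode(errors="replace").split("\x00"))

    # examples taken from the records above
    print(decode_proctitle("2F7362696E2F617564697463746C002D44"))   # /sbin/auditctl -D
    print(decode_proctitle("737368643A20636F7265205B707269765D"))   # sshd: core [priv]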
Mar 17 18:49:12.805000 audit[1934]: CRED_DISP pid=1934 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:12.831934 kernel: audit: type=1106 audit(1742237352.805:164): pid=1934 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:12.832002 kernel: audit: type=1104 audit(1742237352.805:165): pid=1934 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:12.832031 kernel: audit: type=1131 audit(1742237352.805:166): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.200.8.36:22-10.200.16.10:50928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:12.805000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.200.8.36:22-10.200.16.10:50928 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:12.907799 systemd[1]: Started sshd@6-10.200.8.36:22-10.200.16.10:50942.service. Mar 17 18:49:12.906000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.200.8.36:22-10.200.16.10:50942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:13.678000 audit[1967]: USER_ACCT pid=1967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:13.680308 sshd[1967]: Accepted publickey for core from 10.200.16.10 port 50942 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:49:13.679000 audit[1967]: CRED_ACQ pid=1967 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:13.679000 audit[1967]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffb6c2eb30 a2=3 a3=0 items=0 ppid=1 pid=1967 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:13.679000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:49:13.682020 sshd[1967]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:49:13.687769 systemd[1]: Started session-9.scope. Mar 17 18:49:13.688724 systemd-logind[1500]: New session 9 of user core. 
Mar 17 18:49:13.692000 audit[1967]: USER_START pid=1967 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:13.693000 audit[1970]: CRED_ACQ pid=1970 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:14.023000 audit[1971]: USER_ACCT pid=1971 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:14.025431 sudo[1971]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Mar 17 18:49:14.023000 audit[1971]: CRED_REFR pid=1971 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:14.025870 sudo[1971]: pam_unix(sudo:session): session opened for user root(uid=0) by (uid=500) Mar 17 18:49:14.025000 audit[1971]: USER_START pid=1971 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:14.050552 systemd[1]: Starting docker.service... Mar 17 18:49:14.086616 env[1981]: time="2025-03-17T18:49:14.086573654Z" level=info msg="Starting up" Mar 17 18:49:14.088177 env[1981]: time="2025-03-17T18:49:14.088151113Z" level=info msg="parsed scheme: \"unix\"" module=grpc Mar 17 18:49:14.088309 env[1981]: time="2025-03-17T18:49:14.088296018Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Mar 17 18:49:14.088376 env[1981]: time="2025-03-17T18:49:14.088364420Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Mar 17 18:49:14.088420 env[1981]: time="2025-03-17T18:49:14.088413022Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Mar 17 18:49:14.090290 env[1981]: time="2025-03-17T18:49:14.090272992Z" level=info msg="parsed scheme: \"unix\"" module=grpc Mar 17 18:49:14.090369 env[1981]: time="2025-03-17T18:49:14.090358295Z" level=info msg="scheme \"unix\" not registered, fallback to default scheme" module=grpc Mar 17 18:49:14.090425 env[1981]: time="2025-03-17T18:49:14.090414297Z" level=info msg="ccResolverWrapper: sending update to cc: {[{unix:///var/run/docker/libcontainerd/docker-containerd.sock 0 }] }" module=grpc Mar 17 18:49:14.090471 env[1981]: time="2025-03-17T18:49:14.090463199Z" level=info msg="ClientConn switching balancer to \"pick_first\"" module=grpc Mar 17 18:49:14.101037 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3805045763-merged.mount: Deactivated successfully. 
Mar 17 18:49:14.211063 env[1981]: time="2025-03-17T18:49:14.211025185Z" level=warning msg="Your kernel does not support cgroup blkio weight" Mar 17 18:49:14.211063 env[1981]: time="2025-03-17T18:49:14.211050586Z" level=warning msg="Your kernel does not support cgroup blkio weight_device" Mar 17 18:49:14.211333 env[1981]: time="2025-03-17T18:49:14.211265394Z" level=info msg="Loading containers: start." Mar 17 18:49:14.282000 audit[2008]: NETFILTER_CFG table=nat:5 family=2 entries=2 op=nft_register_chain pid=2008 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.282000 audit[2008]: SYSCALL arch=c000003e syscall=46 success=yes exit=116 a0=3 a1=7ffcae2a8710 a2=0 a3=7ffcae2a86fc items=0 ppid=1981 pid=2008 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.282000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Mar 17 18:49:14.284000 audit[2010]: NETFILTER_CFG table=filter:6 family=2 entries=2 op=nft_register_chain pid=2010 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.284000 audit[2010]: SYSCALL arch=c000003e syscall=46 success=yes exit=124 a0=3 a1=7fff3ee59730 a2=0 a3=7fff3ee5971c items=0 ppid=1981 pid=2010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.284000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Mar 17 18:49:14.286000 audit[2012]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2012 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.286000 audit[2012]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff7315fb70 a2=0 a3=7fff7315fb5c items=0 ppid=1981 pid=2012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.286000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Mar 17 18:49:14.288000 audit[2014]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2014 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.288000 audit[2014]: SYSCALL arch=c000003e syscall=46 success=yes exit=112 a0=3 a1=7fff9cd49f60 a2=0 a3=7fff9cd49f4c items=0 ppid=1981 pid=2014 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.288000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Mar 17 18:49:14.289000 audit[2016]: NETFILTER_CFG table=filter:9 family=2 entries=1 op=nft_register_rule pid=2016 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.289000 audit[2016]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffe4f39ed30 a2=0 a3=7ffe4f39ed1c items=0 ppid=1981 pid=2016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.289000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6A0052455455524E Mar 17 18:49:14.291000 audit[2018]: NETFILTER_CFG table=filter:10 family=2 entries=1 op=nft_register_rule pid=2018 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.291000 audit[2018]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffd30550640 a2=0 a3=7ffd3055062c items=0 ppid=1981 pid=2018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.291000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D49534F4C4154494F4E2D53544147452D32002D6A0052455455524E Mar 17 18:49:14.307000 audit[2020]: NETFILTER_CFG table=filter:11 family=2 entries=1 op=nft_register_chain pid=2020 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.307000 audit[2020]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7fff16d9aa50 a2=0 a3=7fff16d9aa3c items=0 ppid=1981 pid=2020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.307000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Mar 17 18:49:14.309000 audit[2022]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2022 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.309000 audit[2022]: SYSCALL arch=c000003e syscall=46 success=yes exit=212 a0=3 a1=7ffd6a56d7c0 a2=0 a3=7ffd6a56d7ac items=0 ppid=1981 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.309000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Mar 17 18:49:14.311000 audit[2024]: NETFILTER_CFG table=filter:13 family=2 entries=2 op=nft_register_chain pid=2024 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.311000 audit[2024]: SYSCALL arch=c000003e syscall=46 success=yes exit=308 a0=3 a1=7fff844553a0 a2=0 a3=7fff8445538c items=0 ppid=1981 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.311000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:49:14.325000 audit[2028]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_unregister_rule pid=2028 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.325000 audit[2028]: SYSCALL arch=c000003e syscall=46 success=yes exit=216 a0=3 a1=7ffe1fb56c30 a2=0 a3=7ffe1fb56c1c items=0 ppid=1981 pid=2028 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.325000 audit: PROCTITLE 
proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:49:14.329000 audit[2029]: NETFILTER_CFG table=filter:15 family=2 entries=1 op=nft_register_rule pid=2029 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.329000 audit[2029]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7ffebd2a7170 a2=0 a3=7ffebd2a715c items=0 ppid=1981 pid=2029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.329000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:49:14.373948 kernel: Initializing XFRM netlink socket Mar 17 18:49:14.398415 env[1981]: time="2025-03-17T18:49:14.398382458Z" level=info msg="Default bridge (docker0) is assigned with an IP address 172.17.0.0/16. Daemon option --bip can be used to set a preferred IP address" Mar 17 18:49:14.464000 audit[2037]: NETFILTER_CFG table=nat:16 family=2 entries=2 op=nft_register_chain pid=2037 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.464000 audit[2037]: SYSCALL arch=c000003e syscall=46 success=yes exit=492 a0=3 a1=7ffd84e88160 a2=0 a3=7ffd84e8814c items=0 ppid=1981 pid=2037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.464000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Mar 17 18:49:14.488000 audit[2040]: NETFILTER_CFG table=nat:17 family=2 entries=1 op=nft_register_rule pid=2040 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.488000 audit[2040]: SYSCALL arch=c000003e syscall=46 success=yes exit=288 a0=3 a1=7ffd6cf31390 a2=0 a3=7ffd6cf3137c items=0 ppid=1981 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.488000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Mar 17 18:49:14.491000 audit[2043]: NETFILTER_CFG table=filter:18 family=2 entries=1 op=nft_register_rule pid=2043 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.491000 audit[2043]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffc04f22b90 a2=0 a3=7ffc04f22b7c items=0 ppid=1981 pid=2043 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.491000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B657230002D6F00646F636B657230002D6A00414343455054 Mar 17 18:49:14.493000 audit[2045]: NETFILTER_CFG table=filter:19 family=2 entries=1 op=nft_register_rule pid=2045 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.493000 audit[2045]: SYSCALL arch=c000003e syscall=46 success=yes exit=376 a0=3 a1=7ffffea06e90 a2=0 a3=7ffffea06e7c items=0 ppid=1981 pid=2045 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.493000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6900646F636B6572300000002D6F00646F636B657230002D6A00414343455054 Mar 17 18:49:14.495000 audit[2047]: NETFILTER_CFG table=nat:20 family=2 entries=2 op=nft_register_chain pid=2047 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.495000 audit[2047]: SYSCALL arch=c000003e syscall=46 success=yes exit=356 a0=3 a1=7ffd4d0fc720 a2=0 a3=7ffd4d0fc70c items=0 ppid=1981 pid=2047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.495000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Mar 17 18:49:14.497000 audit[2049]: NETFILTER_CFG table=nat:21 family=2 entries=2 op=nft_register_chain pid=2049 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.497000 audit[2049]: SYSCALL arch=c000003e syscall=46 success=yes exit=444 a0=3 a1=7ffe53949710 a2=0 a3=7ffe539496fc items=0 ppid=1981 pid=2049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.497000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Mar 17 18:49:14.499000 audit[2051]: NETFILTER_CFG table=filter:22 family=2 entries=1 op=nft_register_rule pid=2051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.499000 audit[2051]: SYSCALL arch=c000003e syscall=46 success=yes exit=304 a0=3 a1=7ffe237f9570 a2=0 a3=7ffe237f955c items=0 ppid=1981 pid=2051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.499000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6A00444F434B4552 Mar 17 18:49:14.501000 audit[2053]: NETFILTER_CFG table=filter:23 family=2 entries=1 op=nft_register_rule pid=2053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.501000 audit[2053]: SYSCALL arch=c000003e syscall=46 success=yes exit=508 a0=3 a1=7ffcab7ed530 a2=0 a3=7ffcab7ed51c items=0 ppid=1981 pid=2053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.501000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Mar 17 18:49:14.503000 audit[2055]: NETFILTER_CFG table=filter:24 family=2 entries=1 op=nft_register_rule pid=2055 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.503000 audit[2055]: 
SYSCALL arch=c000003e syscall=46 success=yes exit=240 a0=3 a1=7fffb3083990 a2=0 a3=7fffb308397c items=0 ppid=1981 pid=2055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.503000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Mar 17 18:49:14.505000 audit[2057]: NETFILTER_CFG table=filter:25 family=2 entries=1 op=nft_register_rule pid=2057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.505000 audit[2057]: SYSCALL arch=c000003e syscall=46 success=yes exit=428 a0=3 a1=7fff5ec8fbb0 a2=0 a3=7fff5ec8fb9c items=0 ppid=1981 pid=2057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.505000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Mar 17 18:49:14.507000 audit[2059]: NETFILTER_CFG table=filter:26 family=2 entries=1 op=nft_register_rule pid=2059 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.507000 audit[2059]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd726909a0 a2=0 a3=7ffd7269098c items=0 ppid=1981 pid=2059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.507000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Mar 17 18:49:14.509743 systemd-networkd[1687]: docker0: Link UP Mar 17 18:49:14.523000 audit[2063]: NETFILTER_CFG table=filter:27 family=2 entries=1 op=nft_unregister_rule pid=2063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.523000 audit[2063]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fff95b04d50 a2=0 a3=7fff95b04d3c items=0 ppid=1981 pid=2063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.523000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4400464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:49:14.527000 audit[2064]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_rule pid=2064 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:14.527000 audit[2064]: SYSCALL arch=c000003e syscall=46 success=yes exit=224 a0=3 a1=7fffb3119270 a2=0 a3=7fffb311925c items=0 ppid=1981 pid=2064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:14.527000 audit: PROCTITLE proctitle=2F7573722F7362696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Mar 17 18:49:14.529996 env[1981]: time="2025-03-17T18:49:14.529957054Z" level=info msg="Loading containers: done." 
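
The audit PROCTITLE fields above carry the exact iptables invocation as a hex-encoded, NUL-separated argv; decoding them shows Docker installing its DOCKER-USER, DOCKER-ISOLATION-STAGE-1/2 and MASQUERADE rules. A minimal Python sketch, using the last PROCTITLE value from the lines above:

    # Decode an audit PROCTITLE value (hex-encoded argv, NUL-separated).
    hexstr = (
        "2F7573722F7362696E2F69707461626C6573002D2D77616974"
        "002D4900464F5257415244002D6A00444F434B45522D55534552"
    )
    argv = bytes.fromhex(hexstr).split(b"\x00")
    print(" ".join(a.decode() for a in argv))
    # -> /usr/sbin/iptables --wait -I FORWARD -j DOCKER-USER
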
Mar 17 18:49:14.542346 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2090129851-merged.mount: Deactivated successfully. Mar 17 18:49:14.594276 env[1981]: time="2025-03-17T18:49:14.594216545Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Mar 17 18:49:14.594576 env[1981]: time="2025-03-17T18:49:14.594544558Z" level=info msg="Docker daemon" commit=112bdf3343 graphdriver(s)=overlay2 version=20.10.23 Mar 17 18:49:14.594736 env[1981]: time="2025-03-17T18:49:14.594712264Z" level=info msg="Daemon has completed initialization" Mar 17 18:49:14.624562 systemd[1]: Started docker.service. Mar 17 18:49:14.623000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:14.635912 env[1981]: time="2025-03-17T18:49:14.635767892Z" level=info msg="API listen on /run/docker.sock" Mar 17 18:49:21.245185 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Mar 17 18:49:21.245507 systemd[1]: Stopped kubelet.service. Mar 17 18:49:21.256420 kernel: kauditd_printk_skb: 84 callbacks suppressed Mar 17 18:49:21.256524 kernel: audit: type=1130 audit(1742237361.243:201): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:21.243000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:21.247509 systemd[1]: Starting kubelet.service... Mar 17 18:49:21.243000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:21.268712 kernel: audit: type=1131 audit(1742237361.243:202): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:21.871000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:21.872522 systemd[1]: Started kubelet.service. Mar 17 18:49:21.885706 kernel: audit: type=1130 audit(1742237361.871:203): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:21.917473 kubelet[2110]: E0317 18:49:21.917427 2110 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:49:21.918985 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:49:21.919189 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
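
At this point the Docker daemon has completed initialization and reports its API on /run/docker.sock, while kubelet keeps restarting because /var/lib/kubelet/config.yaml has not been written yet. One quick way to confirm the daemon really answers on that socket is a raw request to its /_ping endpoint; a minimal sketch (the /_ping path and the "OK" reply are standard Docker Engine API behaviour, not taken from this log):

    import socket

    # Talk HTTP/1.0 directly over the Unix socket the daemon announced above.
    with socket.socket(socket.AF_UNIX, socket.SOCK_STREAM) as s:
        s.connect("/run/docker.sock")
        s.sendall(b"GET /_ping HTTP/1.0\r\nHost: docker\r\n\r\n")
        reply = s.recv(4096).decode(errors="replace")
    print(reply.splitlines()[0])   # expected: HTTP/1.0 200 OK
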
Mar 17 18:49:21.917000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 18:49:21.933307 kernel: audit: type=1131 audit(1742237361.917:204): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 18:49:23.202957 env[1524]: time="2025-03-17T18:49:23.202890783Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\"" Mar 17 18:49:23.881522 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1516539577.mount: Deactivated successfully. Mar 17 18:49:25.947856 env[1524]: time="2025-03-17T18:49:25.947789740Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:25.952903 env[1524]: time="2025-03-17T18:49:25.952859779Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:25.956517 env[1524]: time="2025-03-17T18:49:25.956476479Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-apiserver:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:25.960366 env[1524]: time="2025-03-17T18:49:25.960334084Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-apiserver@sha256:77c54346965036acc7ac95c3200597ede36db9246179248dde21c1a3ecc1caf0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:25.960987 env[1524]: time="2025-03-17T18:49:25.960956202Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.30.11\" returns image reference \"sha256:4db5a05c271eac8f5da2f95895ea1ccb9a38f48db3135ba3bdfe35941a396ea8\"" Mar 17 18:49:25.974604 env[1524]: time="2025-03-17T18:49:25.974565075Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\"" Mar 17 18:49:28.033257 env[1524]: time="2025-03-17T18:49:28.033193490Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:28.040485 env[1524]: time="2025-03-17T18:49:28.040389772Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:28.049418 env[1524]: time="2025-03-17T18:49:28.049380400Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-controller-manager:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:28.055144 env[1524]: time="2025-03-17T18:49:28.055109545Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-controller-manager@sha256:d8874f3fb45591ecdac67a3035c730808f18b3ab13147495c7d77eb1960d4f6f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:28.055822 env[1524]: time="2025-03-17T18:49:28.055791463Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.30.11\" returns image reference 
\"sha256:de1025c2d496829d3250130380737609ffcdd10a4dce6f2dcd03f23a85a15e6a\"" Mar 17 18:49:28.066104 env[1524]: time="2025-03-17T18:49:28.066068723Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\"" Mar 17 18:49:29.460815 env[1524]: time="2025-03-17T18:49:29.460759965Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:29.474589 env[1524]: time="2025-03-17T18:49:29.474530005Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:29.482514 env[1524]: time="2025-03-17T18:49:29.482472401Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-scheduler:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:29.487680 env[1524]: time="2025-03-17T18:49:29.487631529Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-scheduler@sha256:c699f8c97ae7ec819c8bd878d3db104ba72fc440d810d9030e09286b696017b5,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:29.488337 env[1524]: time="2025-03-17T18:49:29.488307145Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.30.11\" returns image reference \"sha256:11492f0faf138e933cadd6f533f03e401da9a35e53711e833f18afa6b185b2b7\"" Mar 17 18:49:29.498151 env[1524]: time="2025-03-17T18:49:29.498124688Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\"" Mar 17 18:49:30.696518 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3678815681.mount: Deactivated successfully. Mar 17 18:49:31.291327 env[1524]: time="2025-03-17T18:49:31.291274436Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:31.296575 env[1524]: time="2025-03-17T18:49:31.296532059Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:31.301008 env[1524]: time="2025-03-17T18:49:31.300973763Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/kube-proxy:v1.30.11,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:31.304537 env[1524]: time="2025-03-17T18:49:31.304452744Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/kube-proxy@sha256:ea4da798040a18ed3f302e8d5f67307c7275a2a53bcf3d51bcec223acda84a55,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:31.305022 env[1524]: time="2025-03-17T18:49:31.304991957Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.30.11\" returns image reference \"sha256:01045f200a8856c3f5ccfa7be03d72274f1f16fc7a047659e709d603d5c019dc\"" Mar 17 18:49:31.315142 env[1524]: time="2025-03-17T18:49:31.315111794Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\"" Mar 17 18:49:31.865756 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1176407290.mount: Deactivated successfully. Mar 17 18:49:31.995052 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 7. 
Mar 17 18:49:32.010323 kernel: audit: type=1130 audit(1742237371.994:205): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:31.994000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:31.995332 systemd[1]: Stopped kubelet.service. Mar 17 18:49:31.997288 systemd[1]: Starting kubelet.service... Mar 17 18:49:31.994000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:32.024792 kernel: audit: type=1131 audit(1742237371.994:206): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:32.091344 systemd[1]: Started kubelet.service. Mar 17 18:49:32.091000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:32.105692 kernel: audit: type=1130 audit(1742237372.091:207): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:32.137505 kubelet[2147]: E0317 18:49:32.137022 2147 run.go:74] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Mar 17 18:49:32.139008 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Mar 17 18:49:32.139199 systemd[1]: kubelet.service: Failed with result 'exit-code'. Mar 17 18:49:32.138000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Mar 17 18:49:32.150695 kernel: audit: type=1131 audit(1742237372.138:208): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Mar 17 18:49:33.772257 env[1524]: time="2025-03-17T18:49:33.772201418Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns:v1.11.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:33.778341 env[1524]: time="2025-03-17T18:49:33.778295253Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:33.782885 env[1524]: time="2025-03-17T18:49:33.782847654Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/coredns/coredns:v1.11.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:33.787147 env[1524]: time="2025-03-17T18:49:33.787110949Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/coredns/coredns@sha256:1eeb4c7316bacb1d4c8ead65571cd92dd21e27359f0d4917f1a5822a73b75db1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:33.787857 env[1524]: time="2025-03-17T18:49:33.787826365Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.1\" returns image reference \"sha256:cbb01a7bd410dc08ba382018ab909a674fb0e48687f0c00797ed5bc34fcc6bb4\"" Mar 17 18:49:33.797477 env[1524]: time="2025-03-17T18:49:33.797445679Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\"" Mar 17 18:49:34.213033 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1456138967.mount: Deactivated successfully. Mar 17 18:49:34.232965 env[1524]: time="2025-03-17T18:49:34.232914335Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:34.240375 env[1524]: time="2025-03-17T18:49:34.240337996Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:34.258067 env[1524]: time="2025-03-17T18:49:34.258021779Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:34.265771 env[1524]: time="2025-03-17T18:49:34.265729546Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:7031c1b283388d2c2e09b57badb803c05ebed362dc88d84b480cc47f72a21097,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:34.266361 env[1524]: time="2025-03-17T18:49:34.266327459Z" level=info msg="PullImage \"registry.k8s.io/pause:3.9\" returns image reference \"sha256:e6f1816883972d4be47bd48879a08919b96afcd344132622e4d444987919323c\"" Mar 17 18:49:34.275997 env[1524]: time="2025-03-17T18:49:34.275970468Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\"" Mar 17 18:49:34.839915 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount120236356.mount: Deactivated successfully. 
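
The transient mount units above (var-lib-containerd-tmpmounts-containerd\x2dmount120236356.mount and similar) are just filesystem paths run through systemd's unit-name escaping: the leading "/" is dropped, remaining "/" become "-", and characters outside [A-Za-z0-9_] (including "-" itself) are written as \xNN. A small sketch that reverses the escaping, equivalent in spirit to `systemd-escape --unescape --path`:

    import re

    def unescape_unit_path(unit: str) -> str:
        """Turn a systemd mount-unit name back into the path it was made from."""
        name = unit.removesuffix(".mount")
        name = name.replace("-", "/")                        # "-" separators were "/"
        name = re.sub(r"\\x([0-9a-fA-F]{2})",                # \xNN escapes back to chars
                      lambda m: chr(int(m.group(1), 16)), name)
        return "/" + name

    print(unescape_unit_path(r"var-lib-containerd-tmpmounts-containerd\x2dmount120236356.mount"))
    # -> /var/lib/containerd/tmpmounts/containerd-mount120236356
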
Mar 17 18:49:37.645934 env[1524]: time="2025-03-17T18:49:37.645875325Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd:3.5.12-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:37.651085 env[1524]: time="2025-03-17T18:49:37.651044629Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:37.655772 env[1524]: time="2025-03-17T18:49:37.655736823Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/etcd:3.5.12-0,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:37.659846 env[1524]: time="2025-03-17T18:49:37.659808905Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/etcd@sha256:44a8e24dcbba3470ee1fee21d5e88d128c936e9b55d4bc51fbef8086f8ed123b,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:37.660585 env[1524]: time="2025-03-17T18:49:37.660548620Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.12-0\" returns image reference \"sha256:3861cfcd7c04ccac1f062788eca39487248527ef0c0cfd477a83d7691a75a899\"" Mar 17 18:49:40.392000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:40.392632 systemd[1]: Stopped kubelet.service. Mar 17 18:49:40.396002 systemd[1]: Starting kubelet.service... Mar 17 18:49:40.392000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:40.419412 kernel: audit: type=1130 audit(1742237380.392:209): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:40.419504 kernel: audit: type=1131 audit(1742237380.392:210): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:40.419847 systemd[1]: Reloading. Mar 17 18:49:40.529230 /usr/lib/systemd/system-generators/torcx-generator[2250]: time="2025-03-17T18:49:40Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 18:49:40.529267 /usr/lib/systemd/system-generators/torcx-generator[2250]: time="2025-03-17T18:49:40Z" level=info msg="torcx already run" Mar 17 18:49:40.635933 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:49:40.635959 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:49:40.657611 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. 
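
The containerd timestamps make it easy to measure how long a pull took: the etcd:3.5.12-0 pull starts at 2025-03-17T18:49:34.275970468Z and returns its image reference at 2025-03-17T18:49:37.660548620Z, roughly 3.4 seconds later. A minimal sketch of that arithmetic (fractional seconds are truncated to microseconds before parsing so the standard library can handle the nanosecond precision):

    from datetime import datetime

    def parse_ts(ts: str) -> datetime:
        head, frac = ts.rstrip("Z").split(".")
        return datetime.strptime(f"{head}.{frac[:6]}", "%Y-%m-%dT%H:%M:%S.%f")

    start = parse_ts("2025-03-17T18:49:34.275970468Z")   # PullImage etcd:3.5.12-0
    done  = parse_ts("2025-03-17T18:49:37.660548620Z")   # returns image reference
    print((done - start).total_seconds())                # ~3.38 s
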
Mar 17 18:49:40.758194 systemd[1]: Started kubelet.service. Mar 17 18:49:40.757000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:40.772687 kernel: audit: type=1130 audit(1742237380.757:211): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:40.780155 systemd[1]: Stopping kubelet.service... Mar 17 18:49:40.781985 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 18:49:40.782299 systemd[1]: Stopped kubelet.service. Mar 17 18:49:40.785509 systemd[1]: Starting kubelet.service... Mar 17 18:49:40.781000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:40.800695 kernel: audit: type=1131 audit(1742237380.781:212): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:41.052010 systemd[1]: Started kubelet.service. Mar 17 18:49:41.051000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:41.069699 kernel: audit: type=1130 audit(1742237381.051:213): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:41.098302 kubelet[2339]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:49:41.098696 kubelet[2339]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 18:49:41.098758 kubelet[2339]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
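
The earlier restart loop failed only because /var/lib/kubelet/config.yaml did not exist yet, and the deprecation warnings above indicate the remaining command-line flags belong in that same config file. A minimal sketch of what such a KubeletConfiguration could look like; the field values are illustrative assumptions (this host's actual config is not shown in the log), written as Python so the example stays in one language:

    import textwrap

    # Illustrative KubeletConfiguration; values below are assumptions,
    # not recovered from this machine.
    MINIMAL_KUBELET_CONFIG = textwrap.dedent("""\
        apiVersion: kubelet.config.k8s.io/v1beta1
        kind: KubeletConfiguration
        containerRuntimeEndpoint: unix:///run/containerd/containerd.sock
        staticPodPath: /etc/kubernetes/manifests
        """)

    with open("/var/lib/kubelet/config.yaml", "w") as f:
        f.write(MINIMAL_KUBELET_CONFIG)
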
Mar 17 18:49:41.098861 kubelet[2339]: I0317 18:49:41.098844 2339 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 18:49:41.433271 kubelet[2339]: I0317 18:49:41.432892 2339 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 18:49:41.433271 kubelet[2339]: I0317 18:49:41.432918 2339 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 18:49:41.433466 kubelet[2339]: I0317 18:49:41.433374 2339 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 18:49:41.759621 kubelet[2339]: I0317 18:49:41.758954 2339 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 18:49:41.767231 kubelet[2339]: E0317 18:49:41.767208 2339 certificate_manager.go:562] kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post "https://10.200.8.36:6443/apis/certificates.k8s.io/v1/certificatesigningrequests": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:41.782872 kubelet[2339]: I0317 18:49:41.782846 2339 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Mar 17 18:49:41.786716 kubelet[2339]: I0317 18:49:41.786681 2339 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 18:49:41.786908 kubelet[2339]: I0317 18:49:41.786717 2339 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-3510.3.7-a-b312ad98ee","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 18:49:41.787313 kubelet[2339]: I0317 18:49:41.787293 2339 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 18:49:41.787373 kubelet[2339]: I0317 18:49:41.787317 2339 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 18:49:41.787461 kubelet[2339]: I0317 18:49:41.787446 2339 state_mem.go:36] "Initialized new in-memory 
state store" Mar 17 18:49:41.788448 kubelet[2339]: I0317 18:49:41.788431 2339 kubelet.go:400] "Attempting to sync node with API server" Mar 17 18:49:41.788539 kubelet[2339]: I0317 18:49:41.788454 2339 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 18:49:41.788539 kubelet[2339]: I0317 18:49:41.788483 2339 kubelet.go:312] "Adding apiserver pod source" Mar 17 18:49:41.788539 kubelet[2339]: I0317 18:49:41.788502 2339 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 18:49:41.795605 kubelet[2339]: W0317 18:49:41.795141 2339 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.36:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:41.795605 kubelet[2339]: E0317 18:49:41.795205 2339 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.8.36:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:41.795605 kubelet[2339]: W0317 18:49:41.795276 2339 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.7-a-b312ad98ee&limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:41.795605 kubelet[2339]: E0317 18:49:41.795313 2339 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.8.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.7-a-b312ad98ee&limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:41.795854 kubelet[2339]: I0317 18:49:41.795775 2339 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Mar 17 18:49:41.803052 kubelet[2339]: I0317 18:49:41.803028 2339 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 18:49:41.803136 kubelet[2339]: W0317 18:49:41.803083 2339 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Mar 17 18:49:41.803974 kubelet[2339]: I0317 18:49:41.803869 2339 server.go:1264] "Started kubelet" Mar 17 18:49:41.815541 kubelet[2339]: E0317 18:49:41.815419 2339 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.36:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.36:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-3510.3.7-a-b312ad98ee.182dabab00d32941 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-3510.3.7-a-b312ad98ee,UID:ci-3510.3.7-a-b312ad98ee,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-3510.3.7-a-b312ad98ee,},FirstTimestamp:2025-03-17 18:49:41.803845953 +0000 UTC m=+0.745247443,LastTimestamp:2025-03-17 18:49:41.803845953 +0000 UTC m=+0.745247443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-3510.3.7-a-b312ad98ee,}" Mar 17 18:49:41.815826 kubelet[2339]: I0317 18:49:41.815783 2339 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 18:49:41.816633 kubelet[2339]: I0317 18:49:41.816218 2339 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 18:49:41.831725 kernel: audit: type=1400 audit(1742237381.816:214): avc: denied { mac_admin } for pid=2339 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:41.816000 audit[2339]: AVC avc: denied { mac_admin } for pid=2339 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:41.831911 kubelet[2339]: I0317 18:49:41.817565 2339 kubelet.go:1419] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Mar 17 18:49:41.831911 kubelet[2339]: I0317 18:49:41.817609 2339 kubelet.go:1423] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Mar 17 18:49:41.831911 kubelet[2339]: I0317 18:49:41.817680 2339 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 18:49:41.831911 kubelet[2339]: E0317 18:49:41.819884 2339 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 18:49:41.831911 kubelet[2339]: I0317 18:49:41.819956 2339 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 18:49:41.831911 kubelet[2339]: I0317 18:49:41.820899 2339 server.go:455] "Adding debug handlers to kubelet server" Mar 17 18:49:41.831911 kubelet[2339]: I0317 18:49:41.825264 2339 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 18:49:41.831911 kubelet[2339]: I0317 18:49:41.825369 2339 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 18:49:41.831911 kubelet[2339]: I0317 18:49:41.825422 2339 reconciler.go:26] "Reconciler: start to sync state" Mar 17 18:49:41.831911 kubelet[2339]: W0317 18:49:41.825659 2339 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:41.831911 kubelet[2339]: E0317 18:49:41.825708 2339 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.8.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:41.832414 kubelet[2339]: E0317 18:49:41.826752 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.7-a-b312ad98ee?timeout=10s\": dial tcp 10.200.8.36:6443: connect: connection refused" interval="200ms" Mar 17 18:49:41.832414 kubelet[2339]: I0317 18:49:41.827104 2339 factory.go:221] Registration of the systemd container factory successfully Mar 17 18:49:41.832414 kubelet[2339]: I0317 18:49:41.827159 2339 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 18:49:41.832414 kubelet[2339]: I0317 18:49:41.828182 2339 factory.go:221] Registration of the containerd container factory successfully Mar 17 18:49:41.816000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:49:41.841048 kernel: audit: type=1401 audit(1742237381.816:214): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:49:41.816000 audit[2339]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000975710 a1=c000973758 a2=c0009756e0 a3=25 items=0 ppid=1 pid=2339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.816000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:49:41.876980 kernel: audit: type=1300 audit(1742237381.816:214): arch=c000003e syscall=188 success=no exit=-22 a0=c000975710 a1=c000973758 a2=c0009756e0 a3=25 items=0 ppid=1 pid=2339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.877074 
kernel: audit: type=1327 audit(1742237381.816:214): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:49:41.877107 kernel: audit: type=1400 audit(1742237381.816:215): avc: denied { mac_admin } for pid=2339 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:41.816000 audit[2339]: AVC avc: denied { mac_admin } for pid=2339 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:41.816000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:49:41.816000 audit[2339]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000a0c140 a1=c000973770 a2=c0009757a0 a3=25 items=0 ppid=1 pid=2339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.816000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:49:41.840000 audit[2349]: NETFILTER_CFG table=mangle:29 family=2 entries=2 op=nft_register_chain pid=2349 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:41.840000 audit[2349]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffcd7024170 a2=0 a3=7ffcd702415c items=0 ppid=2339 pid=2349 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.840000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Mar 17 18:49:41.851000 audit[2350]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_chain pid=2350 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:41.851000 audit[2350]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffca61ee090 a2=0 a3=7ffca61ee07c items=0 ppid=2339 pid=2350 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.851000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Mar 17 18:49:41.851000 audit[2352]: NETFILTER_CFG table=filter:31 family=2 entries=2 op=nft_register_chain pid=2352 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:41.851000 audit[2352]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffc28eb8590 a2=0 a3=7ffc28eb857c items=0 ppid=2339 pid=2352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.851000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:49:41.851000 audit[2354]: NETFILTER_CFG table=filter:32 family=2 entries=2 op=nft_register_chain pid=2354 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:41.851000 audit[2354]: SYSCALL arch=c000003e syscall=46 success=yes exit=312 a0=3 a1=7ffd98c52200 a2=0 a3=7ffd98c521ec items=0 ppid=2339 pid=2354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.851000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:49:41.899000 audit[2360]: NETFILTER_CFG table=filter:33 family=2 entries=1 op=nft_register_rule pid=2360 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:41.899000 audit[2360]: SYSCALL arch=c000003e syscall=46 success=yes exit=924 a0=3 a1=7ffca39c8c00 a2=0 a3=7ffca39c8bec items=0 ppid=2339 pid=2360 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.899000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Mar 17 18:49:41.900724 kubelet[2339]: I0317 18:49:41.900641 2339 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 18:49:41.901000 audit[2363]: NETFILTER_CFG table=mangle:34 family=2 entries=1 op=nft_register_chain pid=2363 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:41.901000 audit[2363]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff26610fb0 a2=0 a3=7fff26610f9c items=0 ppid=2339 pid=2363 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.901000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Mar 17 18:49:41.901000 audit[2362]: NETFILTER_CFG table=mangle:35 family=10 entries=2 op=nft_register_chain pid=2362 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:49:41.901000 audit[2362]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd7e2a9890 a2=0 a3=10e3 items=0 ppid=2339 pid=2362 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.901000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Mar 17 18:49:41.902901 kubelet[2339]: I0317 18:49:41.902873 2339 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 18:49:41.903071 kubelet[2339]: I0317 18:49:41.903052 2339 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 18:49:41.903211 kubelet[2339]: I0317 18:49:41.903196 2339 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 18:49:41.903382 kubelet[2339]: E0317 18:49:41.903354 2339 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 18:49:41.903000 audit[2366]: NETFILTER_CFG table=nat:36 family=2 entries=1 op=nft_register_chain pid=2366 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:41.903000 audit[2366]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdead0d9d0 a2=0 a3=7ffdead0d9bc items=0 ppid=2339 pid=2366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.903000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Mar 17 18:49:41.904000 audit[2367]: NETFILTER_CFG table=mangle:37 family=10 entries=1 op=nft_register_chain pid=2367 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:49:41.904000 audit[2367]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fffe62e4b90 a2=0 a3=7fffe62e4b7c items=0 ppid=2339 pid=2367 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.904000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Mar 17 18:49:41.905389 kubelet[2339]: W0317 18:49:41.905323 2339 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:41.905485 kubelet[2339]: E0317 18:49:41.905406 2339 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.8.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:41.905000 audit[2369]: NETFILTER_CFG table=nat:38 family=10 entries=2 op=nft_register_chain pid=2369 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:49:41.905000 audit[2369]: SYSCALL arch=c000003e syscall=46 success=yes exit=128 a0=3 a1=7ffc2ae291c0 a2=0 a3=7ffc2ae291ac items=0 ppid=2339 pid=2369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.905000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Mar 17 18:49:41.906000 audit[2368]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_chain pid=2368 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:49:41.906000 audit[2368]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe126d6140 a2=0 a3=7ffe126d612c items=0 ppid=2339 pid=2368 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.906000 audit[2370]: NETFILTER_CFG table=filter:40 family=10 entries=2 op=nft_register_chain pid=2370 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:49:41.906000 audit[2370]: SYSCALL arch=c000003e syscall=46 success=yes exit=136 a0=3 a1=7ffd80e888c0 a2=0 a3=7ffd80e888ac items=0 ppid=2339 pid=2370 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.906000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Mar 17 18:49:41.906000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Mar 17 18:49:41.937228 kubelet[2339]: I0317 18:49:41.937195 2339 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:41.937969 kubelet[2339]: E0317 18:49:41.937936 2339 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.36:6443/api/v1/nodes\": dial tcp 10.200.8.36:6443: connect: connection refused" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:41.938107 kubelet[2339]: I0317 18:49:41.938055 2339 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 18:49:41.938107 kubelet[2339]: I0317 18:49:41.938070 2339 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 18:49:41.938107 kubelet[2339]: I0317 18:49:41.938091 2339 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:49:41.944565 kubelet[2339]: I0317 18:49:41.944528 2339 policy_none.go:49] "None policy: Start" Mar 17 18:49:41.945614 kubelet[2339]: I0317 18:49:41.945590 2339 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 18:49:41.945614 kubelet[2339]: I0317 18:49:41.945613 2339 state_mem.go:35] "Initializing new in-memory state store" Mar 17 18:49:41.954093 kubelet[2339]: I0317 18:49:41.954068 2339 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 18:49:41.953000 audit[2339]: AVC avc: denied { mac_admin } for pid=2339 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:41.953000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:49:41.953000 audit[2339]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000cfc2a0 a1=c00080f878 a2=c000cfc270 a3=25 items=0 ppid=1 pid=2339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:41.953000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:49:41.954436 kubelet[2339]: I0317 18:49:41.954123 2339 server.go:88] "Unprivileged containerized plugins might not work. 
Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Mar 17 18:49:41.954436 kubelet[2339]: I0317 18:49:41.954235 2339 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 18:49:41.954436 kubelet[2339]: I0317 18:49:41.954350 2339 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 18:49:41.958269 kubelet[2339]: E0317 18:49:41.958250 2339 eviction_manager.go:282] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-3510.3.7-a-b312ad98ee\" not found" Mar 17 18:49:42.003782 kubelet[2339]: I0317 18:49:42.003631 2339 topology_manager.go:215] "Topology Admit Handler" podUID="c0366fd132b6280150276f513e236bc5" podNamespace="kube-system" podName="kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.005653 kubelet[2339]: I0317 18:49:42.005621 2339 topology_manager.go:215] "Topology Admit Handler" podUID="d04837916cdbc90c12e369201ae01ea9" podNamespace="kube-system" podName="kube-scheduler-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.007879 kubelet[2339]: I0317 18:49:42.007843 2339 topology_manager.go:215] "Topology Admit Handler" podUID="1ced6517b82b8186f353c59555a4eab9" podNamespace="kube-system" podName="kube-apiserver-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.026892 kubelet[2339]: I0317 18:49:42.026796 2339 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1ced6517b82b8186f353c59555a4eab9-k8s-certs\") pod \"kube-apiserver-ci-3510.3.7-a-b312ad98ee\" (UID: \"1ced6517b82b8186f353c59555a4eab9\") " pod="kube-system/kube-apiserver-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.026892 kubelet[2339]: I0317 18:49:42.026831 2339 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1ced6517b82b8186f353c59555a4eab9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.7-a-b312ad98ee\" (UID: \"1ced6517b82b8186f353c59555a4eab9\") " pod="kube-system/kube-apiserver-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.026892 kubelet[2339]: I0317 18:49:42.026857 2339 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.027561 kubelet[2339]: E0317 18:49:42.027521 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.7-a-b312ad98ee?timeout=10s\": dial tcp 10.200.8.36:6443: connect: connection refused" interval="400ms" Mar 17 18:49:42.028779 kubelet[2339]: I0317 18:49:42.026877 2339 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.028779 kubelet[2339]: I0317 18:49:42.028550 2339 reconciler_common.go:247] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1ced6517b82b8186f353c59555a4eab9-ca-certs\") pod \"kube-apiserver-ci-3510.3.7-a-b312ad98ee\" (UID: \"1ced6517b82b8186f353c59555a4eab9\") " pod="kube-system/kube-apiserver-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.028779 kubelet[2339]: I0317 18:49:42.028579 2339 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d04837916cdbc90c12e369201ae01ea9-kubeconfig\") pod \"kube-scheduler-ci-3510.3.7-a-b312ad98ee\" (UID: \"d04837916cdbc90c12e369201ae01ea9\") " pod="kube-system/kube-scheduler-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.028779 kubelet[2339]: I0317 18:49:42.028602 2339 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-ca-certs\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.028779 kubelet[2339]: I0317 18:49:42.028650 2339 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.029131 kubelet[2339]: I0317 18:49:42.028689 2339 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.140330 kubelet[2339]: I0317 18:49:42.140290 2339 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.140994 kubelet[2339]: E0317 18:49:42.140960 2339 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.36:6443/api/v1/nodes\": dial tcp 10.200.8.36:6443: connect: connection refused" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.314301 env[1524]: time="2025-03-17T18:49:42.313960987Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.7-a-b312ad98ee,Uid:c0366fd132b6280150276f513e236bc5,Namespace:kube-system,Attempt:0,}" Mar 17 18:49:42.316513 env[1524]: time="2025-03-17T18:49:42.316468732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.7-a-b312ad98ee,Uid:d04837916cdbc90c12e369201ae01ea9,Namespace:kube-system,Attempt:0,}" Mar 17 18:49:42.318701 env[1524]: time="2025-03-17T18:49:42.318443867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.7-a-b312ad98ee,Uid:1ced6517b82b8186f353c59555a4eab9,Namespace:kube-system,Attempt:0,}" Mar 17 18:49:42.428765 kubelet[2339]: E0317 18:49:42.428704 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.7-a-b312ad98ee?timeout=10s\": dial tcp 10.200.8.36:6443: connect: connection refused" 
interval="800ms" Mar 17 18:49:42.543093 kubelet[2339]: I0317 18:49:42.543056 2339 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.543494 kubelet[2339]: E0317 18:49:42.543447 2339 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.36:6443/api/v1/nodes\": dial tcp 10.200.8.36:6443: connect: connection refused" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:42.668696 kubelet[2339]: W0317 18:49:42.668553 2339 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.36:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:42.668696 kubelet[2339]: E0317 18:49:42.668616 2339 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get "https://10.200.8.36:6443/api/v1/services?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:42.844678 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1142696480.mount: Deactivated successfully. Mar 17 18:49:42.875723 env[1524]: time="2025-03-17T18:49:42.875654248Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.878504 env[1524]: time="2025-03-17T18:49:42.878463398Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.885728 env[1524]: time="2025-03-17T18:49:42.885693126Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.888817 env[1524]: time="2025-03-17T18:49:42.888784181Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.892805 env[1524]: time="2025-03-17T18:49:42.892774652Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.898089 env[1524]: time="2025-03-17T18:49:42.898057645Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.902085 env[1524]: time="2025-03-17T18:49:42.902051816Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:6270bb605e12e581514ada5fd5b3216f727db55dc87d5889c790e4c760683fee,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.905588 env[1524]: time="2025-03-17T18:49:42.905555878Z" level=info msg="ImageCreate event &ImageCreate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.909101 env[1524]: time="2025-03-17T18:49:42.909066040Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.911947 env[1524]: time="2025-03-17T18:49:42.911918191Z" 
level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause:3.6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.917466 env[1524]: time="2025-03-17T18:49:42.917432489Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.922653 env[1524]: time="2025-03-17T18:49:42.921874768Z" level=info msg="ImageUpdate event &ImageUpdate{Name:registry.k8s.io/pause@sha256:3d380ca8864549e74af4b29c10f9cb0956236dfb01c40ca076fb6c37253234db,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:49:42.946345 kubelet[2339]: W0317 18:49:42.946288 2339 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.7-a-b312ad98ee&limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:42.947998 kubelet[2339]: E0317 18:49:42.946353 2339 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get "https://10.200.8.36:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-3510.3.7-a-b312ad98ee&limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:43.019659 env[1524]: time="2025-03-17T18:49:43.017323853Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:49:43.019659 env[1524]: time="2025-03-17T18:49:43.017407055Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:49:43.019659 env[1524]: time="2025-03-17T18:49:43.017422155Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:49:43.019659 env[1524]: time="2025-03-17T18:49:43.017568357Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/8c0b24e22a427858d9105aa20a3d37b6e331b7da5a13e8fa1537c695bee597e6 pid=2378 runtime=io.containerd.runc.v2 Mar 17 18:49:43.024043 env[1524]: time="2025-03-17T18:49:43.023979968Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:49:43.024166 env[1524]: time="2025-03-17T18:49:43.024062770Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:49:43.024166 env[1524]: time="2025-03-17T18:49:43.024091870Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:49:43.024303 env[1524]: time="2025-03-17T18:49:43.024268873Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/4b33d51540d0ac272ba6c4ee8f4244c8c10cf689a1e5f0f2216d29d1e5bb8787 pid=2397 runtime=io.containerd.runc.v2 Mar 17 18:49:43.027405 kubelet[2339]: W0317 18:49:43.027155 2339 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:43.027405 kubelet[2339]: E0317 18:49:43.027202 2339 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get "https://10.200.8.36:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:43.030254 env[1524]: time="2025-03-17T18:49:43.030193576Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:49:43.030980 env[1524]: time="2025-03-17T18:49:43.030937889Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:49:43.031133 env[1524]: time="2025-03-17T18:49:43.031108692Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:49:43.032470 env[1524]: time="2025-03-17T18:49:43.032421715Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/a1371289b974c2c3bbc8ccd471b19b81035d66a269a9c4151e41e85a352397dd pid=2409 runtime=io.containerd.runc.v2 Mar 17 18:49:43.125597 env[1524]: time="2025-03-17T18:49:43.125543526Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-3510.3.7-a-b312ad98ee,Uid:c0366fd132b6280150276f513e236bc5,Namespace:kube-system,Attempt:0,} returns sandbox id \"8c0b24e22a427858d9105aa20a3d37b6e331b7da5a13e8fa1537c695bee597e6\"" Mar 17 18:49:43.131634 env[1524]: time="2025-03-17T18:49:43.131598431Z" level=info msg="CreateContainer within sandbox \"8c0b24e22a427858d9105aa20a3d37b6e331b7da5a13e8fa1537c695bee597e6\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Mar 17 18:49:43.148910 env[1524]: time="2025-03-17T18:49:43.148859230Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-3510.3.7-a-b312ad98ee,Uid:d04837916cdbc90c12e369201ae01ea9,Namespace:kube-system,Attempt:0,} returns sandbox id \"a1371289b974c2c3bbc8ccd471b19b81035d66a269a9c4151e41e85a352397dd\"" Mar 17 18:49:43.152719 env[1524]: time="2025-03-17T18:49:43.152682196Z" level=info msg="CreateContainer within sandbox \"a1371289b974c2c3bbc8ccd471b19b81035d66a269a9c4151e41e85a352397dd\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Mar 17 18:49:43.157211 env[1524]: time="2025-03-17T18:49:43.157163374Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-3510.3.7-a-b312ad98ee,Uid:1ced6517b82b8186f353c59555a4eab9,Namespace:kube-system,Attempt:0,} returns sandbox id \"4b33d51540d0ac272ba6c4ee8f4244c8c10cf689a1e5f0f2216d29d1e5bb8787\"" Mar 17 18:49:43.160114 env[1524]: time="2025-03-17T18:49:43.160089524Z" level=info msg="CreateContainer within sandbox 
\"4b33d51540d0ac272ba6c4ee8f4244c8c10cf689a1e5f0f2216d29d1e5bb8787\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Mar 17 18:49:43.194154 env[1524]: time="2025-03-17T18:49:43.194040312Z" level=info msg="CreateContainer within sandbox \"8c0b24e22a427858d9105aa20a3d37b6e331b7da5a13e8fa1537c695bee597e6\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5be4ce917b16ed15c4477ccc5430af884a3a2a523c8bbd9cad292c0835fe548d\"" Mar 17 18:49:43.194999 env[1524]: time="2025-03-17T18:49:43.194967328Z" level=info msg="StartContainer for \"5be4ce917b16ed15c4477ccc5430af884a3a2a523c8bbd9cad292c0835fe548d\"" Mar 17 18:49:43.229853 kubelet[2339]: E0317 18:49:43.229801 2339 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.36:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-3510.3.7-a-b312ad98ee?timeout=10s\": dial tcp 10.200.8.36:6443: connect: connection refused" interval="1.6s" Mar 17 18:49:43.233481 env[1524]: time="2025-03-17T18:49:43.233434794Z" level=info msg="CreateContainer within sandbox \"4b33d51540d0ac272ba6c4ee8f4244c8c10cf689a1e5f0f2216d29d1e5bb8787\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"e75132d5e2fc5bd1e21dc4469a83d49a4ad7150697d0056f4df362e623579243\"" Mar 17 18:49:43.234123 env[1524]: time="2025-03-17T18:49:43.234094505Z" level=info msg="StartContainer for \"e75132d5e2fc5bd1e21dc4469a83d49a4ad7150697d0056f4df362e623579243\"" Mar 17 18:49:43.239835 env[1524]: time="2025-03-17T18:49:43.239800704Z" level=info msg="CreateContainer within sandbox \"a1371289b974c2c3bbc8ccd471b19b81035d66a269a9c4151e41e85a352397dd\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"34bef78bc4e91afdb861df6b1990f9fd9ade9cbccba965928ee7745a9dc86583\"" Mar 17 18:49:43.240340 env[1524]: time="2025-03-17T18:49:43.240316913Z" level=info msg="StartContainer for \"34bef78bc4e91afdb861df6b1990f9fd9ade9cbccba965928ee7745a9dc86583\"" Mar 17 18:49:43.250110 kubelet[2339]: W0317 18:49:43.250003 2339 reflector.go:547] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:43.250110 kubelet[2339]: E0317 18:49:43.250075 2339 reflector.go:150] k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get "https://10.200.8.36:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.36:6443: connect: connection refused Mar 17 18:49:43.300974 env[1524]: time="2025-03-17T18:49:43.300921962Z" level=info msg="StartContainer for \"5be4ce917b16ed15c4477ccc5430af884a3a2a523c8bbd9cad292c0835fe548d\" returns successfully" Mar 17 18:49:43.346187 kubelet[2339]: I0317 18:49:43.345719 2339 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:43.346187 kubelet[2339]: E0317 18:49:43.346148 2339 kubelet_node_status.go:96] "Unable to register node with API server" err="Post \"https://10.200.8.36:6443/api/v1/nodes\": dial tcp 10.200.8.36:6443: connect: connection refused" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:43.360203 env[1524]: time="2025-03-17T18:49:43.360161087Z" level=info msg="StartContainer for \"e75132d5e2fc5bd1e21dc4469a83d49a4ad7150697d0056f4df362e623579243\" returns successfully" Mar 17 18:49:43.408578 env[1524]: time="2025-03-17T18:49:43.408522024Z" 
level=info msg="StartContainer for \"34bef78bc4e91afdb861df6b1990f9fd9ade9cbccba965928ee7745a9dc86583\" returns successfully" Mar 17 18:49:44.948029 kubelet[2339]: I0317 18:49:44.947993 2339 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:45.135091 kubelet[2339]: E0317 18:49:45.135042 2339 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-3510.3.7-a-b312ad98ee\" not found" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:45.380409 kubelet[2339]: I0317 18:49:45.380373 2339 kubelet_node_status.go:76] "Successfully registered node" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:45.796515 kubelet[2339]: I0317 18:49:45.796476 2339 apiserver.go:52] "Watching apiserver" Mar 17 18:49:45.826095 kubelet[2339]: I0317 18:49:45.826060 2339 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 18:49:47.198788 systemd[1]: Reloading. Mar 17 18:49:47.296753 /usr/lib/systemd/system-generators/torcx-generator[2630]: time="2025-03-17T18:49:47Z" level=debug msg="common configuration parsed" base_dir=/var/lib/torcx/ conf_dir=/etc/torcx/ run_dir=/run/torcx/ store_paths="[/usr/share/torcx/store /usr/share/oem/torcx/store/3510.3.7 /usr/share/oem/torcx/store /var/lib/torcx/store/3510.3.7 /var/lib/torcx/store]" Mar 17 18:49:47.297255 /usr/lib/systemd/system-generators/torcx-generator[2630]: time="2025-03-17T18:49:47Z" level=info msg="torcx already run" Mar 17 18:49:47.380252 systemd[1]: /usr/lib/systemd/system/locksmithd.service:8: Unit uses CPUShares=; please use CPUWeight= instead. Support for CPUShares= will be removed soon. Mar 17 18:49:47.380274 systemd[1]: /usr/lib/systemd/system/locksmithd.service:9: Unit uses MemoryLimit=; please use MemoryMax= instead. Support for MemoryLimit= will be removed soon. Mar 17 18:49:47.397597 systemd[1]: /run/systemd/system/docker.socket:8: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Mar 17 18:49:47.494401 kubelet[2339]: E0317 18:49:47.494074 2339 event.go:319] "Unable to write event (broadcaster is shut down)" event="&Event{ObjectMeta:{ci-3510.3.7-a-b312ad98ee.182dabab00d32941 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-3510.3.7-a-b312ad98ee,UID:ci-3510.3.7-a-b312ad98ee,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-3510.3.7-a-b312ad98ee,},FirstTimestamp:2025-03-17 18:49:41.803845953 +0000 UTC m=+0.745247443,LastTimestamp:2025-03-17 18:49:41.803845953 +0000 UTC m=+0.745247443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-3510.3.7-a-b312ad98ee,}" Mar 17 18:49:47.496970 systemd[1]: Stopping kubelet.service... Mar 17 18:49:47.526063 kernel: kauditd_printk_skb: 43 callbacks suppressed Mar 17 18:49:47.526170 kernel: audit: type=1131 audit(1742237387.511:229): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:47.511000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:49:47.512160 systemd[1]: kubelet.service: Deactivated successfully. Mar 17 18:49:47.512529 systemd[1]: Stopped kubelet.service. Mar 17 18:49:47.519027 systemd[1]: Starting kubelet.service... Mar 17 18:49:47.737000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:47.737376 systemd[1]: Started kubelet.service. Mar 17 18:49:47.753722 kernel: audit: type=1130 audit(1742237387.737:230): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:48.171217 kubelet[2704]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:49:48.171217 kubelet[2704]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 17 18:49:48.171217 kubelet[2704]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 17 18:49:48.171217 kubelet[2704]: I0317 18:49:48.171078 2704 server.go:205] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 17 18:49:48.176100 kubelet[2704]: I0317 18:49:48.176068 2704 server.go:484] "Kubelet version" kubeletVersion="v1.30.1" Mar 17 18:49:48.176100 kubelet[2704]: I0317 18:49:48.176095 2704 server.go:486] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 17 18:49:48.177782 kubelet[2704]: I0317 18:49:48.176928 2704 server.go:927] "Client rotation is on, will bootstrap in background" Mar 17 18:49:48.180630 kubelet[2704]: I0317 18:49:48.180597 2704 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 17 18:49:48.183508 kubelet[2704]: I0317 18:49:48.183492 2704 dynamic_cafile_content.go:157] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Mar 17 18:49:48.199771 kubelet[2704]: I0317 18:49:48.199747 2704 server.go:742] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Mar 17 18:49:48.200348 kubelet[2704]: I0317 18:49:48.200312 2704 container_manager_linux.go:265] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 17 18:49:48.200524 kubelet[2704]: I0317 18:49:48.200344 2704 container_manager_linux.go:270] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-3510.3.7-a-b312ad98ee","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"cgroupfs","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null} Mar 17 18:49:48.200650 kubelet[2704]: I0317 18:49:48.200543 2704 topology_manager.go:138] "Creating topology manager with none policy" Mar 17 18:49:48.200650 kubelet[2704]: I0317 18:49:48.200558 2704 container_manager_linux.go:301] "Creating device plugin manager" Mar 17 18:49:48.200650 kubelet[2704]: I0317 18:49:48.200613 2704 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:49:48.200815 kubelet[2704]: I0317 18:49:48.200716 2704 kubelet.go:400] "Attempting to sync node with API server" Mar 17 18:49:48.200815 kubelet[2704]: I0317 18:49:48.200732 2704 kubelet.go:301] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 17 18:49:48.200815 kubelet[2704]: I0317 18:49:48.200760 2704 kubelet.go:312] "Adding apiserver pod source" Mar 17 18:49:48.200815 kubelet[2704]: I0317 18:49:48.200782 2704 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 17 18:49:48.210052 kubelet[2704]: I0317 18:49:48.207093 2704 kuberuntime_manager.go:261] "Container runtime initialized" containerRuntime="containerd" version="1.6.16" apiVersion="v1" Mar 17 18:49:48.210052 kubelet[2704]: I0317 18:49:48.207273 2704 kubelet.go:815] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 17 18:49:48.210052 kubelet[2704]: I0317 18:49:48.207780 2704 server.go:1264] "Started kubelet" Mar 17 18:49:48.225196 kernel: audit: type=1400 audit(1742237388.209:231): avc: denied { mac_admin } for pid=2704 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:48.209000 audit[2704]: AVC avc: denied { mac_admin } for pid=2704 comm="kubelet" 
capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.210105 2704 kubelet.go:1419] "Unprivileged containerized plugins might not work, could not set selinux context on plugin registration dir" path="/var/lib/kubelet/plugins_registry" err="setxattr /var/lib/kubelet/plugins_registry: invalid argument" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.210144 2704 kubelet.go:1423] "Unprivileged containerized plugins might not work, could not set selinux context on plugins dir" path="/var/lib/kubelet/plugins" err="setxattr /var/lib/kubelet/plugins: invalid argument" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.210172 2704 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.214883 2704 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.215688 2704 server.go:455] "Adding debug handlers to kubelet server" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.216389 2704 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.216543 2704 server.go:227] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.217913 2704 volume_manager.go:291] "Starting Kubelet Volume Manager" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.219677 2704 desired_state_of_world_populator.go:149] "Desired state populator starts to run" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.219788 2704 reconciler.go:26] "Reconciler: start to sync state" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.221384 2704 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.222525 2704 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.222550 2704 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 17 18:49:48.225375 kubelet[2704]: I0317 18:49:48.222566 2704 kubelet.go:2337] "Starting kubelet main sync loop" Mar 17 18:49:48.225375 kubelet[2704]: E0317 18:49:48.222606 2704 kubelet.go:2361] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 17 18:49:48.209000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:49:48.246735 kernel: audit: type=1401 audit(1742237388.209:231): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:49:48.246787 kubelet[2704]: I0317 18:49:48.236450 2704 factory.go:221] Registration of the systemd container factory successfully Mar 17 18:49:48.246787 kubelet[2704]: I0317 18:49:48.236533 2704 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Mar 17 18:49:48.246787 kubelet[2704]: I0317 18:49:48.239169 2704 factory.go:221] Registration of the containerd container factory successfully Mar 17 18:49:48.209000 audit[2704]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000bc0900 a1=c000b61248 a2=c000bc08d0 a3=25 items=0 ppid=1 pid=2704 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:48.274757 kernel: audit: type=1300 audit(1742237388.209:231): arch=c000003e syscall=188 success=no exit=-22 a0=c000bc0900 a1=c000b61248 a2=c000bc08d0 a3=25 items=0 ppid=1 pid=2704 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:48.209000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:49:48.290877 kubelet[2704]: E0317 18:49:48.278445 2704 kubelet.go:1467] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Mar 17 18:49:48.293744 kernel: audit: type=1327 audit(1742237388.209:231): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:49:48.209000 audit[2704]: AVC avc: denied { mac_admin } for pid=2704 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:48.312520 kernel: audit: type=1400 audit(1742237388.209:232): avc: denied { mac_admin } for pid=2704 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:48.320062 kernel: audit: type=1401 audit(1742237388.209:232): op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:49:48.209000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:49:48.209000 audit[2704]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c000b6dc40 a1=c000b61260 a2=c000bc0990 a3=25 items=0 ppid=1 pid=2704 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:48.322709 kubelet[2704]: I0317 18:49:48.322690 2704 kubelet_node_status.go:73] "Attempting to register node" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.322927 kubelet[2704]: E0317 18:49:48.322914 2704 kubelet.go:2361] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Mar 17 18:49:48.340703 kernel: audit: type=1300 audit(1742237388.209:232): arch=c000003e syscall=188 success=no exit=-22 a0=c000b6dc40 a1=c000b61260 a2=c000bc0990 a3=25 items=0 ppid=1 pid=2704 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:48.209000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:49:48.359258 kubelet[2704]: I0317 18:49:48.355479 2704 cpu_manager.go:214] "Starting CPU manager" policy="none" Mar 17 18:49:48.359258 kubelet[2704]: I0317 18:49:48.355495 2704 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Mar 17 18:49:48.359258 kubelet[2704]: I0317 18:49:48.355526 2704 state_mem.go:36] "Initialized new in-memory state store" Mar 17 18:49:48.359258 kubelet[2704]: I0317 18:49:48.355713 2704 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 17 18:49:48.359258 kubelet[2704]: I0317 18:49:48.355727 2704 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 17 18:49:48.359258 kubelet[2704]: I0317 18:49:48.355750 2704 policy_none.go:49] "None policy: Start" Mar 17 18:49:48.359258 kubelet[2704]: I0317 18:49:48.355891 2704 kubelet_node_status.go:112] "Node was previously registered" node="ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.359258 kubelet[2704]: I0317 18:49:48.355973 2704 kubelet_node_status.go:76] "Successfully registered node" node="ci-3510.3.7-a-b312ad98ee" Mar 17 
18:49:48.359849 kernel: audit: type=1327 audit(1742237388.209:232): proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:49:48.360045 kubelet[2704]: I0317 18:49:48.360033 2704 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 17 18:49:48.360144 kubelet[2704]: I0317 18:49:48.360136 2704 state_mem.go:35] "Initializing new in-memory state store" Mar 17 18:49:48.360359 kubelet[2704]: I0317 18:49:48.360348 2704 state_mem.go:75] "Updated machine memory state" Mar 17 18:49:48.361607 kubelet[2704]: I0317 18:49:48.361589 2704 manager.go:479] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 17 18:49:48.361789 kubelet[2704]: I0317 18:49:48.361770 2704 server.go:88] "Unprivileged containerized plugins might not work. Could not set selinux context on socket dir" path="/var/lib/kubelet/device-plugins/" err="setxattr /var/lib/kubelet/device-plugins/: invalid argument" Mar 17 18:49:48.361000 audit[2704]: AVC avc: denied { mac_admin } for pid=2704 comm="kubelet" capability=33 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:49:48.361000 audit: SELINUX_ERR op=setxattr invalid_context="system_u:object_r:container_file_t:s0" Mar 17 18:49:48.361000 audit[2704]: SYSCALL arch=c000003e syscall=188 success=no exit=-22 a0=c001258ae0 a1=c0012056c8 a2=c001258ab0 a3=25 items=0 ppid=1 pid=2704 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="kubelet" exe="/usr/bin/kubelet" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:49:48.361000 audit: PROCTITLE proctitle=2F7573722F62696E2F6B7562656C6574002D2D626F6F7473747261702D6B756265636F6E6669673D2F6574632F6B756265726E657465732F626F6F7473747261702D6B7562656C65742E636F6E66002D2D6B756265636F6E6669673D2F6574632F6B756265726E657465732F6B7562656C65742E636F6E66002D2D636F6E6669 Mar 17 18:49:48.362216 kubelet[2704]: I0317 18:49:48.361994 2704 container_log_manager.go:186] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 17 18:49:48.362216 kubelet[2704]: I0317 18:49:48.362096 2704 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 17 18:49:48.523896 kubelet[2704]: I0317 18:49:48.523401 2704 topology_manager.go:215] "Topology Admit Handler" podUID="c0366fd132b6280150276f513e236bc5" podNamespace="kube-system" podName="kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.523896 kubelet[2704]: I0317 18:49:48.523517 2704 topology_manager.go:215] "Topology Admit Handler" podUID="d04837916cdbc90c12e369201ae01ea9" podNamespace="kube-system" podName="kube-scheduler-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.523896 kubelet[2704]: I0317 18:49:48.523584 2704 topology_manager.go:215] "Topology Admit Handler" podUID="1ced6517b82b8186f353c59555a4eab9" podNamespace="kube-system" podName="kube-apiserver-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.544072 kubelet[2704]: W0317 18:49:48.543998 2704 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 18:49:48.547594 kubelet[2704]: W0317 18:49:48.547176 2704 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS 
label is recommended: [must not contain dots] Mar 17 18:49:48.547594 kubelet[2704]: W0317 18:49:48.547347 2704 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 18:49:48.623005 kubelet[2704]: I0317 18:49:48.622967 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/1ced6517b82b8186f353c59555a4eab9-usr-share-ca-certificates\") pod \"kube-apiserver-ci-3510.3.7-a-b312ad98ee\" (UID: \"1ced6517b82b8186f353c59555a4eab9\") " pod="kube-system/kube-apiserver-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.623005 kubelet[2704]: I0317 18:49:48.623009 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-ca-certs\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.623238 kubelet[2704]: I0317 18:49:48.623033 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-flexvolume-dir\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.623238 kubelet[2704]: I0317 18:49:48.623054 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-k8s-certs\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.623238 kubelet[2704]: I0317 18:49:48.623082 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-kubeconfig\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.623238 kubelet[2704]: I0317 18:49:48.623103 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/d04837916cdbc90c12e369201ae01ea9-kubeconfig\") pod \"kube-scheduler-ci-3510.3.7-a-b312ad98ee\" (UID: \"d04837916cdbc90c12e369201ae01ea9\") " pod="kube-system/kube-scheduler-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.623238 kubelet[2704]: I0317 18:49:48.623126 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/1ced6517b82b8186f353c59555a4eab9-ca-certs\") pod \"kube-apiserver-ci-3510.3.7-a-b312ad98ee\" (UID: \"1ced6517b82b8186f353c59555a4eab9\") " pod="kube-system/kube-apiserver-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.623372 kubelet[2704]: I0317 18:49:48.623145 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/1ced6517b82b8186f353c59555a4eab9-k8s-certs\") pod 
\"kube-apiserver-ci-3510.3.7-a-b312ad98ee\" (UID: \"1ced6517b82b8186f353c59555a4eab9\") " pod="kube-system/kube-apiserver-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:48.623372 kubelet[2704]: I0317 18:49:48.623169 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c0366fd132b6280150276f513e236bc5-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-3510.3.7-a-b312ad98ee\" (UID: \"c0366fd132b6280150276f513e236bc5\") " pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:49.202398 kubelet[2704]: I0317 18:49:49.202358 2704 apiserver.go:52] "Watching apiserver" Mar 17 18:49:49.220662 kubelet[2704]: I0317 18:49:49.220623 2704 desired_state_of_world_populator.go:157] "Finished populating initial desired state of world" Mar 17 18:49:49.312905 kubelet[2704]: W0317 18:49:49.312814 2704 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] Mar 17 18:49:49.312905 kubelet[2704]: E0317 18:49:49.312899 2704 kubelet.go:1928] "Failed creating a mirror pod for" err="pods \"kube-apiserver-ci-3510.3.7-a-b312ad98ee\" already exists" pod="kube-system/kube-apiserver-ci-3510.3.7-a-b312ad98ee" Mar 17 18:49:49.378841 kubelet[2704]: I0317 18:49:49.378780 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-3510.3.7-a-b312ad98ee" podStartSLOduration=1.378759074 podStartE2EDuration="1.378759074s" podCreationTimestamp="2025-03-17 18:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:49:49.344015752 +0000 UTC m=+1.592115950" watchObservedRunningTime="2025-03-17 18:49:49.378759074 +0000 UTC m=+1.626859172" Mar 17 18:49:49.404488 kubelet[2704]: I0317 18:49:49.404412 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-3510.3.7-a-b312ad98ee" podStartSLOduration=1.404387359 podStartE2EDuration="1.404387359s" podCreationTimestamp="2025-03-17 18:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:49:49.37853767 +0000 UTC m=+1.626637768" watchObservedRunningTime="2025-03-17 18:49:49.404387359 +0000 UTC m=+1.652487557" Mar 17 18:49:49.436133 kubelet[2704]: I0317 18:49:49.436069 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-3510.3.7-a-b312ad98ee" podStartSLOduration=1.436048234 podStartE2EDuration="1.436048234s" podCreationTimestamp="2025-03-17 18:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:49:49.405712678 +0000 UTC m=+1.653812776" watchObservedRunningTime="2025-03-17 18:49:49.436048234 +0000 UTC m=+1.684148432" Mar 17 18:49:53.327869 sudo[1971]: pam_unix(sudo:session): session closed for user root Mar 17 18:49:53.327000 audit[1971]: USER_END pid=1971 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Mar 17 18:49:53.332693 kernel: kauditd_printk_skb: 4 callbacks suppressed Mar 17 18:49:53.332797 kernel: audit: type=1106 audit(1742237393.327:234): pid=1971 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:53.328000 audit[1971]: CRED_DISP pid=1971 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:53.358494 kernel: audit: type=1104 audit(1742237393.328:235): pid=1971 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Mar 17 18:49:53.437114 sshd[1967]: pam_unix(sshd:session): session closed for user core Mar 17 18:49:53.439000 audit[1967]: USER_END pid=1967 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:53.442548 systemd-logind[1500]: Session 9 logged out. Waiting for processes to exit. Mar 17 18:49:53.444063 systemd[1]: sshd@6-10.200.8.36:22-10.200.16.10:50942.service: Deactivated successfully. Mar 17 18:49:53.444858 systemd[1]: session-9.scope: Deactivated successfully. Mar 17 18:49:53.446262 systemd-logind[1500]: Removed session 9. Mar 17 18:49:53.439000 audit[1967]: CRED_DISP pid=1967 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:53.471890 kernel: audit: type=1106 audit(1742237393.439:236): pid=1967 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:53.471972 kernel: audit: type=1104 audit(1742237393.439:237): pid=1967 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:49:53.472001 kernel: audit: type=1131 audit(1742237393.439:238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.200.8.36:22-10.200.16.10:50942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:49:53.439000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.200.8.36:22-10.200.16.10:50942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:50:03.613308 kubelet[2704]: I0317 18:50:03.613113 2704 kuberuntime_manager.go:1523] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Mar 17 18:50:03.614162 env[1524]: time="2025-03-17T18:50:03.614120140Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." 
Mar 17 18:50:03.615414 kubelet[2704]: I0317 18:50:03.614830 2704 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Mar 17 18:50:04.145224 kubelet[2704]: I0317 18:50:04.145182 2704 topology_manager.go:215] "Topology Admit Handler" podUID="beabd790-26be-4ddb-aa41-8a5244e655da" podNamespace="kube-system" podName="kube-proxy-dlwwd" Mar 17 18:50:04.221973 kubelet[2704]: I0317 18:50:04.221934 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/beabd790-26be-4ddb-aa41-8a5244e655da-kube-proxy\") pod \"kube-proxy-dlwwd\" (UID: \"beabd790-26be-4ddb-aa41-8a5244e655da\") " pod="kube-system/kube-proxy-dlwwd" Mar 17 18:50:04.222175 kubelet[2704]: I0317 18:50:04.221981 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/beabd790-26be-4ddb-aa41-8a5244e655da-xtables-lock\") pod \"kube-proxy-dlwwd\" (UID: \"beabd790-26be-4ddb-aa41-8a5244e655da\") " pod="kube-system/kube-proxy-dlwwd" Mar 17 18:50:04.222175 kubelet[2704]: I0317 18:50:04.222021 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/beabd790-26be-4ddb-aa41-8a5244e655da-lib-modules\") pod \"kube-proxy-dlwwd\" (UID: \"beabd790-26be-4ddb-aa41-8a5244e655da\") " pod="kube-system/kube-proxy-dlwwd" Mar 17 18:50:04.222175 kubelet[2704]: I0317 18:50:04.222046 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bp5p\" (UniqueName: \"kubernetes.io/projected/beabd790-26be-4ddb-aa41-8a5244e655da-kube-api-access-5bp5p\") pod \"kube-proxy-dlwwd\" (UID: \"beabd790-26be-4ddb-aa41-8a5244e655da\") " pod="kube-system/kube-proxy-dlwwd" Mar 17 18:50:04.336538 kubelet[2704]: E0317 18:50:04.336506 2704 projected.go:294] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Mar 17 18:50:04.336538 kubelet[2704]: E0317 18:50:04.336536 2704 projected.go:200] Error preparing data for projected volume kube-api-access-5bp5p for pod kube-system/kube-proxy-dlwwd: configmap "kube-root-ca.crt" not found Mar 17 18:50:04.336803 kubelet[2704]: E0317 18:50:04.336608 2704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/beabd790-26be-4ddb-aa41-8a5244e655da-kube-api-access-5bp5p podName:beabd790-26be-4ddb-aa41-8a5244e655da nodeName:}" failed. No retries permitted until 2025-03-17 18:50:04.836586398 +0000 UTC m=+17.084686496 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-5bp5p" (UniqueName: "kubernetes.io/projected/beabd790-26be-4ddb-aa41-8a5244e655da-kube-api-access-5bp5p") pod "kube-proxy-dlwwd" (UID: "beabd790-26be-4ddb-aa41-8a5244e655da") : configmap "kube-root-ca.crt" not found Mar 17 18:50:04.663488 kubelet[2704]: I0317 18:50:04.663427 2704 topology_manager.go:215] "Topology Admit Handler" podUID="61dcfcc3-4100-401a-a1cb-0a001eb5776e" podNamespace="tigera-operator" podName="tigera-operator-7bc55997bb-4cnnf" Mar 17 18:50:04.725561 kubelet[2704]: I0317 18:50:04.725503 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/61dcfcc3-4100-401a-a1cb-0a001eb5776e-var-lib-calico\") pod \"tigera-operator-7bc55997bb-4cnnf\" (UID: \"61dcfcc3-4100-401a-a1cb-0a001eb5776e\") " pod="tigera-operator/tigera-operator-7bc55997bb-4cnnf" Mar 17 18:50:04.725561 kubelet[2704]: I0317 18:50:04.725559 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5crpm\" (UniqueName: \"kubernetes.io/projected/61dcfcc3-4100-401a-a1cb-0a001eb5776e-kube-api-access-5crpm\") pod \"tigera-operator-7bc55997bb-4cnnf\" (UID: \"61dcfcc3-4100-401a-a1cb-0a001eb5776e\") " pod="tigera-operator/tigera-operator-7bc55997bb-4cnnf" Mar 17 18:50:04.968881 env[1524]: time="2025-03-17T18:50:04.968570185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-4cnnf,Uid:61dcfcc3-4100-401a-a1cb-0a001eb5776e,Namespace:tigera-operator,Attempt:0,}" Mar 17 18:50:05.007605 env[1524]: time="2025-03-17T18:50:05.007367007Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:05.007605 env[1524]: time="2025-03-17T18:50:05.007415108Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:05.007605 env[1524]: time="2025-03-17T18:50:05.007432108Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:05.007840 env[1524]: time="2025-03-17T18:50:05.007650410Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/0ccd01856585c11ca679603a0a671346402501ae7c2d25d4afa8a93060549383 pid=2786 runtime=io.containerd.runc.v2 Mar 17 18:50:05.052182 env[1524]: time="2025-03-17T18:50:05.052137986Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dlwwd,Uid:beabd790-26be-4ddb-aa41-8a5244e655da,Namespace:kube-system,Attempt:0,}" Mar 17 18:50:05.069217 env[1524]: time="2025-03-17T18:50:05.069169368Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7bc55997bb-4cnnf,Uid:61dcfcc3-4100-401a-a1cb-0a001eb5776e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"0ccd01856585c11ca679603a0a671346402501ae7c2d25d4afa8a93060549383\"" Mar 17 18:50:05.071181 env[1524]: time="2025-03-17T18:50:05.071150489Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\"" Mar 17 18:50:05.090363 env[1524]: time="2025-03-17T18:50:05.090288293Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:05.090363 env[1524]: time="2025-03-17T18:50:05.090333394Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:05.090854 env[1524]: time="2025-03-17T18:50:05.090348994Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:05.091127 env[1524]: time="2025-03-17T18:50:05.091063202Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/e7dd41698e03a0bf0fdbedfcbbd9f8aaa8c09d1cdaa1ce62aff9b8c6dd0bb691 pid=2832 runtime=io.containerd.runc.v2 Mar 17 18:50:05.126222 env[1524]: time="2025-03-17T18:50:05.126168177Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dlwwd,Uid:beabd790-26be-4ddb-aa41-8a5244e655da,Namespace:kube-system,Attempt:0,} returns sandbox id \"e7dd41698e03a0bf0fdbedfcbbd9f8aaa8c09d1cdaa1ce62aff9b8c6dd0bb691\"" Mar 17 18:50:05.129810 env[1524]: time="2025-03-17T18:50:05.129773615Z" level=info msg="CreateContainer within sandbox \"e7dd41698e03a0bf0fdbedfcbbd9f8aaa8c09d1cdaa1ce62aff9b8c6dd0bb691\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Mar 17 18:50:05.177341 env[1524]: time="2025-03-17T18:50:05.177292423Z" level=info msg="CreateContainer within sandbox \"e7dd41698e03a0bf0fdbedfcbbd9f8aaa8c09d1cdaa1ce62aff9b8c6dd0bb691\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"2226535077ecb8ec0cd7d5b5eab59b64499163afbf03c4335cc28fb41b753eea\"" Mar 17 18:50:05.179336 env[1524]: time="2025-03-17T18:50:05.177904530Z" level=info msg="StartContainer for \"2226535077ecb8ec0cd7d5b5eab59b64499163afbf03c4335cc28fb41b753eea\"" Mar 17 18:50:05.228017 env[1524]: time="2025-03-17T18:50:05.227383458Z" level=info msg="StartContainer for \"2226535077ecb8ec0cd7d5b5eab59b64499163afbf03c4335cc28fb41b753eea\" returns successfully" Mar 17 18:50:05.294000 audit[2926]: NETFILTER_CFG table=mangle:41 family=2 entries=1 op=nft_register_chain pid=2926 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.306697 kernel: audit: type=1325 audit(1742237405.294:239): table=mangle:41 family=2 entries=1 op=nft_register_chain pid=2926 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.294000 audit[2926]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd8fe91500 a2=0 a3=7ffd8fe914ec items=0 ppid=2885 pid=2926 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.294000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:50:05.335483 kernel: audit: type=1300 audit(1742237405.294:239): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd8fe91500 a2=0 a3=7ffd8fe914ec items=0 ppid=2885 pid=2926 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.335598 kernel: audit: type=1327 audit(1742237405.294:239): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:50:05.303000 audit[2927]: NETFILTER_CFG table=mangle:42 family=10 entries=1 op=nft_register_chain pid=2927 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.351397 kernel: audit: type=1325 audit(1742237405.303:240): table=mangle:42 family=10 entries=1 
op=nft_register_chain pid=2927 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.351503 kernel: audit: type=1300 audit(1742237405.303:240): arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff34a06cb0 a2=0 a3=7fff34a06c9c items=0 ppid=2885 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.303000 audit[2927]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7fff34a06cb0 a2=0 a3=7fff34a06c9c items=0 ppid=2885 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.303000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:50:05.310000 audit[2929]: NETFILTER_CFG table=nat:43 family=2 entries=1 op=nft_register_chain pid=2929 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.381679 kernel: audit: type=1327 audit(1742237405.303:240): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Mar 17 18:50:05.381805 kernel: audit: type=1325 audit(1742237405.310:241): table=nat:43 family=2 entries=1 op=nft_register_chain pid=2929 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.310000 audit[2929]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdb841ba10 a2=0 a3=7ffdb841b9fc items=0 ppid=2885 pid=2929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.310000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 18:50:05.410053 kernel: audit: type=1300 audit(1742237405.310:241): arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffdb841ba10 a2=0 a3=7ffdb841b9fc items=0 ppid=2885 pid=2929 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.410142 kernel: audit: type=1327 audit(1742237405.310:241): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 18:50:05.311000 audit[2928]: NETFILTER_CFG table=nat:44 family=10 entries=1 op=nft_register_chain pid=2928 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.311000 audit[2928]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffc4c2f0b40 a2=0 a3=7ffc4c2f0b2c items=0 ppid=2885 pid=2928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.311000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Mar 17 18:50:05.323000 audit[2930]: NETFILTER_CFG table=filter:45 family=10 entries=1 op=nft_register_chain pid=2930 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.323000 audit[2930]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 
a1=7ffe1e908ea0 a2=0 a3=7ffe1e908e8c items=0 ppid=2885 pid=2930 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.323000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 18:50:05.421717 kernel: audit: type=1325 audit(1742237405.311:242): table=nat:44 family=10 entries=1 op=nft_register_chain pid=2928 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.325000 audit[2931]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_chain pid=2931 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.325000 audit[2931]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffd045a09e0 a2=0 a3=7ffd045a09cc items=0 ppid=2885 pid=2931 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.325000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Mar 17 18:50:05.398000 audit[2932]: NETFILTER_CFG table=filter:47 family=2 entries=1 op=nft_register_chain pid=2932 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.398000 audit[2932]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7ffd7b4d3580 a2=0 a3=7ffd7b4d356c items=0 ppid=2885 pid=2932 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.398000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Mar 17 18:50:05.398000 audit[2934]: NETFILTER_CFG table=filter:48 family=2 entries=1 op=nft_register_rule pid=2934 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.398000 audit[2934]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffead612440 a2=0 a3=7ffead61242c items=0 ppid=2885 pid=2934 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.398000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Mar 17 18:50:05.403000 audit[2937]: NETFILTER_CFG table=filter:49 family=2 entries=1 op=nft_register_rule pid=2937 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.403000 audit[2937]: SYSCALL arch=c000003e syscall=46 success=yes exit=752 a0=3 a1=7ffff16947a0 a2=0 a3=7ffff169478c items=0 ppid=2885 pid=2937 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.403000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Mar 17 18:50:05.409000 audit[2938]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_chain pid=2938 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.409000 audit[2938]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffe5f05c5a0 a2=0 a3=7ffe5f05c58c items=0 ppid=2885 pid=2938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.409000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Mar 17 18:50:05.409000 audit[2940]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_rule pid=2940 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.409000 audit[2940]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe0efef510 a2=0 a3=7ffe0efef4fc items=0 ppid=2885 pid=2940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.409000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Mar 17 18:50:05.409000 audit[2941]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2941 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.409000 audit[2941]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffeb8dd02d0 a2=0 a3=7ffeb8dd02bc items=0 ppid=2885 pid=2941 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.409000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Mar 17 18:50:05.415000 audit[2943]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_rule pid=2943 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.415000 audit[2943]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffdc1d413f0 a2=0 a3=7ffdc1d413dc items=0 ppid=2885 pid=2943 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.415000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Mar 17 18:50:05.422000 audit[2946]: NETFILTER_CFG table=filter:54 family=2 entries=1 op=nft_register_rule pid=2946 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.422000 audit[2946]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffcf2cc2b10 a2=0 a3=7ffcf2cc2afc items=0 
ppid=2885 pid=2946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.422000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Mar 17 18:50:05.431000 audit[2947]: NETFILTER_CFG table=filter:55 family=2 entries=1 op=nft_register_chain pid=2947 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.431000 audit[2947]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffd76072d0 a2=0 a3=7fffd76072bc items=0 ppid=2885 pid=2947 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.431000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Mar 17 18:50:05.434000 audit[2949]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_rule pid=2949 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.434000 audit[2949]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7fff3f254c80 a2=0 a3=7fff3f254c6c items=0 ppid=2885 pid=2949 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.434000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Mar 17 18:50:05.435000 audit[2950]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=2950 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.435000 audit[2950]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffe0c4582d0 a2=0 a3=7ffe0c4582bc items=0 ppid=2885 pid=2950 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.435000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Mar 17 18:50:05.438000 audit[2952]: NETFILTER_CFG table=filter:58 family=2 entries=1 op=nft_register_rule pid=2952 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.438000 audit[2952]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd2304f430 a2=0 a3=7ffd2304f41c items=0 ppid=2885 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.438000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 18:50:05.441000 audit[2955]: NETFILTER_CFG table=filter:59 
family=2 entries=1 op=nft_register_rule pid=2955 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.441000 audit[2955]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd4e0518b0 a2=0 a3=7ffd4e05189c items=0 ppid=2885 pid=2955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.441000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 18:50:05.445000 audit[2958]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_rule pid=2958 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.445000 audit[2958]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffdabc343c0 a2=0 a3=7ffdabc343ac items=0 ppid=2885 pid=2958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.445000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Mar 17 18:50:05.446000 audit[2959]: NETFILTER_CFG table=nat:61 family=2 entries=1 op=nft_register_chain pid=2959 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.446000 audit[2959]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffcaff9e3f0 a2=0 a3=7ffcaff9e3dc items=0 ppid=2885 pid=2959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.446000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Mar 17 18:50:05.449000 audit[2961]: NETFILTER_CFG table=nat:62 family=2 entries=1 op=nft_register_rule pid=2961 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.449000 audit[2961]: SYSCALL arch=c000003e syscall=46 success=yes exit=524 a0=3 a1=7ffc451cf3d0 a2=0 a3=7ffc451cf3bc items=0 ppid=2885 pid=2961 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.449000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:50:05.452000 audit[2964]: NETFILTER_CFG table=nat:63 family=2 entries=1 op=nft_register_rule pid=2964 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.452000 audit[2964]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffca32c03b0 a2=0 a3=7ffca32c039c items=0 ppid=2885 pid=2964 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 
18:50:05.452000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:50:05.453000 audit[2965]: NETFILTER_CFG table=nat:64 family=2 entries=1 op=nft_register_chain pid=2965 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.453000 audit[2965]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffd19494d60 a2=0 a3=7ffd19494d4c items=0 ppid=2885 pid=2965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.453000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Mar 17 18:50:05.456000 audit[2967]: NETFILTER_CFG table=nat:65 family=2 entries=1 op=nft_register_rule pid=2967 subj=system_u:system_r:kernel_t:s0 comm="iptables" Mar 17 18:50:05.456000 audit[2967]: SYSCALL arch=c000003e syscall=46 success=yes exit=532 a0=3 a1=7ffea7174900 a2=0 a3=7ffea71748ec items=0 ppid=2885 pid=2967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.456000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Mar 17 18:50:05.536000 audit[2973]: NETFILTER_CFG table=filter:66 family=2 entries=8 op=nft_register_rule pid=2973 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:05.536000 audit[2973]: SYSCALL arch=c000003e syscall=46 success=yes exit=5164 a0=3 a1=7ffdf5e8e8c0 a2=0 a3=7ffdf5e8e8ac items=0 ppid=2885 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.536000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:05.563000 audit[2973]: NETFILTER_CFG table=nat:67 family=2 entries=14 op=nft_register_chain pid=2973 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:05.563000 audit[2973]: SYSCALL arch=c000003e syscall=46 success=yes exit=5508 a0=3 a1=7ffdf5e8e8c0 a2=0 a3=7ffdf5e8e8ac items=0 ppid=2885 pid=2973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.563000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:05.564000 audit[2978]: NETFILTER_CFG table=filter:68 family=10 entries=1 op=nft_register_chain pid=2978 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.564000 audit[2978]: SYSCALL arch=c000003e syscall=46 success=yes exit=108 a0=3 a1=7fffd3642da0 a2=0 a3=7fffd3642d8c items=0 ppid=2885 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.564000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Mar 17 18:50:05.569000 audit[2980]: NETFILTER_CFG table=filter:69 family=10 entries=2 op=nft_register_chain pid=2980 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.569000 audit[2980]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffc00296b60 a2=0 a3=7ffc00296b4c items=0 ppid=2885 pid=2980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.569000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Mar 17 18:50:05.573000 audit[2983]: NETFILTER_CFG table=filter:70 family=10 entries=2 op=nft_register_chain pid=2983 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.573000 audit[2983]: SYSCALL arch=c000003e syscall=46 success=yes exit=836 a0=3 a1=7ffdc1eaf540 a2=0 a3=7ffdc1eaf52c items=0 ppid=2885 pid=2983 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.573000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Mar 17 18:50:05.574000 audit[2984]: NETFILTER_CFG table=filter:71 family=10 entries=1 op=nft_register_chain pid=2984 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.574000 audit[2984]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fff9a28ddd0 a2=0 a3=7fff9a28ddbc items=0 ppid=2885 pid=2984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.574000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Mar 17 18:50:05.577000 audit[2986]: NETFILTER_CFG table=filter:72 family=10 entries=1 op=nft_register_rule pid=2986 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.577000 audit[2986]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffd1291c2e0 a2=0 a3=7ffd1291c2cc items=0 ppid=2885 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.577000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Mar 17 18:50:05.578000 audit[2987]: NETFILTER_CFG table=filter:73 family=10 entries=1 op=nft_register_chain pid=2987 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Mar 17 18:50:05.578000 audit[2987]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffeb1417020 a2=0 a3=7ffeb141700c items=0 ppid=2885 pid=2987 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.578000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Mar 17 18:50:05.581000 audit[2989]: NETFILTER_CFG table=filter:74 family=10 entries=1 op=nft_register_rule pid=2989 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.581000 audit[2989]: SYSCALL arch=c000003e syscall=46 success=yes exit=744 a0=3 a1=7ffc42709f50 a2=0 a3=7ffc42709f3c items=0 ppid=2885 pid=2989 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.581000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Mar 17 18:50:05.584000 audit[2992]: NETFILTER_CFG table=filter:75 family=10 entries=2 op=nft_register_chain pid=2992 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.584000 audit[2992]: SYSCALL arch=c000003e syscall=46 success=yes exit=828 a0=3 a1=7fff98b10140 a2=0 a3=7fff98b1012c items=0 ppid=2885 pid=2992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.584000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Mar 17 18:50:05.585000 audit[2993]: NETFILTER_CFG table=filter:76 family=10 entries=1 op=nft_register_chain pid=2993 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.585000 audit[2993]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffa683bd40 a2=0 a3=7fffa683bd2c items=0 ppid=2885 pid=2993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.585000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Mar 17 18:50:05.587000 audit[2995]: NETFILTER_CFG table=filter:77 family=10 entries=1 op=nft_register_rule pid=2995 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.587000 audit[2995]: SYSCALL arch=c000003e syscall=46 success=yes exit=528 a0=3 a1=7ffe2d12f9c0 a2=0 a3=7ffe2d12f9ac items=0 ppid=2885 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.587000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Mar 17 18:50:05.589000 audit[2996]: NETFILTER_CFG table=filter:78 family=10 entries=1 op=nft_register_chain pid=2996 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.589000 audit[2996]: SYSCALL arch=c000003e syscall=46 success=yes exit=104 a0=3 a1=7ffffe14b630 a2=0 a3=7ffffe14b61c items=0 ppid=2885 pid=2996 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.589000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Mar 17 18:50:05.591000 audit[2998]: NETFILTER_CFG table=filter:79 family=10 entries=1 op=nft_register_rule pid=2998 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.591000 audit[2998]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffd0e0eea90 a2=0 a3=7ffd0e0eea7c items=0 ppid=2885 pid=2998 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.591000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Mar 17 18:50:05.595000 audit[3001]: NETFILTER_CFG table=filter:80 family=10 entries=1 op=nft_register_rule pid=3001 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.595000 audit[3001]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffce215f2e0 a2=0 a3=7ffce215f2cc items=0 ppid=2885 pid=3001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.595000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Mar 17 18:50:05.599000 audit[3004]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_rule pid=3004 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.599000 audit[3004]: SYSCALL arch=c000003e syscall=46 success=yes exit=748 a0=3 a1=7ffca4360c50 a2=0 a3=7ffca4360c3c items=0 ppid=2885 pid=3004 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.599000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Mar 17 18:50:05.600000 audit[3005]: NETFILTER_CFG table=nat:82 family=10 entries=1 op=nft_register_chain pid=3005 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Mar 17 18:50:05.600000 audit[3005]: SYSCALL arch=c000003e syscall=46 success=yes exit=96 a0=3 a1=7ffd8ca3af20 a2=0 a3=7ffd8ca3af0c items=0 ppid=2885 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.600000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Mar 17 18:50:05.602000 audit[3007]: NETFILTER_CFG table=nat:83 family=10 entries=2 op=nft_register_chain pid=3007 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.602000 audit[3007]: SYSCALL arch=c000003e syscall=46 success=yes exit=600 a0=3 a1=7ffd1656a030 a2=0 a3=7ffd1656a01c items=0 ppid=2885 pid=3007 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.602000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:50:05.605000 audit[3010]: NETFILTER_CFG table=nat:84 family=10 entries=2 op=nft_register_chain pid=3010 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.605000 audit[3010]: SYSCALL arch=c000003e syscall=46 success=yes exit=608 a0=3 a1=7ffea09caf30 a2=0 a3=7ffea09caf1c items=0 ppid=2885 pid=3010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.605000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Mar 17 18:50:05.606000 audit[3011]: NETFILTER_CFG table=nat:85 family=10 entries=1 op=nft_register_chain pid=3011 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.606000 audit[3011]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7ffca19e9220 a2=0 a3=7ffca19e920c items=0 ppid=2885 pid=3011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.606000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Mar 17 18:50:05.609000 audit[3013]: NETFILTER_CFG table=nat:86 family=10 entries=2 op=nft_register_chain pid=3013 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.609000 audit[3013]: SYSCALL arch=c000003e syscall=46 success=yes exit=612 a0=3 a1=7ffd6c94c6c0 a2=0 a3=7ffd6c94c6ac items=0 ppid=2885 pid=3013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.609000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Mar 17 18:50:05.610000 audit[3014]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_chain pid=3014 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.610000 audit[3014]: SYSCALL arch=c000003e syscall=46 success=yes exit=100 a0=3 a1=7fffb3ccb040 a2=0 a3=7fffb3ccb02c items=0 ppid=2885 pid=3014 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.610000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Mar 17 18:50:05.612000 audit[3016]: NETFILTER_CFG table=filter:88 family=10 entries=1 op=nft_register_rule pid=3016 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.612000 audit[3016]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7ffe87066550 a2=0 a3=7ffe8706653c items=0 ppid=2885 pid=3016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.612000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:50:05.615000 audit[3019]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_rule pid=3019 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Mar 17 18:50:05.615000 audit[3019]: SYSCALL arch=c000003e syscall=46 success=yes exit=228 a0=3 a1=7fff60d74f00 a2=0 a3=7fff60d74eec items=0 ppid=2885 pid=3019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.615000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Mar 17 18:50:05.618000 audit[3021]: NETFILTER_CFG table=filter:90 family=10 entries=3 op=nft_register_rule pid=3021 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Mar 17 18:50:05.618000 audit[3021]: SYSCALL arch=c000003e syscall=46 success=yes exit=2004 a0=3 a1=7ffd105aba70 a2=0 a3=7ffd105aba5c items=0 ppid=2885 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.618000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:05.619000 audit[3021]: NETFILTER_CFG table=nat:91 family=10 entries=7 op=nft_register_chain pid=3021 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Mar 17 18:50:05.619000 audit[3021]: SYSCALL arch=c000003e syscall=46 success=yes exit=2056 a0=3 a1=7ffd105aba70 a2=0 a3=7ffd105aba5c items=0 ppid=2885 pid=3021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:05.619000 
audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:05.841202 systemd[1]: run-containerd-runc-k8s.io-0ccd01856585c11ca679603a0a671346402501ae7c2d25d4afa8a93060549383-runc.ZcbIAM.mount: Deactivated successfully. Mar 17 18:50:06.731790 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3931959111.mount: Deactivated successfully. Mar 17 18:50:07.419521 env[1524]: time="2025-03-17T18:50:07.419467112Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator:v1.36.2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:07.426908 env[1524]: time="2025-03-17T18:50:07.426860488Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:07.429858 env[1524]: time="2025-03-17T18:50:07.429819118Z" level=info msg="ImageUpdate event &ImageUpdate{Name:quay.io/tigera/operator:v1.36.2,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:07.433078 env[1524]: time="2025-03-17T18:50:07.433040551Z" level=info msg="ImageCreate event &ImageCreate{Name:quay.io/tigera/operator@sha256:fc9ea45f2475fd99db1b36d2ff180a50017b1a5ea0e82a171c6b439b3a620764,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:07.433654 env[1524]: time="2025-03-17T18:50:07.433619157Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.36.2\" returns image reference \"sha256:3045aa4a360d468ed15090f280e94c54bf4678269a6e863a9ebcf5b31534a346\"" Mar 17 18:50:07.436454 env[1524]: time="2025-03-17T18:50:07.436422086Z" level=info msg="CreateContainer within sandbox \"0ccd01856585c11ca679603a0a671346402501ae7c2d25d4afa8a93060549383\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Mar 17 18:50:07.462806 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount245745483.mount: Deactivated successfully. 
Mar 17 18:50:07.481733 env[1524]: time="2025-03-17T18:50:07.481688552Z" level=info msg="CreateContainer within sandbox \"0ccd01856585c11ca679603a0a671346402501ae7c2d25d4afa8a93060549383\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"23d0146384db4227231fbcbe7b5a67b0ebf788feb96d486c83e86858e72c09f1\"" Mar 17 18:50:07.483552 env[1524]: time="2025-03-17T18:50:07.482574361Z" level=info msg="StartContainer for \"23d0146384db4227231fbcbe7b5a67b0ebf788feb96d486c83e86858e72c09f1\"" Mar 17 18:50:07.709454 env[1524]: time="2025-03-17T18:50:07.708839888Z" level=info msg="StartContainer for \"23d0146384db4227231fbcbe7b5a67b0ebf788feb96d486c83e86858e72c09f1\" returns successfully" Mar 17 18:50:08.236627 kubelet[2704]: I0317 18:50:08.236407 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-dlwwd" podStartSLOduration=4.236388468 podStartE2EDuration="4.236388468s" podCreationTimestamp="2025-03-17 18:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:50:05.351359183 +0000 UTC m=+17.599459381" watchObservedRunningTime="2025-03-17 18:50:08.236388468 +0000 UTC m=+20.484488566" Mar 17 18:50:10.534547 kernel: kauditd_printk_skb: 143 callbacks suppressed Mar 17 18:50:10.534663 kernel: audit: type=1325 audit(1742237410.528:290): table=filter:92 family=2 entries=15 op=nft_register_rule pid=3064 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:10.528000 audit[3064]: NETFILTER_CFG table=filter:92 family=2 entries=15 op=nft_register_rule pid=3064 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:10.528000 audit[3064]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffd1b9a3630 a2=0 a3=7ffd1b9a361c items=0 ppid=2885 pid=3064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:10.562483 kernel: audit: type=1300 audit(1742237410.528:290): arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7ffd1b9a3630 a2=0 a3=7ffd1b9a361c items=0 ppid=2885 pid=3064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:10.562609 kernel: audit: type=1327 audit(1742237410.528:290): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:10.528000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:10.561000 audit[3064]: NETFILTER_CFG table=nat:93 family=2 entries=12 op=nft_register_rule pid=3064 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:10.581162 kernel: audit: type=1325 audit(1742237410.561:291): table=nat:93 family=2 entries=12 op=nft_register_rule pid=3064 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:10.581264 kernel: audit: type=1300 audit(1742237410.561:291): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd1b9a3630 a2=0 a3=0 items=0 ppid=2885 pid=3064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 
key=(null) Mar 17 18:50:10.561000 audit[3064]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffd1b9a3630 a2=0 a3=0 items=0 ppid=2885 pid=3064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:10.561000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:10.608695 kernel: audit: type=1327 audit(1742237410.561:291): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:10.694000 audit[3066]: NETFILTER_CFG table=filter:94 family=2 entries=16 op=nft_register_rule pid=3066 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:10.694000 audit[3066]: SYSCALL arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7fffba7daf60 a2=0 a3=7fffba7daf4c items=0 ppid=2885 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:10.726931 kernel: audit: type=1325 audit(1742237410.694:292): table=filter:94 family=2 entries=16 op=nft_register_rule pid=3066 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:10.727101 kernel: audit: type=1300 audit(1742237410.694:292): arch=c000003e syscall=46 success=yes exit=5908 a0=3 a1=7fffba7daf60 a2=0 a3=7fffba7daf4c items=0 ppid=2885 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:10.694000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:10.744695 kernel: audit: type=1327 audit(1742237410.694:292): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:10.736000 audit[3066]: NETFILTER_CFG table=nat:95 family=2 entries=12 op=nft_register_rule pid=3066 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:10.757706 kernel: audit: type=1325 audit(1742237410.736:293): table=nat:95 family=2 entries=12 op=nft_register_rule pid=3066 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:10.736000 audit[3066]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7fffba7daf60 a2=0 a3=0 items=0 ppid=2885 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:10.736000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:10.782473 kubelet[2704]: I0317 18:50:10.782406 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7bc55997bb-4cnnf" podStartSLOduration=4.418406005 podStartE2EDuration="6.782367489s" podCreationTimestamp="2025-03-17 18:50:04 +0000 UTC" firstStartedPulling="2025-03-17 18:50:05.070703684 +0000 UTC m=+17.318803882" lastFinishedPulling="2025-03-17 18:50:07.434665268 +0000 UTC m=+19.682765366" observedRunningTime="2025-03-17 
18:50:08.350903623 +0000 UTC m=+20.599003721" watchObservedRunningTime="2025-03-17 18:50:10.782367489 +0000 UTC m=+23.030467687" Mar 17 18:50:10.783300 kubelet[2704]: I0317 18:50:10.783261 2704 topology_manager.go:215] "Topology Admit Handler" podUID="9a346ae8-8f9f-4fc6-985f-4433e1ab90b5" podNamespace="calico-system" podName="calico-typha-54458c7855-vx4fm" Mar 17 18:50:10.795215 kubelet[2704]: W0317 18:50:10.794356 2704 reflector.go:547] object-"calico-system"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.795402 kubelet[2704]: E0317 18:50:10.795377 2704 reflector.go:150] object-"calico-system"/"kube-root-ca.crt": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.795503 kubelet[2704]: W0317 18:50:10.795475 2704 reflector.go:547] object-"calico-system"/"typha-certs": failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.795559 kubelet[2704]: E0317 18:50:10.795508 2704 reflector.go:150] object-"calico-system"/"typha-certs": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets "typha-certs" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.795605 kubelet[2704]: W0317 18:50:10.795593 2704 reflector.go:547] object-"calico-system"/"tigera-ca-bundle": failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.795660 kubelet[2704]: E0317 18:50:10.795607 2704 reflector.go:150] object-"calico-system"/"tigera-ca-bundle": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "tigera-ca-bundle" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.863012 kubelet[2704]: I0317 18:50:10.862960 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/9a346ae8-8f9f-4fc6-985f-4433e1ab90b5-typha-certs\") pod \"calico-typha-54458c7855-vx4fm\" (UID: \"9a346ae8-8f9f-4fc6-985f-4433e1ab90b5\") " pod="calico-system/calico-typha-54458c7855-vx4fm" Mar 17 18:50:10.863012 kubelet[2704]: I0317 18:50:10.863018 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9a346ae8-8f9f-4fc6-985f-4433e1ab90b5-tigera-ca-bundle\") pod 
\"calico-typha-54458c7855-vx4fm\" (UID: \"9a346ae8-8f9f-4fc6-985f-4433e1ab90b5\") " pod="calico-system/calico-typha-54458c7855-vx4fm" Mar 17 18:50:10.863268 kubelet[2704]: I0317 18:50:10.863044 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t99q\" (UniqueName: \"kubernetes.io/projected/9a346ae8-8f9f-4fc6-985f-4433e1ab90b5-kube-api-access-7t99q\") pod \"calico-typha-54458c7855-vx4fm\" (UID: \"9a346ae8-8f9f-4fc6-985f-4433e1ab90b5\") " pod="calico-system/calico-typha-54458c7855-vx4fm" Mar 17 18:50:10.900429 kubelet[2704]: I0317 18:50:10.900380 2704 topology_manager.go:215] "Topology Admit Handler" podUID="e096ffa9-9b38-4c1f-80d3-dda980bb7c8b" podNamespace="calico-system" podName="calico-node-mbfjj" Mar 17 18:50:10.905608 kubelet[2704]: W0317 18:50:10.905568 2704 reflector.go:547] object-"calico-system"/"cni-config": failed to list *v1.ConfigMap: configmaps "cni-config" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.905608 kubelet[2704]: E0317 18:50:10.905610 2704 reflector.go:150] object-"calico-system"/"cni-config": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps "cni-config" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "configmaps" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.905837 kubelet[2704]: W0317 18:50:10.905641 2704 reflector.go:547] object-"calico-system"/"node-certs": failed to list *v1.Secret: secrets "node-certs" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.905837 kubelet[2704]: E0317 18:50:10.905654 2704 reflector.go:150] object-"calico-system"/"node-certs": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets "node-certs" is forbidden: User "system:node:ci-3510.3.7-a-b312ad98ee" cannot list resource "secrets" in API group "" in the namespace "calico-system": no relationship found between node 'ci-3510.3.7-a-b312ad98ee' and this object Mar 17 18:50:10.964007 kubelet[2704]: I0317 18:50:10.963968 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-node-certs\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964215 kubelet[2704]: I0317 18:50:10.964034 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-tigera-ca-bundle\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964215 kubelet[2704]: I0317 18:50:10.964058 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-lib-modules\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 
18:50:10.964215 kubelet[2704]: I0317 18:50:10.964078 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-var-run-calico\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964215 kubelet[2704]: I0317 18:50:10.964104 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwvqm\" (UniqueName: \"kubernetes.io/projected/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-kube-api-access-hwvqm\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964215 kubelet[2704]: I0317 18:50:10.964125 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-var-lib-calico\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964409 kubelet[2704]: I0317 18:50:10.964144 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-cni-bin-dir\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964409 kubelet[2704]: I0317 18:50:10.964166 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-cni-log-dir\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964409 kubelet[2704]: I0317 18:50:10.964185 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-policysync\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964409 kubelet[2704]: I0317 18:50:10.964217 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-xtables-lock\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964409 kubelet[2704]: I0317 18:50:10.964256 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-flexvol-driver-host\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:10.964594 kubelet[2704]: I0317 18:50:10.964278 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-cni-net-dir\") pod \"calico-node-mbfjj\" (UID: \"e096ffa9-9b38-4c1f-80d3-dda980bb7c8b\") " pod="calico-system/calico-node-mbfjj" Mar 17 18:50:11.046535 kubelet[2704]: I0317 18:50:11.046434 2704 
topology_manager.go:215] "Topology Admit Handler" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" podNamespace="calico-system" podName="csi-node-driver-tt74p" Mar 17 18:50:11.047069 kubelet[2704]: E0317 18:50:11.047042 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:11.064946 kubelet[2704]: I0317 18:50:11.064902 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce-socket-dir\") pod \"csi-node-driver-tt74p\" (UID: \"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce\") " pod="calico-system/csi-node-driver-tt74p" Mar 17 18:50:11.065168 kubelet[2704]: I0317 18:50:11.065144 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce-varrun\") pod \"csi-node-driver-tt74p\" (UID: \"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce\") " pod="calico-system/csi-node-driver-tt74p" Mar 17 18:50:11.065409 kubelet[2704]: I0317 18:50:11.065389 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncjbt\" (UniqueName: \"kubernetes.io/projected/eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce-kube-api-access-ncjbt\") pod \"csi-node-driver-tt74p\" (UID: \"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce\") " pod="calico-system/csi-node-driver-tt74p" Mar 17 18:50:11.065563 kubelet[2704]: I0317 18:50:11.065546 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce-registration-dir\") pod \"csi-node-driver-tt74p\" (UID: \"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce\") " pod="calico-system/csi-node-driver-tt74p" Mar 17 18:50:11.065774 kubelet[2704]: I0317 18:50:11.065755 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce-kubelet-dir\") pod \"csi-node-driver-tt74p\" (UID: \"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce\") " pod="calico-system/csi-node-driver-tt74p" Mar 17 18:50:11.166710 kubelet[2704]: E0317 18:50:11.166665 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.166710 kubelet[2704]: W0317 18:50:11.166703 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.166941 kubelet[2704]: E0317 18:50:11.166730 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.166993 kubelet[2704]: E0317 18:50:11.166982 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.167056 kubelet[2704]: W0317 18:50:11.166995 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.167056 kubelet[2704]: E0317 18:50:11.167011 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.167356 kubelet[2704]: E0317 18:50:11.167333 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.167356 kubelet[2704]: W0317 18:50:11.167352 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.167517 kubelet[2704]: E0317 18:50:11.167380 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.167625 kubelet[2704]: E0317 18:50:11.167608 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.167710 kubelet[2704]: W0317 18:50:11.167626 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.167710 kubelet[2704]: E0317 18:50:11.167645 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.167913 kubelet[2704]: E0317 18:50:11.167881 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.167913 kubelet[2704]: W0317 18:50:11.167913 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.168028 kubelet[2704]: E0317 18:50:11.167931 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.168182 kubelet[2704]: E0317 18:50:11.168156 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.168182 kubelet[2704]: W0317 18:50:11.168172 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.168315 kubelet[2704]: E0317 18:50:11.168190 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.168420 kubelet[2704]: E0317 18:50:11.168403 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.168493 kubelet[2704]: W0317 18:50:11.168422 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.168542 kubelet[2704]: E0317 18:50:11.168521 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.168687 kubelet[2704]: E0317 18:50:11.168659 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.168756 kubelet[2704]: W0317 18:50:11.168707 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.168756 kubelet[2704]: E0317 18:50:11.168727 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.168954 kubelet[2704]: E0317 18:50:11.168929 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.168954 kubelet[2704]: W0317 18:50:11.168945 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.169079 kubelet[2704]: E0317 18:50:11.168963 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.169172 kubelet[2704]: E0317 18:50:11.169158 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.169235 kubelet[2704]: W0317 18:50:11.169173 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.169235 kubelet[2704]: E0317 18:50:11.169189 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.169468 kubelet[2704]: E0317 18:50:11.169450 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.169468 kubelet[2704]: W0317 18:50:11.169467 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.169594 kubelet[2704]: E0317 18:50:11.169485 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.169713 kubelet[2704]: E0317 18:50:11.169701 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.169713 kubelet[2704]: W0317 18:50:11.169713 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.169846 kubelet[2704]: E0317 18:50:11.169726 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.169988 kubelet[2704]: E0317 18:50:11.169962 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.169988 kubelet[2704]: W0317 18:50:11.169978 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.170097 kubelet[2704]: E0317 18:50:11.169994 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.170258 kubelet[2704]: E0317 18:50:11.170241 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.170258 kubelet[2704]: W0317 18:50:11.170258 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.170379 kubelet[2704]: E0317 18:50:11.170274 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.170539 kubelet[2704]: E0317 18:50:11.170513 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.170539 kubelet[2704]: W0317 18:50:11.170537 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.170646 kubelet[2704]: E0317 18:50:11.170555 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.170833 kubelet[2704]: E0317 18:50:11.170818 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.170833 kubelet[2704]: W0317 18:50:11.170832 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.170960 kubelet[2704]: E0317 18:50:11.170848 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.171057 kubelet[2704]: E0317 18:50:11.171042 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.171124 kubelet[2704]: W0317 18:50:11.171058 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.171181 kubelet[2704]: E0317 18:50:11.171138 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.171268 kubelet[2704]: E0317 18:50:11.171254 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.171332 kubelet[2704]: W0317 18:50:11.171268 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.171399 kubelet[2704]: E0317 18:50:11.171344 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.171477 kubelet[2704]: E0317 18:50:11.171464 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.171545 kubelet[2704]: W0317 18:50:11.171480 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.171606 kubelet[2704]: E0317 18:50:11.171559 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.171706 kubelet[2704]: E0317 18:50:11.171693 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.171771 kubelet[2704]: W0317 18:50:11.171705 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.171834 kubelet[2704]: E0317 18:50:11.171795 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.171935 kubelet[2704]: E0317 18:50:11.171921 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.171990 kubelet[2704]: W0317 18:50:11.171935 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.171990 kubelet[2704]: E0317 18:50:11.171951 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.172305 kubelet[2704]: E0317 18:50:11.172192 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.172305 kubelet[2704]: W0317 18:50:11.172204 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.172305 kubelet[2704]: E0317 18:50:11.172220 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.172475 kubelet[2704]: E0317 18:50:11.172416 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.172475 kubelet[2704]: W0317 18:50:11.172426 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.172475 kubelet[2704]: E0317 18:50:11.172442 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.173265 kubelet[2704]: E0317 18:50:11.172631 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.173265 kubelet[2704]: W0317 18:50:11.172643 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.173265 kubelet[2704]: E0317 18:50:11.172770 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.173265 kubelet[2704]: E0317 18:50:11.173078 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.173265 kubelet[2704]: W0317 18:50:11.173089 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.173265 kubelet[2704]: E0317 18:50:11.173170 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.173561 kubelet[2704]: E0317 18:50:11.173289 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.173561 kubelet[2704]: W0317 18:50:11.173298 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.173561 kubelet[2704]: E0317 18:50:11.173397 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.173561 kubelet[2704]: E0317 18:50:11.173512 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.173561 kubelet[2704]: W0317 18:50:11.173520 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.173561 kubelet[2704]: E0317 18:50:11.173535 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.173843 kubelet[2704]: E0317 18:50:11.173795 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.173843 kubelet[2704]: W0317 18:50:11.173806 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.173843 kubelet[2704]: E0317 18:50:11.173822 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.174090 kubelet[2704]: E0317 18:50:11.174045 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.174090 kubelet[2704]: W0317 18:50:11.174062 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.174090 kubelet[2704]: E0317 18:50:11.174078 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.174310 kubelet[2704]: E0317 18:50:11.174258 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.174310 kubelet[2704]: W0317 18:50:11.174299 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.174423 kubelet[2704]: E0317 18:50:11.174313 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.174843 kubelet[2704]: E0317 18:50:11.174573 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.174843 kubelet[2704]: W0317 18:50:11.174588 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.174843 kubelet[2704]: E0317 18:50:11.174602 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.271778 kubelet[2704]: E0317 18:50:11.271748 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.271778 kubelet[2704]: W0317 18:50:11.271769 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.272047 kubelet[2704]: E0317 18:50:11.271796 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.272111 kubelet[2704]: E0317 18:50:11.272062 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.272111 kubelet[2704]: W0317 18:50:11.272074 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.272111 kubelet[2704]: E0317 18:50:11.272091 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.272329 kubelet[2704]: E0317 18:50:11.272306 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.272329 kubelet[2704]: W0317 18:50:11.272324 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.272453 kubelet[2704]: E0317 18:50:11.272338 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.272543 kubelet[2704]: E0317 18:50:11.272528 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.272543 kubelet[2704]: W0317 18:50:11.272543 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.272688 kubelet[2704]: E0317 18:50:11.272555 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.272779 kubelet[2704]: E0317 18:50:11.272759 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.272779 kubelet[2704]: W0317 18:50:11.272774 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.272897 kubelet[2704]: E0317 18:50:11.272787 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.272986 kubelet[2704]: E0317 18:50:11.272972 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.272986 kubelet[2704]: W0317 18:50:11.272984 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.273097 kubelet[2704]: E0317 18:50:11.272996 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.273265 kubelet[2704]: E0317 18:50:11.273247 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.273265 kubelet[2704]: W0317 18:50:11.273261 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.273364 kubelet[2704]: E0317 18:50:11.273274 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.375943 kubelet[2704]: E0317 18:50:11.374385 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.375943 kubelet[2704]: W0317 18:50:11.374405 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.375943 kubelet[2704]: E0317 18:50:11.374427 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.375943 kubelet[2704]: E0317 18:50:11.374726 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.375943 kubelet[2704]: W0317 18:50:11.374740 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.375943 kubelet[2704]: E0317 18:50:11.374755 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.375943 kubelet[2704]: E0317 18:50:11.374934 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.375943 kubelet[2704]: W0317 18:50:11.374942 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.375943 kubelet[2704]: E0317 18:50:11.374954 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.377923 kubelet[2704]: E0317 18:50:11.377903 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.377923 kubelet[2704]: W0317 18:50:11.377918 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.378101 kubelet[2704]: E0317 18:50:11.377935 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.378153 kubelet[2704]: E0317 18:50:11.378142 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.378195 kubelet[2704]: W0317 18:50:11.378153 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.378195 kubelet[2704]: E0317 18:50:11.378166 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.378397 kubelet[2704]: E0317 18:50:11.378382 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.378397 kubelet[2704]: W0317 18:50:11.378394 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.378534 kubelet[2704]: E0317 18:50:11.378407 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.378621 kubelet[2704]: E0317 18:50:11.378603 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.378621 kubelet[2704]: W0317 18:50:11.378617 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.378721 kubelet[2704]: E0317 18:50:11.378629 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.479784 kubelet[2704]: E0317 18:50:11.479752 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.479784 kubelet[2704]: W0317 18:50:11.479775 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.480047 kubelet[2704]: E0317 18:50:11.479798 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.480047 kubelet[2704]: E0317 18:50:11.480025 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.480047 kubelet[2704]: W0317 18:50:11.480037 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.480197 kubelet[2704]: E0317 18:50:11.480052 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.480269 kubelet[2704]: E0317 18:50:11.480249 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.480269 kubelet[2704]: W0317 18:50:11.480265 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.480382 kubelet[2704]: E0317 18:50:11.480279 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.480479 kubelet[2704]: E0317 18:50:11.480467 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.480542 kubelet[2704]: W0317 18:50:11.480479 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.480542 kubelet[2704]: E0317 18:50:11.480492 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.480713 kubelet[2704]: E0317 18:50:11.480696 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.480713 kubelet[2704]: W0317 18:50:11.480710 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.480845 kubelet[2704]: E0317 18:50:11.480723 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.480923 kubelet[2704]: E0317 18:50:11.480905 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.480923 kubelet[2704]: W0317 18:50:11.480919 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.481036 kubelet[2704]: E0317 18:50:11.480931 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.481128 kubelet[2704]: E0317 18:50:11.481115 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.481128 kubelet[2704]: W0317 18:50:11.481126 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.481214 kubelet[2704]: E0317 18:50:11.481138 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.581891 kubelet[2704]: E0317 18:50:11.581858 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.581891 kubelet[2704]: W0317 18:50:11.581881 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.582155 kubelet[2704]: E0317 18:50:11.581904 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.582155 kubelet[2704]: E0317 18:50:11.582144 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.582155 kubelet[2704]: W0317 18:50:11.582154 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.582303 kubelet[2704]: E0317 18:50:11.582168 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.582364 kubelet[2704]: E0317 18:50:11.582347 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.582410 kubelet[2704]: W0317 18:50:11.582365 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.582410 kubelet[2704]: E0317 18:50:11.582376 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.582571 kubelet[2704]: E0317 18:50:11.582550 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.582571 kubelet[2704]: W0317 18:50:11.582566 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.582771 kubelet[2704]: E0317 18:50:11.582579 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.582825 kubelet[2704]: E0317 18:50:11.582791 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.582825 kubelet[2704]: W0317 18:50:11.582802 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.582825 kubelet[2704]: E0317 18:50:11.582816 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.583009 kubelet[2704]: E0317 18:50:11.582995 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.583009 kubelet[2704]: W0317 18:50:11.583007 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.583141 kubelet[2704]: E0317 18:50:11.583020 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.583233 kubelet[2704]: E0317 18:50:11.583219 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.583233 kubelet[2704]: W0317 18:50:11.583232 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.583314 kubelet[2704]: E0317 18:50:11.583243 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.684398 kubelet[2704]: E0317 18:50:11.684300 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.684398 kubelet[2704]: W0317 18:50:11.684321 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.684398 kubelet[2704]: E0317 18:50:11.684343 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.684685 kubelet[2704]: E0317 18:50:11.684600 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.684685 kubelet[2704]: W0317 18:50:11.684613 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.684685 kubelet[2704]: E0317 18:50:11.684628 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.686851 kubelet[2704]: E0317 18:50:11.684847 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.686851 kubelet[2704]: W0317 18:50:11.684861 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.686851 kubelet[2704]: E0317 18:50:11.684876 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.686851 kubelet[2704]: E0317 18:50:11.685058 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.686851 kubelet[2704]: W0317 18:50:11.685067 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.686851 kubelet[2704]: E0317 18:50:11.685079 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.686851 kubelet[2704]: E0317 18:50:11.685257 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.686851 kubelet[2704]: W0317 18:50:11.685268 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.686851 kubelet[2704]: E0317 18:50:11.685280 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.686851 kubelet[2704]: E0317 18:50:11.685474 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.687159 kubelet[2704]: W0317 18:50:11.685484 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.687159 kubelet[2704]: E0317 18:50:11.685495 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.687159 kubelet[2704]: E0317 18:50:11.685690 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.687159 kubelet[2704]: W0317 18:50:11.685700 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.687159 kubelet[2704]: E0317 18:50:11.685714 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.756000 audit[3134]: NETFILTER_CFG table=filter:96 family=2 entries=17 op=nft_register_rule pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:11.756000 audit[3134]: SYSCALL arch=c000003e syscall=46 success=yes exit=6652 a0=3 a1=7ffc7f44a360 a2=0 a3=7ffc7f44a34c items=0 ppid=2885 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:11.756000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:11.760000 audit[3134]: NETFILTER_CFG table=nat:97 family=2 entries=12 op=nft_register_rule pid=3134 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:11.760000 audit[3134]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc7f44a360 a2=0 a3=0 items=0 ppid=2885 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:11.760000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:11.786714 kubelet[2704]: E0317 18:50:11.786686 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.786714 kubelet[2704]: W0317 18:50:11.786711 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.787369 kubelet[2704]: E0317 18:50:11.786734 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.787369 kubelet[2704]: E0317 18:50:11.786967 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.787369 kubelet[2704]: W0317 18:50:11.786978 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.787369 kubelet[2704]: E0317 18:50:11.786993 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.787369 kubelet[2704]: E0317 18:50:11.787186 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.787369 kubelet[2704]: W0317 18:50:11.787197 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.787369 kubelet[2704]: E0317 18:50:11.787209 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.787689 kubelet[2704]: E0317 18:50:11.787380 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.787689 kubelet[2704]: W0317 18:50:11.787389 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.787689 kubelet[2704]: E0317 18:50:11.787402 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.787689 kubelet[2704]: E0317 18:50:11.787571 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.787689 kubelet[2704]: W0317 18:50:11.787580 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.787689 kubelet[2704]: E0317 18:50:11.787591 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.787955 kubelet[2704]: E0317 18:50:11.787815 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.787955 kubelet[2704]: W0317 18:50:11.787826 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.787955 kubelet[2704]: E0317 18:50:11.787838 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.788073 kubelet[2704]: E0317 18:50:11.788003 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.788073 kubelet[2704]: W0317 18:50:11.788012 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.788073 kubelet[2704]: E0317 18:50:11.788023 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.888448 kubelet[2704]: E0317 18:50:11.888420 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.888448 kubelet[2704]: W0317 18:50:11.888439 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.888769 kubelet[2704]: E0317 18:50:11.888461 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.888769 kubelet[2704]: E0317 18:50:11.888722 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.888769 kubelet[2704]: W0317 18:50:11.888734 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.888769 kubelet[2704]: E0317 18:50:11.888751 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.888994 kubelet[2704]: E0317 18:50:11.888971 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.888994 kubelet[2704]: W0317 18:50:11.888982 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.889105 kubelet[2704]: E0317 18:50:11.888995 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.889219 kubelet[2704]: E0317 18:50:11.889201 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.889383 kubelet[2704]: W0317 18:50:11.889219 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.889383 kubelet[2704]: E0317 18:50:11.889238 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.889498 kubelet[2704]: E0317 18:50:11.889453 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.889498 kubelet[2704]: W0317 18:50:11.889465 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.889498 kubelet[2704]: E0317 18:50:11.889491 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.889760 kubelet[2704]: E0317 18:50:11.889735 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.889760 kubelet[2704]: W0317 18:50:11.889755 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.889928 kubelet[2704]: E0317 18:50:11.889771 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.890012 kubelet[2704]: E0317 18:50:11.889991 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.890012 kubelet[2704]: W0317 18:50:11.890007 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.890113 kubelet[2704]: E0317 18:50:11.890033 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.965323 kubelet[2704]: E0317 18:50:11.965206 2704 configmap.go:199] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 17 18:50:11.965323 kubelet[2704]: E0317 18:50:11.965315 2704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9a346ae8-8f9f-4fc6-985f-4433e1ab90b5-tigera-ca-bundle podName:9a346ae8-8f9f-4fc6-985f-4433e1ab90b5 nodeName:}" failed. No retries permitted until 2025-03-17 18:50:12.465293524 +0000 UTC m=+24.713393722 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/9a346ae8-8f9f-4fc6-985f-4433e1ab90b5-tigera-ca-bundle") pod "calico-typha-54458c7855-vx4fm" (UID: "9a346ae8-8f9f-4fc6-985f-4433e1ab90b5") : failed to sync configmap cache: timed out waiting for the condition Mar 17 18:50:11.965609 kubelet[2704]: E0317 18:50:11.965206 2704 secret.go:194] Couldn't get secret calico-system/typha-certs: failed to sync secret cache: timed out waiting for the condition Mar 17 18:50:11.965609 kubelet[2704]: E0317 18:50:11.965558 2704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9a346ae8-8f9f-4fc6-985f-4433e1ab90b5-typha-certs podName:9a346ae8-8f9f-4fc6-985f-4433e1ab90b5 nodeName:}" failed. No retries permitted until 2025-03-17 18:50:12.465541227 +0000 UTC m=+24.713641325 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "typha-certs" (UniqueName: "kubernetes.io/secret/9a346ae8-8f9f-4fc6-985f-4433e1ab90b5-typha-certs") pod "calico-typha-54458c7855-vx4fm" (UID: "9a346ae8-8f9f-4fc6-985f-4433e1ab90b5") : failed to sync secret cache: timed out waiting for the condition Mar 17 18:50:11.970551 kubelet[2704]: E0317 18:50:11.970201 2704 projected.go:294] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 17 18:50:11.970551 kubelet[2704]: E0317 18:50:11.970232 2704 projected.go:200] Error preparing data for projected volume kube-api-access-7t99q for pod calico-system/calico-typha-54458c7855-vx4fm: failed to sync configmap cache: timed out waiting for the condition Mar 17 18:50:11.970551 kubelet[2704]: E0317 18:50:11.970280 2704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a346ae8-8f9f-4fc6-985f-4433e1ab90b5-kube-api-access-7t99q podName:9a346ae8-8f9f-4fc6-985f-4433e1ab90b5 nodeName:}" failed. No retries permitted until 2025-03-17 18:50:12.470264772 +0000 UTC m=+24.718364870 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7t99q" (UniqueName: "kubernetes.io/projected/9a346ae8-8f9f-4fc6-985f-4433e1ab90b5-kube-api-access-7t99q") pod "calico-typha-54458c7855-vx4fm" (UID: "9a346ae8-8f9f-4fc6-985f-4433e1ab90b5") : failed to sync configmap cache: timed out waiting for the condition Mar 17 18:50:11.990634 kubelet[2704]: E0317 18:50:11.990604 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.990634 kubelet[2704]: W0317 18:50:11.990628 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.990901 kubelet[2704]: E0317 18:50:11.990650 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.990901 kubelet[2704]: E0317 18:50:11.990894 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.990995 kubelet[2704]: W0317 18:50:11.990907 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.990995 kubelet[2704]: E0317 18:50:11.990922 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.991126 kubelet[2704]: E0317 18:50:11.991107 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.991126 kubelet[2704]: W0317 18:50:11.991123 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.991248 kubelet[2704]: E0317 18:50:11.991139 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.991343 kubelet[2704]: E0317 18:50:11.991328 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.991396 kubelet[2704]: W0317 18:50:11.991344 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.991396 kubelet[2704]: E0317 18:50:11.991357 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:11.991716 kubelet[2704]: E0317 18:50:11.991545 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.991716 kubelet[2704]: W0317 18:50:11.991568 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.991716 kubelet[2704]: E0317 18:50:11.991580 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.991893 kubelet[2704]: E0317 18:50:11.991791 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.991893 kubelet[2704]: W0317 18:50:11.991801 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.991893 kubelet[2704]: E0317 18:50:11.991813 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:11.992033 kubelet[2704]: E0317 18:50:11.991988 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:11.992033 kubelet[2704]: W0317 18:50:11.991997 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:11.992033 kubelet[2704]: E0317 18:50:11.992010 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.026446 kubelet[2704]: E0317 18:50:12.026421 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.026614 kubelet[2704]: W0317 18:50:12.026597 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.026753 kubelet[2704]: E0317 18:50:12.026736 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.037123 kubelet[2704]: E0317 18:50:12.037104 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.037285 kubelet[2704]: W0317 18:50:12.037269 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.037375 kubelet[2704]: E0317 18:50:12.037362 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.065378 kubelet[2704]: E0317 18:50:12.065338 2704 configmap.go:199] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 17 18:50:12.065541 kubelet[2704]: E0317 18:50:12.065446 2704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-tigera-ca-bundle podName:e096ffa9-9b38-4c1f-80d3-dda980bb7c8b nodeName:}" failed. No retries permitted until 2025-03-17 18:50:12.56540997 +0000 UTC m=+24.813510168 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-tigera-ca-bundle") pod "calico-node-mbfjj" (UID: "e096ffa9-9b38-4c1f-80d3-dda980bb7c8b") : failed to sync configmap cache: timed out waiting for the condition Mar 17 18:50:12.065955 kubelet[2704]: E0317 18:50:12.065786 2704 secret.go:194] Couldn't get secret calico-system/node-certs: failed to sync secret cache: timed out waiting for the condition Mar 17 18:50:12.065955 kubelet[2704]: E0317 18:50:12.065854 2704 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-node-certs podName:e096ffa9-9b38-4c1f-80d3-dda980bb7c8b nodeName:}" failed. No retries permitted until 2025-03-17 18:50:12.565835874 +0000 UTC m=+24.813935972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-certs" (UniqueName: "kubernetes.io/secret/e096ffa9-9b38-4c1f-80d3-dda980bb7c8b-node-certs") pod "calico-node-mbfjj" (UID: "e096ffa9-9b38-4c1f-80d3-dda980bb7c8b") : failed to sync secret cache: timed out waiting for the condition Mar 17 18:50:12.093105 kubelet[2704]: E0317 18:50:12.093074 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.093949 kubelet[2704]: W0317 18:50:12.093253 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.093949 kubelet[2704]: E0317 18:50:12.093294 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.093949 kubelet[2704]: E0317 18:50:12.093548 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.093949 kubelet[2704]: W0317 18:50:12.093561 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.093949 kubelet[2704]: E0317 18:50:12.093578 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.093949 kubelet[2704]: E0317 18:50:12.093854 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.093949 kubelet[2704]: W0317 18:50:12.093867 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.093949 kubelet[2704]: E0317 18:50:12.093882 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.094764 kubelet[2704]: E0317 18:50:12.094075 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.094764 kubelet[2704]: W0317 18:50:12.094085 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.094764 kubelet[2704]: E0317 18:50:12.094098 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.094764 kubelet[2704]: E0317 18:50:12.094289 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.094764 kubelet[2704]: W0317 18:50:12.094299 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.094764 kubelet[2704]: E0317 18:50:12.094311 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.194878 kubelet[2704]: E0317 18:50:12.194845 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.194878 kubelet[2704]: W0317 18:50:12.194869 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.195111 kubelet[2704]: E0317 18:50:12.194891 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.195204 kubelet[2704]: E0317 18:50:12.195129 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.195204 kubelet[2704]: W0317 18:50:12.195140 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.195204 kubelet[2704]: E0317 18:50:12.195157 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.195361 kubelet[2704]: E0317 18:50:12.195351 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.195409 kubelet[2704]: W0317 18:50:12.195362 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.195409 kubelet[2704]: E0317 18:50:12.195376 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.195585 kubelet[2704]: E0317 18:50:12.195566 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.195585 kubelet[2704]: W0317 18:50:12.195580 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.195743 kubelet[2704]: E0317 18:50:12.195594 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.195838 kubelet[2704]: E0317 18:50:12.195821 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.195838 kubelet[2704]: W0317 18:50:12.195835 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.195917 kubelet[2704]: E0317 18:50:12.195849 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.297116 kubelet[2704]: E0317 18:50:12.297086 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.297116 kubelet[2704]: W0317 18:50:12.297107 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.297358 kubelet[2704]: E0317 18:50:12.297128 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.297443 kubelet[2704]: E0317 18:50:12.297379 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.297443 kubelet[2704]: W0317 18:50:12.297391 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.297443 kubelet[2704]: E0317 18:50:12.297405 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.297628 kubelet[2704]: E0317 18:50:12.297609 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.297628 kubelet[2704]: W0317 18:50:12.297624 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.297787 kubelet[2704]: E0317 18:50:12.297637 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.297879 kubelet[2704]: E0317 18:50:12.297861 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.297879 kubelet[2704]: W0317 18:50:12.297876 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.297972 kubelet[2704]: E0317 18:50:12.297889 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.298103 kubelet[2704]: E0317 18:50:12.298087 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.298103 kubelet[2704]: W0317 18:50:12.298100 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.298186 kubelet[2704]: E0317 18:50:12.298114 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.399103 kubelet[2704]: E0317 18:50:12.399074 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.399103 kubelet[2704]: W0317 18:50:12.399094 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.399396 kubelet[2704]: E0317 18:50:12.399118 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.399396 kubelet[2704]: E0317 18:50:12.399388 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.399521 kubelet[2704]: W0317 18:50:12.399402 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.399521 kubelet[2704]: E0317 18:50:12.399418 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.399664 kubelet[2704]: E0317 18:50:12.399646 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.399664 kubelet[2704]: W0317 18:50:12.399657 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.399840 kubelet[2704]: E0317 18:50:12.399700 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.399955 kubelet[2704]: E0317 18:50:12.399924 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.399955 kubelet[2704]: W0317 18:50:12.399942 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.400102 kubelet[2704]: E0317 18:50:12.399958 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.400198 kubelet[2704]: E0317 18:50:12.400177 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.400198 kubelet[2704]: W0317 18:50:12.400194 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.400295 kubelet[2704]: E0317 18:50:12.400209 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.500788 kubelet[2704]: E0317 18:50:12.500757 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.500788 kubelet[2704]: W0317 18:50:12.500779 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.501057 kubelet[2704]: E0317 18:50:12.500802 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.501124 kubelet[2704]: E0317 18:50:12.501107 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.501124 kubelet[2704]: W0317 18:50:12.501118 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.501208 kubelet[2704]: E0317 18:50:12.501136 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.501394 kubelet[2704]: E0317 18:50:12.501375 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.501394 kubelet[2704]: W0317 18:50:12.501388 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.501549 kubelet[2704]: E0317 18:50:12.501410 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.501635 kubelet[2704]: E0317 18:50:12.501618 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.501635 kubelet[2704]: W0317 18:50:12.501633 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.501750 kubelet[2704]: E0317 18:50:12.501646 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.501872 kubelet[2704]: E0317 18:50:12.501856 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.501872 kubelet[2704]: W0317 18:50:12.501869 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.501974 kubelet[2704]: E0317 18:50:12.501884 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.502098 kubelet[2704]: E0317 18:50:12.502084 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.502098 kubelet[2704]: W0317 18:50:12.502096 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.502213 kubelet[2704]: E0317 18:50:12.502113 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.502311 kubelet[2704]: E0317 18:50:12.502297 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.502311 kubelet[2704]: W0317 18:50:12.502309 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.502426 kubelet[2704]: E0317 18:50:12.502335 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.502518 kubelet[2704]: E0317 18:50:12.502501 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.502518 kubelet[2704]: W0317 18:50:12.502514 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.502627 kubelet[2704]: E0317 18:50:12.502525 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.502765 kubelet[2704]: E0317 18:50:12.502743 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.502765 kubelet[2704]: W0317 18:50:12.502756 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.502887 kubelet[2704]: E0317 18:50:12.502773 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.503162 kubelet[2704]: E0317 18:50:12.503142 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.503262 kubelet[2704]: W0317 18:50:12.503248 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.503341 kubelet[2704]: E0317 18:50:12.503329 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.503582 kubelet[2704]: E0317 18:50:12.503570 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.503689 kubelet[2704]: W0317 18:50:12.503651 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.503756 kubelet[2704]: E0317 18:50:12.503690 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.503918 kubelet[2704]: E0317 18:50:12.503902 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.503918 kubelet[2704]: W0317 18:50:12.503915 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.504033 kubelet[2704]: E0317 18:50:12.503933 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.504160 kubelet[2704]: E0317 18:50:12.504142 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.504219 kubelet[2704]: W0317 18:50:12.504163 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.504219 kubelet[2704]: E0317 18:50:12.504176 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.504682 kubelet[2704]: E0317 18:50:12.504647 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.504682 kubelet[2704]: W0317 18:50:12.504663 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.504825 kubelet[2704]: E0317 18:50:12.504697 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.504960 kubelet[2704]: E0317 18:50:12.504945 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.505018 kubelet[2704]: W0317 18:50:12.504961 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.505018 kubelet[2704]: E0317 18:50:12.504974 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.505206 kubelet[2704]: E0317 18:50:12.505191 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.505206 kubelet[2704]: W0317 18:50:12.505206 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.505330 kubelet[2704]: E0317 18:50:12.505219 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.505598 kubelet[2704]: E0317 18:50:12.505578 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.505725 kubelet[2704]: W0317 18:50:12.505609 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.505725 kubelet[2704]: E0317 18:50:12.505623 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.505875 kubelet[2704]: E0317 18:50:12.505858 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.505928 kubelet[2704]: W0317 18:50:12.505879 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.505928 kubelet[2704]: E0317 18:50:12.505892 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.521890 kubelet[2704]: E0317 18:50:12.521233 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.521890 kubelet[2704]: W0317 18:50:12.521253 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.521890 kubelet[2704]: E0317 18:50:12.521277 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.521890 kubelet[2704]: E0317 18:50:12.521550 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.521890 kubelet[2704]: W0317 18:50:12.521570 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.521890 kubelet[2704]: E0317 18:50:12.521587 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.592614 env[1524]: time="2025-03-17T18:50:12.590077592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54458c7855-vx4fm,Uid:9a346ae8-8f9f-4fc6-985f-4433e1ab90b5,Namespace:calico-system,Attempt:0,}" Mar 17 18:50:12.602309 kubelet[2704]: E0317 18:50:12.602285 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.602309 kubelet[2704]: W0317 18:50:12.602303 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.602490 kubelet[2704]: E0317 18:50:12.602325 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.602631 kubelet[2704]: E0317 18:50:12.602614 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.602631 kubelet[2704]: W0317 18:50:12.602629 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.602797 kubelet[2704]: E0317 18:50:12.602659 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.602975 kubelet[2704]: E0317 18:50:12.602957 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.602975 kubelet[2704]: W0317 18:50:12.602971 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.603104 kubelet[2704]: E0317 18:50:12.602992 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.603244 kubelet[2704]: E0317 18:50:12.603226 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.603244 kubelet[2704]: W0317 18:50:12.603240 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.603371 kubelet[2704]: E0317 18:50:12.603258 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.603455 kubelet[2704]: E0317 18:50:12.603438 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.603455 kubelet[2704]: W0317 18:50:12.603451 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.603563 kubelet[2704]: E0317 18:50:12.603468 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.603699 kubelet[2704]: E0317 18:50:12.603684 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.603699 kubelet[2704]: W0317 18:50:12.603698 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.603866 kubelet[2704]: E0317 18:50:12.603848 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.604010 kubelet[2704]: E0317 18:50:12.603871 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.604084 kubelet[2704]: W0317 18:50:12.604010 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.604084 kubelet[2704]: E0317 18:50:12.604025 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.604235 kubelet[2704]: E0317 18:50:12.604220 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.604291 kubelet[2704]: W0317 18:50:12.604236 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.604291 kubelet[2704]: E0317 18:50:12.604249 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.604425 kubelet[2704]: E0317 18:50:12.604410 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.604478 kubelet[2704]: W0317 18:50:12.604426 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.604478 kubelet[2704]: E0317 18:50:12.604438 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.604639 kubelet[2704]: E0317 18:50:12.604623 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.604734 kubelet[2704]: W0317 18:50:12.604640 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.604734 kubelet[2704]: E0317 18:50:12.604652 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.605566 kubelet[2704]: E0317 18:50:12.605544 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.605566 kubelet[2704]: W0317 18:50:12.605565 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.605706 kubelet[2704]: E0317 18:50:12.605579 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:12.614191 kubelet[2704]: E0317 18:50:12.614173 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:12.614312 kubelet[2704]: W0317 18:50:12.614297 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:12.614397 kubelet[2704]: E0317 18:50:12.614373 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:12.635409 env[1524]: time="2025-03-17T18:50:12.635339817Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:12.635409 env[1524]: time="2025-03-17T18:50:12.635373317Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:12.635631 env[1524]: time="2025-03-17T18:50:12.635595419Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:12.635893 env[1524]: time="2025-03-17T18:50:12.635849121Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/97954a8b19f704f37024355eec38d1ffb9fc3637357823ab288132635d10d19d pid=3222 runtime=io.containerd.runc.v2 Mar 17 18:50:12.696432 env[1524]: time="2025-03-17T18:50:12.696393389Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-54458c7855-vx4fm,Uid:9a346ae8-8f9f-4fc6-985f-4433e1ab90b5,Namespace:calico-system,Attempt:0,} returns sandbox id \"97954a8b19f704f37024355eec38d1ffb9fc3637357823ab288132635d10d19d\"" Mar 17 18:50:12.698934 env[1524]: time="2025-03-17T18:50:12.698816012Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\"" Mar 17 18:50:12.704413 env[1524]: time="2025-03-17T18:50:12.704386764Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mbfjj,Uid:e096ffa9-9b38-4c1f-80d3-dda980bb7c8b,Namespace:calico-system,Attempt:0,}" Mar 17 18:50:12.734727 env[1524]: time="2025-03-17T18:50:12.734415046Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:12.734727 env[1524]: time="2025-03-17T18:50:12.734466746Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:12.734727 env[1524]: time="2025-03-17T18:50:12.734481447Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:12.734727 env[1524]: time="2025-03-17T18:50:12.734622448Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/2e1219721c525786caca484fed284f7ed9918ccab0ecc9bb3e3aba917a7ad52c pid=3265 runtime=io.containerd.runc.v2 Mar 17 18:50:12.782653 env[1524]: time="2025-03-17T18:50:12.782608098Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-mbfjj,Uid:e096ffa9-9b38-4c1f-80d3-dda980bb7c8b,Namespace:calico-system,Attempt:0,} returns sandbox id \"2e1219721c525786caca484fed284f7ed9918ccab0ecc9bb3e3aba917a7ad52c\"" Mar 17 18:50:13.223003 kubelet[2704]: E0317 18:50:13.222957 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:14.636655 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2722105799.mount: Deactivated successfully. Mar 17 18:50:15.223161 kubelet[2704]: E0317 18:50:15.223095 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:15.841157 env[1524]: time="2025-03-17T18:50:15.841112204Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:15.854466 env[1524]: time="2025-03-17T18:50:15.854422722Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:15.858468 env[1524]: time="2025-03-17T18:50:15.858433258Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/typha:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:15.863484 env[1524]: time="2025-03-17T18:50:15.863456803Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/typha@sha256:768a194e1115c73bcbf35edb7afd18a63e16e08d940c79993565b6a3cca2da7c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:15.863946 env[1524]: time="2025-03-17T18:50:15.863913707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.29.1\" returns image reference \"sha256:4cb3738506f5a9c530033d1e24fd6b9ec618518a2ec8b012ded33572be06ab44\"" Mar 17 18:50:15.865202 env[1524]: time="2025-03-17T18:50:15.865173218Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\"" Mar 17 18:50:15.882547 env[1524]: time="2025-03-17T18:50:15.882504273Z" level=info msg="CreateContainer within sandbox \"97954a8b19f704f37024355eec38d1ffb9fc3637357823ab288132635d10d19d\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Mar 17 18:50:15.935314 env[1524]: time="2025-03-17T18:50:15.935172842Z" level=info msg="CreateContainer within sandbox \"97954a8b19f704f37024355eec38d1ffb9fc3637357823ab288132635d10d19d\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id 
\"122d4082eab084f43c68c0c5762297a8791234fa192bb1b5462036fc49647c97\"" Mar 17 18:50:15.938185 env[1524]: time="2025-03-17T18:50:15.937208260Z" level=info msg="StartContainer for \"122d4082eab084f43c68c0c5762297a8791234fa192bb1b5462036fc49647c97\"" Mar 17 18:50:16.014178 env[1524]: time="2025-03-17T18:50:16.014126143Z" level=info msg="StartContainer for \"122d4082eab084f43c68c0c5762297a8791234fa192bb1b5462036fc49647c97\" returns successfully" Mar 17 18:50:16.370924 kubelet[2704]: I0317 18:50:16.370480 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-54458c7855-vx4fm" podStartSLOduration=3.202955646 podStartE2EDuration="6.370458863s" podCreationTimestamp="2025-03-17 18:50:10 +0000 UTC" firstStartedPulling="2025-03-17 18:50:12.6975036 +0000 UTC m=+24.945603798" lastFinishedPulling="2025-03-17 18:50:15.865006917 +0000 UTC m=+28.113107015" observedRunningTime="2025-03-17 18:50:16.37015976 +0000 UTC m=+28.618259858" watchObservedRunningTime="2025-03-17 18:50:16.370458863 +0000 UTC m=+28.618559061" Mar 17 18:50:16.396953 kubelet[2704]: E0317 18:50:16.396919 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.396953 kubelet[2704]: W0317 18:50:16.396944 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.396953 kubelet[2704]: E0317 18:50:16.396969 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.397234 kubelet[2704]: E0317 18:50:16.397176 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.397234 kubelet[2704]: W0317 18:50:16.397187 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.397234 kubelet[2704]: E0317 18:50:16.397200 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.397402 kubelet[2704]: E0317 18:50:16.397360 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.397402 kubelet[2704]: W0317 18:50:16.397369 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.397402 kubelet[2704]: E0317 18:50:16.397381 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:16.397557 kubelet[2704]: E0317 18:50:16.397543 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.397557 kubelet[2704]: W0317 18:50:16.397552 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.397651 kubelet[2704]: E0317 18:50:16.397563 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.397795 kubelet[2704]: E0317 18:50:16.397771 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.397795 kubelet[2704]: W0317 18:50:16.397784 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.397935 kubelet[2704]: E0317 18:50:16.397798 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.397981 kubelet[2704]: E0317 18:50:16.397966 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.397981 kubelet[2704]: W0317 18:50:16.397975 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.398079 kubelet[2704]: E0317 18:50:16.397986 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.398163 kubelet[2704]: E0317 18:50:16.398147 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.398237 kubelet[2704]: W0317 18:50:16.398163 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.398237 kubelet[2704]: E0317 18:50:16.398175 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.398351 kubelet[2704]: E0317 18:50:16.398340 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.398397 kubelet[2704]: W0317 18:50:16.398352 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.398397 kubelet[2704]: E0317 18:50:16.398363 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:16.398556 kubelet[2704]: E0317 18:50:16.398537 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.398556 kubelet[2704]: W0317 18:50:16.398552 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.398700 kubelet[2704]: E0317 18:50:16.398565 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.398748 kubelet[2704]: E0317 18:50:16.398739 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.398793 kubelet[2704]: W0317 18:50:16.398749 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.398793 kubelet[2704]: E0317 18:50:16.398760 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.398932 kubelet[2704]: E0317 18:50:16.398914 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.398932 kubelet[2704]: W0317 18:50:16.398927 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.399059 kubelet[2704]: E0317 18:50:16.398937 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.399110 kubelet[2704]: E0317 18:50:16.399099 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.399152 kubelet[2704]: W0317 18:50:16.399111 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.399152 kubelet[2704]: E0317 18:50:16.399122 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.399419 kubelet[2704]: E0317 18:50:16.399280 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.399419 kubelet[2704]: W0317 18:50:16.399291 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.399419 kubelet[2704]: E0317 18:50:16.399301 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:16.399624 kubelet[2704]: E0317 18:50:16.399462 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.399624 kubelet[2704]: W0317 18:50:16.399471 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.399624 kubelet[2704]: E0317 18:50:16.399481 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.399759 kubelet[2704]: E0317 18:50:16.399640 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.399759 kubelet[2704]: W0317 18:50:16.399651 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.399759 kubelet[2704]: E0317 18:50:16.399662 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.433452 kubelet[2704]: E0317 18:50:16.433413 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.433452 kubelet[2704]: W0317 18:50:16.433441 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.433753 kubelet[2704]: E0317 18:50:16.433469 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.433894 kubelet[2704]: E0317 18:50:16.433871 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.433894 kubelet[2704]: W0317 18:50:16.433889 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.434050 kubelet[2704]: E0317 18:50:16.433908 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.434208 kubelet[2704]: E0317 18:50:16.434184 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.434315 kubelet[2704]: W0317 18:50:16.434209 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.434315 kubelet[2704]: E0317 18:50:16.434230 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:16.434572 kubelet[2704]: E0317 18:50:16.434542 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.434572 kubelet[2704]: W0317 18:50:16.434562 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.434777 kubelet[2704]: E0317 18:50:16.434585 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.434861 kubelet[2704]: E0317 18:50:16.434843 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.434861 kubelet[2704]: W0317 18:50:16.434858 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.434981 kubelet[2704]: E0317 18:50:16.434875 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.435104 kubelet[2704]: E0317 18:50:16.435088 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.435177 kubelet[2704]: W0317 18:50:16.435101 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.435177 kubelet[2704]: E0317 18:50:16.435127 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.435374 kubelet[2704]: E0317 18:50:16.435358 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.435374 kubelet[2704]: W0317 18:50:16.435371 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.435499 kubelet[2704]: E0317 18:50:16.435456 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.437066 kubelet[2704]: E0317 18:50:16.436079 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.437066 kubelet[2704]: W0317 18:50:16.436093 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.437066 kubelet[2704]: E0317 18:50:16.436181 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:16.437066 kubelet[2704]: E0317 18:50:16.436328 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.437066 kubelet[2704]: W0317 18:50:16.436339 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.437066 kubelet[2704]: E0317 18:50:16.436417 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.437066 kubelet[2704]: E0317 18:50:16.436539 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.437066 kubelet[2704]: W0317 18:50:16.436548 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.437066 kubelet[2704]: E0317 18:50:16.436564 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.437066 kubelet[2704]: E0317 18:50:16.436783 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.438249 kubelet[2704]: W0317 18:50:16.436793 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.438249 kubelet[2704]: E0317 18:50:16.436810 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.438249 kubelet[2704]: E0317 18:50:16.437010 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.438249 kubelet[2704]: W0317 18:50:16.437037 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.438249 kubelet[2704]: E0317 18:50:16.437054 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.438249 kubelet[2704]: E0317 18:50:16.437335 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.438249 kubelet[2704]: W0317 18:50:16.437347 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.438249 kubelet[2704]: E0317 18:50:16.437365 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:16.438249 kubelet[2704]: E0317 18:50:16.437869 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.438249 kubelet[2704]: W0317 18:50:16.437882 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.438713 kubelet[2704]: E0317 18:50:16.437970 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.438713 kubelet[2704]: E0317 18:50:16.438100 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.438713 kubelet[2704]: W0317 18:50:16.438111 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.438713 kubelet[2704]: E0317 18:50:16.438122 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.438713 kubelet[2704]: E0317 18:50:16.438292 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.438713 kubelet[2704]: W0317 18:50:16.438301 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.438713 kubelet[2704]: E0317 18:50:16.438311 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.438713 kubelet[2704]: E0317 18:50:16.438500 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.438713 kubelet[2704]: W0317 18:50:16.438509 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.438713 kubelet[2704]: E0317 18:50:16.438522 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:16.439482 kubelet[2704]: E0317 18:50:16.438893 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:16.439482 kubelet[2704]: W0317 18:50:16.438904 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:16.439482 kubelet[2704]: E0317 18:50:16.438916 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:17.225956 kubelet[2704]: E0317 18:50:17.222922 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:17.279235 env[1524]: time="2025-03-17T18:50:17.279186479Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:17.284533 env[1524]: time="2025-03-17T18:50:17.284497025Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:17.288652 env[1524]: time="2025-03-17T18:50:17.288572260Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:17.292355 env[1524]: time="2025-03-17T18:50:17.292323092Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:a63f8b4ff531912d12d143664eb263fdbc6cd7b3ff4aa777dfb6e318a090462c,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:17.292838 env[1524]: time="2025-03-17T18:50:17.292795996Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1\" returns image reference \"sha256:2b7452b763ec8833ca0386ada5fd066e552a9b3b02b8538a5e34cc3d6d3840a6\"" Mar 17 18:50:17.295912 env[1524]: time="2025-03-17T18:50:17.295881523Z" level=info msg="CreateContainer within sandbox \"2e1219721c525786caca484fed284f7ed9918ccab0ecc9bb3e3aba917a7ad52c\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Mar 17 18:50:17.322184 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2520291302.mount: Deactivated successfully. 
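The repeated kubelet errors above come from the FlexVolume plugin prober: it execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument "init" and parses the command's stdout as JSON. Until Calico's flexvol-driver init container installs that binary, the call produces no output, and unmarshalling an empty byte slice is exactly what yields "unexpected end of JSON input". A minimal Go sketch of that failure mode (path copied from the log; this is illustrative, not kubelet's actual code):

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

func main() {
	// driver path taken from the kubelet log lines above
	driver := "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"
	out, err := exec.Command(driver, "init").Output()
	if err != nil {
		// the binary is not installed yet; kubelet reports this as
		// "executable file not found in $PATH"
		fmt.Println("driver call failed:", err)
	}
	var status map[string]interface{}
	if err := json.Unmarshal(out, &status); err != nil {
		// empty stdout -> "unexpected end of JSON input"
		fmt.Println("failed to unmarshal output:", err)
	}
}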
Mar 17 18:50:17.336348 env[1524]: time="2025-03-17T18:50:17.336308171Z" level=info msg="CreateContainer within sandbox \"2e1219721c525786caca484fed284f7ed9918ccab0ecc9bb3e3aba917a7ad52c\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"87fe825fc27e46430e1c22e23c1fbf1bf9021fcd248f978e76d8af7c967a1a79\"" Mar 17 18:50:17.336797 env[1524]: time="2025-03-17T18:50:17.336769875Z" level=info msg="StartContainer for \"87fe825fc27e46430e1c22e23c1fbf1bf9021fcd248f978e76d8af7c967a1a79\"" Mar 17 18:50:17.366584 kubelet[2704]: I0317 18:50:17.361036 2704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 18:50:17.399970 env[1524]: time="2025-03-17T18:50:17.399920719Z" level=info msg="StartContainer for \"87fe825fc27e46430e1c22e23c1fbf1bf9021fcd248f978e76d8af7c967a1a79\" returns successfully" Mar 17 18:50:17.405709 kubelet[2704]: E0317 18:50:17.405682 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.405709 kubelet[2704]: W0317 18:50:17.405704 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.406220 kubelet[2704]: E0317 18:50:17.405728 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.406220 kubelet[2704]: E0317 18:50:17.405976 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.406220 kubelet[2704]: W0317 18:50:17.405988 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.406220 kubelet[2704]: E0317 18:50:17.406020 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.406942 kubelet[2704]: E0317 18:50:17.406921 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.406942 kubelet[2704]: W0317 18:50:17.406939 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.407108 kubelet[2704]: E0317 18:50:17.406957 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.407300 kubelet[2704]: E0317 18:50:17.407275 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.407300 kubelet[2704]: W0317 18:50:17.407293 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.407431 kubelet[2704]: E0317 18:50:17.407306 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:17.407534 kubelet[2704]: E0317 18:50:17.407517 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.407534 kubelet[2704]: W0317 18:50:17.407533 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.407664 kubelet[2704]: E0317 18:50:17.407545 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.407751 kubelet[2704]: E0317 18:50:17.407736 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.407816 kubelet[2704]: W0317 18:50:17.407751 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.407816 kubelet[2704]: E0317 18:50:17.407764 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.407967 kubelet[2704]: E0317 18:50:17.407953 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.408037 kubelet[2704]: W0317 18:50:17.407967 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.408037 kubelet[2704]: E0317 18:50:17.407980 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.408180 kubelet[2704]: E0317 18:50:17.408166 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.408241 kubelet[2704]: W0317 18:50:17.408180 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.408241 kubelet[2704]: E0317 18:50:17.408192 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.408437 kubelet[2704]: E0317 18:50:17.408425 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.408437 kubelet[2704]: W0317 18:50:17.408437 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.408563 kubelet[2704]: E0317 18:50:17.408450 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:17.408727 kubelet[2704]: E0317 18:50:17.408709 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.408727 kubelet[2704]: W0317 18:50:17.408725 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.408850 kubelet[2704]: E0317 18:50:17.408737 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.408992 kubelet[2704]: E0317 18:50:17.408971 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.408992 kubelet[2704]: W0317 18:50:17.408986 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.409128 kubelet[2704]: E0317 18:50:17.408999 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.409208 kubelet[2704]: E0317 18:50:17.409193 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.409276 kubelet[2704]: W0317 18:50:17.409208 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.409276 kubelet[2704]: E0317 18:50:17.409220 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.409444 kubelet[2704]: E0317 18:50:17.409427 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.409521 kubelet[2704]: W0317 18:50:17.409459 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.409521 kubelet[2704]: E0317 18:50:17.409473 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.409735 kubelet[2704]: E0317 18:50:17.409719 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.409735 kubelet[2704]: W0317 18:50:17.409735 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.409850 kubelet[2704]: E0317 18:50:17.409747 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Mar 17 18:50:17.410064 kubelet[2704]: E0317 18:50:17.410046 2704 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Mar 17 18:50:17.410149 kubelet[2704]: W0317 18:50:17.410082 2704 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Mar 17 18:50:17.410149 kubelet[2704]: E0317 18:50:17.410097 2704 plugins.go:730] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Mar 17 18:50:17.872607 systemd[1]: run-containerd-runc-k8s.io-87fe825fc27e46430e1c22e23c1fbf1bf9021fcd248f978e76d8af7c967a1a79-runc.MUHLpR.mount: Deactivated successfully. Mar 17 18:50:17.872807 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-87fe825fc27e46430e1c22e23c1fbf1bf9021fcd248f978e76d8af7c967a1a79-rootfs.mount: Deactivated successfully. Mar 17 18:50:18.748496 env[1524]: time="2025-03-17T18:50:18.432283448Z" level=error msg="collecting metrics for 87fe825fc27e46430e1c22e23c1fbf1bf9021fcd248f978e76d8af7c967a1a79" error="cgroups: cgroup deleted: unknown" Mar 17 18:50:18.805872 env[1524]: time="2025-03-17T18:50:18.805806413Z" level=info msg="shim disconnected" id=87fe825fc27e46430e1c22e23c1fbf1bf9021fcd248f978e76d8af7c967a1a79 Mar 17 18:50:18.805872 env[1524]: time="2025-03-17T18:50:18.805865413Z" level=warning msg="cleaning up after shim disconnected" id=87fe825fc27e46430e1c22e23c1fbf1bf9021fcd248f978e76d8af7c967a1a79 namespace=k8s.io Mar 17 18:50:18.805872 env[1524]: time="2025-03-17T18:50:18.805879413Z" level=info msg="cleaning up dead shim" Mar 17 18:50:18.813448 env[1524]: time="2025-03-17T18:50:18.813406077Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:50:18Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3447 runtime=io.containerd.runc.v2\n" Mar 17 18:50:19.160924 waagent[1747]: 2025-03-17T18:50:19.160308Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state [incarnation 2] Mar 17 18:50:19.171421 waagent[1747]: 2025-03-17T18:50:19.171357Z INFO ExtHandler Mar 17 18:50:19.171598 waagent[1747]: 2025-03-17T18:50:19.171554Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: aaa15dde-cc94-41d6-9705-def94e007478 eTag: 123015679491943002 source: Fabric] Mar 17 18:50:19.172375 waagent[1747]: 2025-03-17T18:50:19.172306Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. 
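The flexvol-driver container started above (from ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.29.1) is an init-style container: it copies the uds driver binary into the kubelet FlexVolume plugin directory and then exits, which is why containerd subsequently reports the shim disconnecting and systemd cleans up its runc and rootfs mounts. A hedged sketch of checking that the driver landed where kubelet probes for it (path as logged; illustrative only):

package main

import (
	"fmt"
	"os"
)

func main() {
	// FlexVolume driver path that kubelet probes, taken from the log above
	const driver = "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds"
	info, err := os.Stat(driver)
	if err != nil {
		fmt.Println("driver not installed yet:", err)
		return
	}
	if info.Mode()&0o111 == 0 {
		fmt.Println("driver present but not executable")
		return
	}
	fmt.Println("uds driver installed; the FlexVolume probe errors should stop")
}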
Mar 17 18:50:19.174777 waagent[1747]: 2025-03-17T18:50:19.174717Z INFO ExtHandler Mar 17 18:50:19.175025 waagent[1747]: 2025-03-17T18:50:19.174971Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 2] Mar 17 18:50:19.223316 kubelet[2704]: E0317 18:50:19.223236 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:19.242506 waagent[1747]: 2025-03-17T18:50:19.242446Z INFO ExtHandler ExtHandler Downloading artifacts profile blob Mar 17 18:50:19.323329 waagent[1747]: 2025-03-17T18:50:19.323185Z INFO ExtHandler Downloaded certificate {'thumbprint': '4016A80D0561C251388267BC8A44866F24A05316', 'hasPrivateKey': False} Mar 17 18:50:19.324523 waagent[1747]: 2025-03-17T18:50:19.324437Z INFO ExtHandler Downloaded certificate {'thumbprint': '21941C2D807910C8B6B07F8B0E9CB0880C5FCDFF', 'hasPrivateKey': True} Mar 17 18:50:19.325700 waagent[1747]: 2025-03-17T18:50:19.325617Z INFO ExtHandler Fetch goal state completed Mar 17 18:50:19.326636 waagent[1747]: 2025-03-17T18:50:19.326574Z INFO ExtHandler ExtHandler VM enabled for RSM updates, switching to RSM update mode Mar 17 18:50:19.327883 waagent[1747]: 2025-03-17T18:50:19.327825Z INFO ExtHandler ExtHandler Mar 17 18:50:19.328033 waagent[1747]: 2025-03-17T18:50:19.327981Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_2 channel: WireServer source: Fabric activity: 7c22d8b4-335e-46fe-935e-5a370126c09e correlation 4c90e8bf-51cc-4eb8-bebf-4f00b2b45bc7 created: 2025-03-17T18:50:13.261635Z] Mar 17 18:50:19.328768 waagent[1747]: 2025-03-17T18:50:19.328713Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
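The recurring "network is not ready ... cni plugin not initialized" messages persist until Calico's install-cni step (seen later in this log) writes a network configuration into /etc/cni/net.d, the directory containerd's CRI plugin watches and reports on with "no network config found in /etc/cni/net.d". A minimal readiness-check sketch, assuming the default config directory (illustrative, not containerd's code):

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// default containerd CRI CNI config directory (an assumption; it matches the log message)
	confDir := "/etc/cni/net.d"
	var configs []string
	for _, pattern := range []string{"*.conf", "*.conflist", "*.json"} {
		matches, err := filepath.Glob(filepath.Join(confDir, pattern))
		if err == nil {
			configs = append(configs, matches...)
		}
	}
	if len(configs) == 0 {
		fmt.Println("cni plugin not initialized: no network config found in", confDir)
		os.Exit(1)
	}
	// Calico's install-cni container writes these once calico-node is running
	fmt.Println("found CNI config(s):", configs)
}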
Mar 17 18:50:19.330527 waagent[1747]: 2025-03-17T18:50:19.330468Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_2 2 ms] Mar 17 18:50:19.379569 env[1524]: time="2025-03-17T18:50:19.373889875Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\"" Mar 17 18:50:21.223885 kubelet[2704]: E0317 18:50:21.223805 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:23.223076 kubelet[2704]: E0317 18:50:23.223027 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:23.820799 env[1524]: time="2025-03-17T18:50:23.820749161Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:23.825551 env[1524]: time="2025-03-17T18:50:23.825504898Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:23.828330 env[1524]: time="2025-03-17T18:50:23.828289420Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/cni:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:23.831994 env[1524]: time="2025-03-17T18:50:23.831963549Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/cni@sha256:21e759d51c90dfb34fc1397dc180dd3a3fb564c2b0580d2f61ffe108f2a3c94b,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:23.832576 env[1524]: time="2025-03-17T18:50:23.832544954Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.29.1\" returns image reference \"sha256:7dd6ea186aba0d7a1791a79d426fe854527ca95192b26bbd19e8baf8373f7d0e\"" Mar 17 18:50:23.835551 env[1524]: time="2025-03-17T18:50:23.835452376Z" level=info msg="CreateContainer within sandbox \"2e1219721c525786caca484fed284f7ed9918ccab0ecc9bb3e3aba917a7ad52c\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Mar 17 18:50:23.874641 env[1524]: time="2025-03-17T18:50:23.874552083Z" level=info msg="CreateContainer within sandbox \"2e1219721c525786caca484fed284f7ed9918ccab0ecc9bb3e3aba917a7ad52c\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"bfd5a7968f87b929bcba0aca26665dc6e0ce339d4e818074453da879f372c962\"" Mar 17 18:50:23.876403 env[1524]: time="2025-03-17T18:50:23.875401189Z" level=info msg="StartContainer for \"bfd5a7968f87b929bcba0aca26665dc6e0ce339d4e818074453da879f372c962\"" Mar 17 18:50:23.934014 env[1524]: time="2025-03-17T18:50:23.933941648Z" level=info msg="StartContainer for \"bfd5a7968f87b929bcba0aca26665dc6e0ce339d4e818074453da879f372c962\" returns successfully" Mar 17 18:50:25.223625 kubelet[2704]: E0317 18:50:25.223573 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:25.323250 env[1524]: time="2025-03-17T18:50:25.323180344Z" level=error msg="failed to reload cni configuration after receiving fs change event(\"/etc/cni/net.d/calico-kubeconfig\": WRITE)" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Mar 17 18:50:25.350472 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-bfd5a7968f87b929bcba0aca26665dc6e0ce339d4e818074453da879f372c962-rootfs.mount: Deactivated successfully. Mar 17 18:50:25.386192 kubelet[2704]: I0317 18:50:25.385218 2704 kubelet_node_status.go:497] "Fast updating node status as it just became ready" Mar 17 18:50:25.413588 kubelet[2704]: I0317 18:50:25.413534 2704 topology_manager.go:215] "Topology Admit Handler" podUID="309db727-4be2-4bfe-b325-49e81b2078ad" podNamespace="kube-system" podName="coredns-7db6d8ff4d-hvsfz" Mar 17 18:50:25.419373 kubelet[2704]: I0317 18:50:25.419327 2704 topology_manager.go:215] "Topology Admit Handler" podUID="7349d13c-6ed5-461e-80f1-d59a9207db7f" podNamespace="kube-system" podName="coredns-7db6d8ff4d-58lvk" Mar 17 18:50:25.428055 kubelet[2704]: I0317 18:50:25.428014 2704 topology_manager.go:215] "Topology Admit Handler" podUID="9b7bbe37-740f-49ee-a599-03cedfded2d6" podNamespace="calico-system" podName="calico-kube-controllers-ff8db6ccc-8lpcj" Mar 17 18:50:25.428429 kubelet[2704]: I0317 18:50:25.428405 2704 topology_manager.go:215] "Topology Admit Handler" podUID="7db7f05a-c0bd-436a-b834-87825c403071" podNamespace="calico-apiserver" podName="calico-apiserver-6b476d9448-hjl2z" Mar 17 18:50:25.428721 kubelet[2704]: I0317 18:50:25.428697 2704 topology_manager.go:215] "Topology Admit Handler" podUID="181cc807-8e75-4c0c-9bab-b609021ab74e" podNamespace="calico-apiserver" podName="calico-apiserver-6b476d9448-c9626" Mar 17 18:50:25.503208 kubelet[2704]: I0317 18:50:25.503162 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ds4f\" (UniqueName: \"kubernetes.io/projected/7db7f05a-c0bd-436a-b834-87825c403071-kube-api-access-5ds4f\") pod \"calico-apiserver-6b476d9448-hjl2z\" (UID: \"7db7f05a-c0bd-436a-b834-87825c403071\") " pod="calico-apiserver/calico-apiserver-6b476d9448-hjl2z" Mar 17 18:50:25.503395 kubelet[2704]: I0317 18:50:25.503241 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfkk4\" (UniqueName: \"kubernetes.io/projected/309db727-4be2-4bfe-b325-49e81b2078ad-kube-api-access-pfkk4\") pod \"coredns-7db6d8ff4d-hvsfz\" (UID: \"309db727-4be2-4bfe-b325-49e81b2078ad\") " pod="kube-system/coredns-7db6d8ff4d-hvsfz" Mar 17 18:50:25.503395 kubelet[2704]: I0317 18:50:25.503276 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/181cc807-8e75-4c0c-9bab-b609021ab74e-calico-apiserver-certs\") pod \"calico-apiserver-6b476d9448-c9626\" (UID: \"181cc807-8e75-4c0c-9bab-b609021ab74e\") " pod="calico-apiserver/calico-apiserver-6b476d9448-c9626" Mar 17 18:50:25.503395 kubelet[2704]: I0317 18:50:25.503311 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/309db727-4be2-4bfe-b325-49e81b2078ad-config-volume\") pod 
\"coredns-7db6d8ff4d-hvsfz\" (UID: \"309db727-4be2-4bfe-b325-49e81b2078ad\") " pod="kube-system/coredns-7db6d8ff4d-hvsfz" Mar 17 18:50:25.503395 kubelet[2704]: I0317 18:50:25.503333 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7349d13c-6ed5-461e-80f1-d59a9207db7f-config-volume\") pod \"coredns-7db6d8ff4d-58lvk\" (UID: \"7349d13c-6ed5-461e-80f1-d59a9207db7f\") " pod="kube-system/coredns-7db6d8ff4d-58lvk" Mar 17 18:50:25.503395 kubelet[2704]: I0317 18:50:25.503353 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlshf\" (UniqueName: \"kubernetes.io/projected/9b7bbe37-740f-49ee-a599-03cedfded2d6-kube-api-access-zlshf\") pod \"calico-kube-controllers-ff8db6ccc-8lpcj\" (UID: \"9b7bbe37-740f-49ee-a599-03cedfded2d6\") " pod="calico-system/calico-kube-controllers-ff8db6ccc-8lpcj" Mar 17 18:50:25.503634 kubelet[2704]: I0317 18:50:25.503375 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsvqb\" (UniqueName: \"kubernetes.io/projected/7349d13c-6ed5-461e-80f1-d59a9207db7f-kube-api-access-bsvqb\") pod \"coredns-7db6d8ff4d-58lvk\" (UID: \"7349d13c-6ed5-461e-80f1-d59a9207db7f\") " pod="kube-system/coredns-7db6d8ff4d-58lvk" Mar 17 18:50:25.503634 kubelet[2704]: I0317 18:50:25.503401 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b7bbe37-740f-49ee-a599-03cedfded2d6-tigera-ca-bundle\") pod \"calico-kube-controllers-ff8db6ccc-8lpcj\" (UID: \"9b7bbe37-740f-49ee-a599-03cedfded2d6\") " pod="calico-system/calico-kube-controllers-ff8db6ccc-8lpcj" Mar 17 18:50:25.503634 kubelet[2704]: I0317 18:50:25.503424 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/7db7f05a-c0bd-436a-b834-87825c403071-calico-apiserver-certs\") pod \"calico-apiserver-6b476d9448-hjl2z\" (UID: \"7db7f05a-c0bd-436a-b834-87825c403071\") " pod="calico-apiserver/calico-apiserver-6b476d9448-hjl2z" Mar 17 18:50:25.503634 kubelet[2704]: I0317 18:50:25.503450 2704 reconciler_common.go:247] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-df85r\" (UniqueName: \"kubernetes.io/projected/181cc807-8e75-4c0c-9bab-b609021ab74e-kube-api-access-df85r\") pod \"calico-apiserver-6b476d9448-c9626\" (UID: \"181cc807-8e75-4c0c-9bab-b609021ab74e\") " pod="calico-apiserver/calico-apiserver-6b476d9448-c9626" Mar 17 18:50:25.734114 env[1524]: time="2025-03-17T18:50:25.733700467Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b476d9448-c9626,Uid:181cc807-8e75-4c0c-9bab-b609021ab74e,Namespace:calico-apiserver,Attempt:0,}" Mar 17 18:50:25.734114 env[1524]: time="2025-03-17T18:50:25.733703067Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-58lvk,Uid:7349d13c-6ed5-461e-80f1-d59a9207db7f,Namespace:kube-system,Attempt:0,}" Mar 17 18:50:25.736135 env[1524]: time="2025-03-17T18:50:25.736092785Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b476d9448-hjl2z,Uid:7db7f05a-c0bd-436a-b834-87825c403071,Namespace:calico-apiserver,Attempt:0,}" Mar 17 18:50:25.736431 env[1524]: time="2025-03-17T18:50:25.736398687Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-7db6d8ff4d-hvsfz,Uid:309db727-4be2-4bfe-b325-49e81b2078ad,Namespace:kube-system,Attempt:0,}" Mar 17 18:50:25.739360 env[1524]: time="2025-03-17T18:50:25.739210409Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-ff8db6ccc-8lpcj,Uid:9b7bbe37-740f-49ee-a599-03cedfded2d6,Namespace:calico-system,Attempt:0,}" Mar 17 18:50:27.024735 env[1524]: time="2025-03-17T18:50:27.024685474Z" level=info msg="shim disconnected" id=bfd5a7968f87b929bcba0aca26665dc6e0ce339d4e818074453da879f372c962 Mar 17 18:50:27.025283 env[1524]: time="2025-03-17T18:50:27.024734174Z" level=warning msg="cleaning up after shim disconnected" id=bfd5a7968f87b929bcba0aca26665dc6e0ce339d4e818074453da879f372c962 namespace=k8s.io Mar 17 18:50:27.025283 env[1524]: time="2025-03-17T18:50:27.024755675Z" level=info msg="cleaning up dead shim" Mar 17 18:50:27.035727 env[1524]: time="2025-03-17T18:50:27.035681655Z" level=warning msg="cleanup warnings time=\"2025-03-17T18:50:27Z\" level=info msg=\"starting signal loop\" namespace=k8s.io pid=3523 runtime=io.containerd.runc.v2\n" Mar 17 18:50:27.227903 env[1524]: time="2025-03-17T18:50:27.227858276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tt74p,Uid:eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce,Namespace:calico-system,Attempt:0,}" Mar 17 18:50:27.366381 env[1524]: time="2025-03-17T18:50:27.365844396Z" level=error msg="Failed to destroy network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.366381 env[1524]: time="2025-03-17T18:50:27.366235999Z" level=error msg="encountered an error cleaning up failed sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.366381 env[1524]: time="2025-03-17T18:50:27.366302100Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-ff8db6ccc-8lpcj,Uid:9b7bbe37-740f-49ee-a599-03cedfded2d6,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.366790 kubelet[2704]: E0317 18:50:27.366546 2704 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.366790 kubelet[2704]: E0317 18:50:27.366631 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-ff8db6ccc-8lpcj" Mar 17 18:50:27.366790 kubelet[2704]: E0317 18:50:27.366659 2704 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-ff8db6ccc-8lpcj" Mar 17 18:50:27.367285 kubelet[2704]: E0317 18:50:27.366732 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-ff8db6ccc-8lpcj_calico-system(9b7bbe37-740f-49ee-a599-03cedfded2d6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-ff8db6ccc-8lpcj_calico-system(9b7bbe37-740f-49ee-a599-03cedfded2d6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-ff8db6ccc-8lpcj" podUID="9b7bbe37-740f-49ee-a599-03cedfded2d6" Mar 17 18:50:27.398571 env[1524]: time="2025-03-17T18:50:27.398515838Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\"" Mar 17 18:50:27.413673 kubelet[2704]: I0317 18:50:27.412091 2704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:27.419922 env[1524]: time="2025-03-17T18:50:27.419876796Z" level=info msg="StopPodSandbox for \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\"" Mar 17 18:50:27.444334 env[1524]: time="2025-03-17T18:50:27.444271676Z" level=error msg="Failed to destroy network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.444993 env[1524]: time="2025-03-17T18:50:27.444940581Z" level=error msg="encountered an error cleaning up failed sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.445203 env[1524]: time="2025-03-17T18:50:27.445158683Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b476d9448-c9626,Uid:181cc807-8e75-4c0c-9bab-b609021ab74e,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.446105 kubelet[2704]: E0317 18:50:27.445593 2704 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc 
= failed to setup network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.446105 kubelet[2704]: E0317 18:50:27.445697 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b476d9448-c9626" Mar 17 18:50:27.446105 kubelet[2704]: E0317 18:50:27.445726 2704 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b476d9448-c9626" Mar 17 18:50:27.446310 kubelet[2704]: E0317 18:50:27.445791 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b476d9448-c9626_calico-apiserver(181cc807-8e75-4c0c-9bab-b609021ab74e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b476d9448-c9626_calico-apiserver(181cc807-8e75-4c0c-9bab-b609021ab74e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b476d9448-c9626" podUID="181cc807-8e75-4c0c-9bab-b609021ab74e" Mar 17 18:50:27.458696 env[1524]: time="2025-03-17T18:50:27.458636782Z" level=error msg="Failed to destroy network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.459063 env[1524]: time="2025-03-17T18:50:27.459022585Z" level=error msg="encountered an error cleaning up failed sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.459144 env[1524]: time="2025-03-17T18:50:27.459087086Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-58lvk,Uid:7349d13c-6ed5-461e-80f1-d59a9207db7f,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.459801 kubelet[2704]: E0317 18:50:27.459342 
2704 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.459801 kubelet[2704]: E0317 18:50:27.459409 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-58lvk" Mar 17 18:50:27.459801 kubelet[2704]: E0317 18:50:27.459435 2704 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-58lvk" Mar 17 18:50:27.460001 kubelet[2704]: E0317 18:50:27.459498 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-58lvk_kube-system(7349d13c-6ed5-461e-80f1-d59a9207db7f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-58lvk_kube-system(7349d13c-6ed5-461e-80f1-d59a9207db7f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-58lvk" podUID="7349d13c-6ed5-461e-80f1-d59a9207db7f" Mar 17 18:50:27.488091 env[1524]: time="2025-03-17T18:50:27.488022200Z" level=error msg="Failed to destroy network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.488458 env[1524]: time="2025-03-17T18:50:27.488405603Z" level=error msg="encountered an error cleaning up failed sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.488564 env[1524]: time="2025-03-17T18:50:27.488479203Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-hvsfz,Uid:309db727-4be2-4bfe-b325-49e81b2078ad,Namespace:kube-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.490920 
kubelet[2704]: E0317 18:50:27.488733 2704 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.490920 kubelet[2704]: E0317 18:50:27.488824 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-hvsfz" Mar 17 18:50:27.490920 kubelet[2704]: E0317 18:50:27.488862 2704 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7db6d8ff4d-hvsfz" Mar 17 18:50:27.491358 kubelet[2704]: E0317 18:50:27.488953 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7db6d8ff4d-hvsfz_kube-system(309db727-4be2-4bfe-b325-49e81b2078ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7db6d8ff4d-hvsfz_kube-system(309db727-4be2-4bfe-b325-49e81b2078ad)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-hvsfz" podUID="309db727-4be2-4bfe-b325-49e81b2078ad" Mar 17 18:50:27.499437 env[1524]: time="2025-03-17T18:50:27.499391184Z" level=error msg="Failed to destroy network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.499754 env[1524]: time="2025-03-17T18:50:27.499716486Z" level=error msg="encountered an error cleaning up failed sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.499860 env[1524]: time="2025-03-17T18:50:27.499773387Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b476d9448-hjl2z,Uid:7db7f05a-c0bd-436a-b834-87825c403071,Namespace:calico-apiserver,Attempt:0,} failed, error" error="failed to setup network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Mar 17 18:50:27.500013 kubelet[2704]: E0317 18:50:27.499977 2704 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.500093 kubelet[2704]: E0317 18:50:27.500032 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b476d9448-hjl2z" Mar 17 18:50:27.500093 kubelet[2704]: E0317 18:50:27.500055 2704 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6b476d9448-hjl2z" Mar 17 18:50:27.500177 kubelet[2704]: E0317 18:50:27.500102 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6b476d9448-hjl2z_calico-apiserver(7db7f05a-c0bd-436a-b834-87825c403071)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6b476d9448-hjl2z_calico-apiserver(7db7f05a-c0bd-436a-b834-87825c403071)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b476d9448-hjl2z" podUID="7db7f05a-c0bd-436a-b834-87825c403071" Mar 17 18:50:27.501353 env[1524]: time="2025-03-17T18:50:27.501313198Z" level=error msg="Failed to destroy network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.501746 env[1524]: time="2025-03-17T18:50:27.501704301Z" level=error msg="encountered an error cleaning up failed sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\", marking sandbox state as SANDBOX_UNKNOWN" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.501830 env[1524]: time="2025-03-17T18:50:27.501753901Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tt74p,Uid:eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce,Namespace:calico-system,Attempt:0,} failed, error" error="failed to setup network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.502038 kubelet[2704]: E0317 18:50:27.501909 2704 remote_runtime.go:193] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.502038 kubelet[2704]: E0317 18:50:27.501956 2704 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tt74p" Mar 17 18:50:27.502038 kubelet[2704]: E0317 18:50:27.501985 2704 kuberuntime_manager.go:1166] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tt74p" Mar 17 18:50:27.502186 kubelet[2704]: E0317 18:50:27.502026 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tt74p_calico-system(eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tt74p_calico-system(eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:27.524528 env[1524]: time="2025-03-17T18:50:27.524462769Z" level=error msg="StopPodSandbox for \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\" failed" error="failed to destroy network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:27.524721 kubelet[2704]: E0317 18:50:27.524683 2704 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:27.524812 kubelet[2704]: E0317 18:50:27.524740 2704 kuberuntime_manager.go:1375] "Failed to stop sandbox" 
podSandboxID={"Type":"containerd","ID":"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b"} Mar 17 18:50:27.524866 kubelet[2704]: E0317 18:50:27.524804 2704 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"9b7bbe37-740f-49ee-a599-03cedfded2d6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:50:27.524866 kubelet[2704]: E0317 18:50:27.524835 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"9b7bbe37-740f-49ee-a599-03cedfded2d6\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-ff8db6ccc-8lpcj" podUID="9b7bbe37-740f-49ee-a599-03cedfded2d6" Mar 17 18:50:28.156317 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e-shm.mount: Deactivated successfully. Mar 17 18:50:28.156554 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0-shm.mount: Deactivated successfully. Mar 17 18:50:28.157264 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b-shm.mount: Deactivated successfully. 
Mar 17 18:50:28.416433 kubelet[2704]: I0317 18:50:28.414520 2704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:28.417113 env[1524]: time="2025-03-17T18:50:28.417060626Z" level=info msg="StopPodSandbox for \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\"" Mar 17 18:50:28.418239 kubelet[2704]: I0317 18:50:28.418004 2704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:28.418876 env[1524]: time="2025-03-17T18:50:28.418847639Z" level=info msg="StopPodSandbox for \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\"" Mar 17 18:50:28.422575 kubelet[2704]: I0317 18:50:28.422547 2704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:28.424617 env[1524]: time="2025-03-17T18:50:28.423739575Z" level=info msg="StopPodSandbox for \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\"" Mar 17 18:50:28.425457 kubelet[2704]: I0317 18:50:28.424984 2704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:28.425844 env[1524]: time="2025-03-17T18:50:28.425811190Z" level=info msg="StopPodSandbox for \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\"" Mar 17 18:50:28.426925 kubelet[2704]: I0317 18:50:28.426515 2704 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:28.427114 env[1524]: time="2025-03-17T18:50:28.427086999Z" level=info msg="StopPodSandbox for \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\"" Mar 17 18:50:28.519600 env[1524]: time="2025-03-17T18:50:28.519535774Z" level=error msg="StopPodSandbox for \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\" failed" error="failed to destroy network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:28.519902 kubelet[2704]: E0317 18:50:28.519839 2704 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:28.520015 kubelet[2704]: E0317 18:50:28.519915 2704 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f"} Mar 17 18:50:28.520015 kubelet[2704]: E0317 18:50:28.519974 2704 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"309db727-4be2-4bfe-b325-49e81b2078ad\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox 
\\\"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:50:28.520184 kubelet[2704]: E0317 18:50:28.520003 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"309db727-4be2-4bfe-b325-49e81b2078ad\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-hvsfz" podUID="309db727-4be2-4bfe-b325-49e81b2078ad" Mar 17 18:50:28.521192 env[1524]: time="2025-03-17T18:50:28.521136685Z" level=error msg="StopPodSandbox for \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\" failed" error="failed to destroy network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:28.521405 kubelet[2704]: E0317 18:50:28.521370 2704 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:28.521512 kubelet[2704]: E0317 18:50:28.521431 2704 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e"} Mar 17 18:50:28.521512 kubelet[2704]: E0317 18:50:28.521470 2704 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:50:28.521639 kubelet[2704]: E0317 18:50:28.521530 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tt74p" podUID="eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce" Mar 17 18:50:28.528973 env[1524]: time="2025-03-17T18:50:28.528916542Z" level=error msg="StopPodSandbox for \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\" failed" error="failed to 
destroy network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:28.529182 kubelet[2704]: E0317 18:50:28.529145 2704 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:28.529276 kubelet[2704]: E0317 18:50:28.529206 2704 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0"} Mar 17 18:50:28.529276 kubelet[2704]: E0317 18:50:28.529245 2704 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"181cc807-8e75-4c0c-9bab-b609021ab74e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:50:28.529401 kubelet[2704]: E0317 18:50:28.529286 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"181cc807-8e75-4c0c-9bab-b609021ab74e\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6b476d9448-c9626" podUID="181cc807-8e75-4c0c-9bab-b609021ab74e" Mar 17 18:50:28.538586 env[1524]: time="2025-03-17T18:50:28.538539612Z" level=error msg="StopPodSandbox for \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\" failed" error="failed to destroy network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:28.538940 kubelet[2704]: E0317 18:50:28.538894 2704 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:28.539047 kubelet[2704]: E0317 18:50:28.538948 2704 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e"} Mar 17 18:50:28.539047 kubelet[2704]: E0317 
18:50:28.538982 2704 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7349d13c-6ed5-461e-80f1-d59a9207db7f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:50:28.539047 kubelet[2704]: E0317 18:50:28.539012 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7349d13c-6ed5-461e-80f1-d59a9207db7f\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7db6d8ff4d-58lvk" podUID="7349d13c-6ed5-461e-80f1-d59a9207db7f" Mar 17 18:50:28.549847 env[1524]: time="2025-03-17T18:50:28.549773094Z" level=error msg="StopPodSandbox for \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\" failed" error="failed to destroy network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Mar 17 18:50:28.551344 kubelet[2704]: E0317 18:50:28.550328 2704 remote_runtime.go:222] "StopPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to destroy network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\": plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" podSandboxID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:28.551344 kubelet[2704]: E0317 18:50:28.550359 2704 kuberuntime_manager.go:1375] "Failed to stop sandbox" podSandboxID={"Type":"containerd","ID":"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b"} Mar 17 18:50:28.551344 kubelet[2704]: E0317 18:50:28.550381 2704 kuberuntime_manager.go:1075] "killPodWithSyncResult failed" err="failed to \"KillPodSandbox\" for \"7db7f05a-c0bd-436a-b834-87825c403071\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" Mar 17 18:50:28.551344 kubelet[2704]: E0317 18:50:28.550399 2704 pod_workers.go:1298] "Error syncing pod, skipping" err="failed to \"KillPodSandbox\" for \"7db7f05a-c0bd-436a-b834-87825c403071\" with KillPodSandboxError: \"rpc error: code = Unknown desc = failed to destroy network for sandbox \\\"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\\\": plugin type=\\\"calico\\\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-apiserver/calico-apiserver-6b476d9448-hjl2z" podUID="7db7f05a-c0bd-436a-b834-87825c403071" Mar 17 18:50:34.150502 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3792926189.mount: Deactivated successfully. Mar 17 18:50:34.197194 env[1524]: time="2025-03-17T18:50:34.197146547Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:34.204577 env[1524]: time="2025-03-17T18:50:34.204533997Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:34.207745 env[1524]: time="2025-03-17T18:50:34.207710318Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:34.212183 env[1524]: time="2025-03-17T18:50:34.212149948Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node@sha256:99c3917516efe1f807a0cfdf2d14b628b7c5cc6bd8a9ee5a253154f31756bea1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:34.212642 env[1524]: time="2025-03-17T18:50:34.212612251Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.29.1\" returns image reference \"sha256:feb26d4585d68e875d9bd9bd6c27ea9f2d5c9ed9ef70f8b8cb0ebb0559a1d664\"" Mar 17 18:50:34.233736 env[1524]: time="2025-03-17T18:50:34.233587893Z" level=info msg="CreateContainer within sandbox \"2e1219721c525786caca484fed284f7ed9918ccab0ecc9bb3e3aba917a7ad52c\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Mar 17 18:50:34.263447 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount918485900.mount: Deactivated successfully. Mar 17 18:50:34.277519 env[1524]: time="2025-03-17T18:50:34.277472389Z" level=info msg="CreateContainer within sandbox \"2e1219721c525786caca484fed284f7ed9918ccab0ecc9bb3e3aba917a7ad52c\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"7724dc134a0c6248f84c12c30eed51e14a73f0ffa47fe0423eec6266d70ad547\"" Mar 17 18:50:34.279339 env[1524]: time="2025-03-17T18:50:34.278199493Z" level=info msg="StartContainer for \"7724dc134a0c6248f84c12c30eed51e14a73f0ffa47fe0423eec6266d70ad547\"" Mar 17 18:50:34.338380 env[1524]: time="2025-03-17T18:50:34.338327199Z" level=info msg="StartContainer for \"7724dc134a0c6248f84c12c30eed51e14a73f0ffa47fe0423eec6266d70ad547\" returns successfully" Mar 17 18:50:34.462609 kubelet[2704]: I0317 18:50:34.461725 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-mbfjj" podStartSLOduration=3.031985983 podStartE2EDuration="24.461698931s" podCreationTimestamp="2025-03-17 18:50:10 +0000 UTC" firstStartedPulling="2025-03-17 18:50:12.78387131 +0000 UTC m=+25.031971408" lastFinishedPulling="2025-03-17 18:50:34.213584258 +0000 UTC m=+46.461684356" observedRunningTime="2025-03-17 18:50:34.461445129 +0000 UTC m=+46.709545227" watchObservedRunningTime="2025-03-17 18:50:34.461698931 +0000 UTC m=+46.709799029" Mar 17 18:50:34.633003 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Mar 17 18:50:34.633189 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Mar 17 18:50:35.927639 kernel: kauditd_printk_skb: 8 callbacks suppressed Mar 17 18:50:35.927828 kernel: audit: type=1400 audit(1742237435.909:296): avc: denied { write } for pid=4008 comm="tee" name="fd" dev="proc" ino=30912 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:50:35.909000 audit[4008]: AVC avc: denied { write } for pid=4008 comm="tee" name="fd" dev="proc" ino=30912 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:50:35.909000 audit[4008]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc703a8a14 a2=241 a3=1b6 items=1 ppid=3987 pid=4008 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:35.909000 audit: CWD cwd="/etc/service/enabled/felix/log" Mar 17 18:50:35.967455 kernel: audit: type=1300 audit(1742237435.909:296): arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc703a8a14 a2=241 a3=1b6 items=1 ppid=3987 pid=4008 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:35.967579 kernel: audit: type=1307 audit(1742237435.909:296): cwd="/etc/service/enabled/felix/log" Mar 17 18:50:35.909000 audit: PATH item=0 name="/dev/fd/63" inode=30530 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:50:35.981707 kernel: audit: type=1302 audit(1742237435.909:296): item=0 name="/dev/fd/63" inode=30530 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:50:35.909000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:50:35.996700 kernel: audit: type=1327 audit(1742237435.909:296): proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:50:35.947000 audit[4029]: AVC avc: denied { write } for pid=4029 comm="tee" name="fd" dev="proc" ino=30938 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:50:36.010698 kernel: audit: type=1400 audit(1742237435.947:297): avc: denied { write } for pid=4029 comm="tee" name="fd" dev="proc" ino=30938 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:50:35.947000 audit[4029]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc03ac6a14 a2=241 a3=1b6 items=1 ppid=3980 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:35.947000 audit: CWD cwd="/etc/service/enabled/bird6/log" Mar 17 18:50:36.036841 kernel: audit: type=1300 audit(1742237435.947:297): arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc03ac6a14 a2=241 a3=1b6 items=1 ppid=3980 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" 
subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:36.036964 kernel: audit: type=1307 audit(1742237435.947:297): cwd="/etc/service/enabled/bird6/log" Mar 17 18:50:35.947000 audit: PATH item=0 name="/dev/fd/63" inode=30921 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:50:36.050102 kernel: audit: type=1302 audit(1742237435.947:297): item=0 name="/dev/fd/63" inode=30921 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:50:36.050240 kernel: audit: type=1327 audit(1742237435.947:297): proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:50:35.947000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:50:35.947000 audit[4024]: AVC avc: denied { write } for pid=4024 comm="tee" name="fd" dev="proc" ino=30942 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:50:35.947000 audit[4024]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc145f5a04 a2=241 a3=1b6 items=1 ppid=3989 pid=4024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:35.947000 audit: CWD cwd="/etc/service/enabled/allocate-tunnel-addrs/log" Mar 17 18:50:35.947000 audit: PATH item=0 name="/dev/fd/63" inode=30918 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:50:35.947000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:50:35.957000 audit[4033]: AVC avc: denied { write } for pid=4033 comm="tee" name="fd" dev="proc" ino=30950 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:50:35.957000 audit[4033]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffce6531a16 a2=241 a3=1b6 items=1 ppid=3981 pid=4033 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:35.957000 audit: CWD cwd="/etc/service/enabled/cni/log" Mar 17 18:50:35.957000 audit: PATH item=0 name="/dev/fd/63" inode=30929 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:50:35.957000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:50:35.967000 audit[4031]: AVC avc: denied { write } for pid=4031 comm="tee" name="fd" dev="proc" ino=30954 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:50:35.967000 audit[4031]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffd06c74a14 a2=241 a3=1b6 items=1 ppid=3984 pid=4031 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:35.967000 audit: CWD cwd="/etc/service/enabled/confd/log" Mar 17 18:50:35.967000 audit: PATH item=0 name="/dev/fd/63" inode=30926 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:50:35.967000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:50:35.973000 audit[4048]: AVC avc: denied { write } for pid=4048 comm="tee" name="fd" dev="proc" ino=30959 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:50:35.973000 audit[4048]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7ffc2aed5a15 a2=241 a3=1b6 items=1 ppid=3993 pid=4048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:35.973000 audit: CWD cwd="/etc/service/enabled/bird/log" Mar 17 18:50:35.973000 audit: PATH item=0 name="/dev/fd/63" inode=30947 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:50:35.973000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:50:36.011000 audit[4055]: AVC avc: denied { write } for pid=4055 comm="tee" name="fd" dev="proc" ino=30539 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=dir permissive=0 Mar 17 18:50:36.011000 audit[4055]: SYSCALL arch=c000003e syscall=257 success=yes exit=3 a0=ffffff9c a1=7fff85322a05 a2=241 a3=1b6 items=1 ppid=3991 pid=4055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="tee" exe="/usr/bin/coreutils" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:36.011000 audit: CWD cwd="/etc/service/enabled/node-status-reporter/log" Mar 17 18:50:36.011000 audit: PATH item=0 name="/dev/fd/63" inode=30536 dev=00:0c mode=010600 ouid=0 ogid=0 rdev=00:00 obj=system_u:system_r:kernel_t:s0 nametype=NORMAL cap_fp=0 cap_fi=0 cap_fe=0 cap_fver=0 cap_frootid=0 Mar 17 18:50:36.011000 audit: PROCTITLE proctitle=2F7573722F62696E2F636F72657574696C73002D2D636F72657574696C732D70726F672D73686562616E673D746565002F7573722F62696E2F746565002F6465762F66642F3633 Mar 17 18:50:39.225940 env[1524]: time="2025-03-17T18:50:39.224905979Z" level=info msg="StopPodSandbox for \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\"" Mar 17 18:50:39.225940 env[1524]: time="2025-03-17T18:50:39.225383382Z" level=info msg="StopPodSandbox for \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\"" Mar 17 18:50:39.225940 env[1524]: time="2025-03-17T18:50:39.225714284Z" level=info msg="StopPodSandbox for \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\"" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.318 [INFO][4140] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.318 [INFO][4140] 
cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" iface="eth0" netns="/var/run/netns/cni-83484b02-6aca-ba13-b4ee-7be7231b60fb" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.318 [INFO][4140] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" iface="eth0" netns="/var/run/netns/cni-83484b02-6aca-ba13-b4ee-7be7231b60fb" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.318 [INFO][4140] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" iface="eth0" netns="/var/run/netns/cni-83484b02-6aca-ba13-b4ee-7be7231b60fb" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.318 [INFO][4140] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.318 [INFO][4140] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.417 [INFO][4175] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" HandleID="k8s-pod-network.84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.420 [INFO][4175] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.420 [INFO][4175] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.448 [WARNING][4175] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" HandleID="k8s-pod-network.84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.448 [INFO][4175] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" HandleID="k8s-pod-network.84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.452 [INFO][4175] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:39.460514 env[1524]: 2025-03-17 18:50:39.459 [INFO][4140] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:39.470222 systemd[1]: run-netns-cni\x2d83484b02\x2d6aca\x2dba13\x2db4ee\x2d7be7231b60fb.mount: Deactivated successfully. 
Mar 17 18:50:39.472281 env[1524]: time="2025-03-17T18:50:39.472235652Z" level=info msg="TearDown network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\" successfully" Mar 17 18:50:39.472407 env[1524]: time="2025-03-17T18:50:39.472389353Z" level=info msg="StopPodSandbox for \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\" returns successfully" Mar 17 18:50:39.473295 env[1524]: time="2025-03-17T18:50:39.473263359Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b476d9448-hjl2z,Uid:7db7f05a-c0bd-436a-b834-87825c403071,Namespace:calico-apiserver,Attempt:1,}" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.389 [INFO][4161] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.393 [INFO][4161] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" iface="eth0" netns="/var/run/netns/cni-ec5e1bc5-3057-b0c4-c65f-10e174e490a8" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.393 [INFO][4161] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" iface="eth0" netns="/var/run/netns/cni-ec5e1bc5-3057-b0c4-c65f-10e174e490a8" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.397 [INFO][4161] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" iface="eth0" netns="/var/run/netns/cni-ec5e1bc5-3057-b0c4-c65f-10e174e490a8" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.397 [INFO][4161] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.397 [INFO][4161] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.481 [INFO][4184] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" HandleID="k8s-pod-network.ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.482 [INFO][4184] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.482 [INFO][4184] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.503 [WARNING][4184] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" HandleID="k8s-pod-network.ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.503 [INFO][4184] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" HandleID="k8s-pod-network.ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.505 [INFO][4184] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:39.509139 env[1524]: 2025-03-17 18:50:39.507 [INFO][4161] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:39.516888 systemd[1]: run-netns-cni\x2dec5e1bc5\x2d3057\x2db0c4\x2dc65f\x2d10e174e490a8.mount: Deactivated successfully. Mar 17 18:50:39.519887 env[1524]: time="2025-03-17T18:50:39.519839355Z" level=info msg="TearDown network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\" successfully" Mar 17 18:50:39.520049 env[1524]: time="2025-03-17T18:50:39.520026156Z" level=info msg="StopPodSandbox for \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\" returns successfully" Mar 17 18:50:39.521893 env[1524]: time="2025-03-17T18:50:39.521861268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b476d9448-c9626,Uid:181cc807-8e75-4c0c-9bab-b609021ab74e,Namespace:calico-apiserver,Attempt:1,}" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.446 [INFO][4166] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.447 [INFO][4166] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" iface="eth0" netns="/var/run/netns/cni-02a1c891-d42c-c1ab-9762-99f4b16e832c" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.447 [INFO][4166] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" iface="eth0" netns="/var/run/netns/cni-02a1c891-d42c-c1ab-9762-99f4b16e832c" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.447 [INFO][4166] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" iface="eth0" netns="/var/run/netns/cni-02a1c891-d42c-c1ab-9762-99f4b16e832c" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.447 [INFO][4166] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.447 [INFO][4166] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.566 [INFO][4190] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" HandleID="k8s-pod-network.cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.566 [INFO][4190] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.567 [INFO][4190] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.577 [WARNING][4190] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" HandleID="k8s-pod-network.cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.578 [INFO][4190] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" HandleID="k8s-pod-network.cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.582 [INFO][4190] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:39.585651 env[1524]: 2025-03-17 18:50:39.584 [INFO][4166] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:39.593842 systemd[1]: run-netns-cni\x2d02a1c891\x2dd42c\x2dc1ab\x2d9762\x2d99f4b16e832c.mount: Deactivated successfully. 
Mar 17 18:50:39.596005 env[1524]: time="2025-03-17T18:50:39.595957739Z" level=info msg="TearDown network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\" successfully" Mar 17 18:50:39.596141 env[1524]: time="2025-03-17T18:50:39.596119040Z" level=info msg="StopPodSandbox for \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\" returns successfully" Mar 17 18:50:39.596910 env[1524]: time="2025-03-17T18:50:39.596877545Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tt74p,Uid:eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce,Namespace:calico-system,Attempt:1,}" Mar 17 18:50:39.801416 systemd-networkd[1687]: cali2f2537c9527: Link UP Mar 17 18:50:39.815500 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 18:50:39.815622 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali2f2537c9527: link becomes ready Mar 17 18:50:39.816556 systemd-networkd[1687]: cali2f2537c9527: Gained carrier Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.620 [INFO][4203] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.638 [INFO][4203] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0 calico-apiserver-6b476d9448- calico-apiserver 7db7f05a-c0bd-436a-b834-87825c403071 768 0 2025-03-17 18:50:10 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b476d9448 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-3510.3.7-a-b312ad98ee calico-apiserver-6b476d9448-hjl2z eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2f2537c9527 [] []}} ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-hjl2z" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.638 [INFO][4203] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-hjl2z" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.725 [INFO][4227] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" HandleID="k8s-pod-network.20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.736 [INFO][4227] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" HandleID="k8s-pod-network.20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003ab110), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-3510.3.7-a-b312ad98ee", "pod":"calico-apiserver-6b476d9448-hjl2z", "timestamp":"2025-03-17 18:50:39.725479363 +0000 UTC"}, 
Hostname:"ci-3510.3.7-a-b312ad98ee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.736 [INFO][4227] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.736 [INFO][4227] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.736 [INFO][4227] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.7-a-b312ad98ee' Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.738 [INFO][4227] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.742 [INFO][4227] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.746 [INFO][4227] ipam/ipam.go 489: Trying affinity for 192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.748 [INFO][4227] ipam/ipam.go 155: Attempting to load block cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.750 [INFO][4227] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.750 [INFO][4227] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.103.64/26 handle="k8s-pod-network.20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.751 [INFO][4227] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9 Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.756 [INFO][4227] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.103.64/26 handle="k8s-pod-network.20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.765 [INFO][4227] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.103.65/26] block=192.168.103.64/26 handle="k8s-pod-network.20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.765 [INFO][4227] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.103.65/26] handle="k8s-pod-network.20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.765 [INFO][4227] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 18:50:39.839158 env[1524]: 2025-03-17 18:50:39.765 [INFO][4227] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.103.65/26] IPv6=[] ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" HandleID="k8s-pod-network.20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.840127 env[1524]: 2025-03-17 18:50:39.773 [INFO][4203] cni-plugin/k8s.go 386: Populated endpoint ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-hjl2z" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0", GenerateName:"calico-apiserver-6b476d9448-", Namespace:"calico-apiserver", SelfLink:"", UID:"7db7f05a-c0bd-436a-b834-87825c403071", ResourceVersion:"768", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b476d9448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"", Pod:"calico-apiserver-6b476d9448-hjl2z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.103.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2f2537c9527", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:39.840127 env[1524]: 2025-03-17 18:50:39.773 [INFO][4203] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.103.65/32] ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-hjl2z" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.840127 env[1524]: 2025-03-17 18:50:39.773 [INFO][4203] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2f2537c9527 ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-hjl2z" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.840127 env[1524]: 2025-03-17 18:50:39.817 [INFO][4203] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-hjl2z" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.840127 env[1524]: 2025-03-17 18:50:39.818 [INFO][4203] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID 
to endpoint ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-hjl2z" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0", GenerateName:"calico-apiserver-6b476d9448-", Namespace:"calico-apiserver", SelfLink:"", UID:"7db7f05a-c0bd-436a-b834-87825c403071", ResourceVersion:"768", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b476d9448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9", Pod:"calico-apiserver-6b476d9448-hjl2z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.103.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2f2537c9527", MAC:"66:26:01:a0:de:cb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:39.840127 env[1524]: 2025-03-17 18:50:39.835 [INFO][4203] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-hjl2z" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:39.860929 systemd-networkd[1687]: caliebc6c73aca4: Link UP Mar 17 18:50:39.868722 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): caliebc6c73aca4: link becomes ready Mar 17 18:50:39.868892 systemd-networkd[1687]: caliebc6c73aca4: Gained carrier Mar 17 18:50:39.893358 systemd-networkd[1687]: calib10a649341f: Link UP Mar 17 18:50:39.900847 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): calib10a649341f: link becomes ready Mar 17 18:50:39.900600 systemd-networkd[1687]: calib10a649341f: Gained carrier Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.689 [INFO][4214] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.707 [INFO][4214] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0 calico-apiserver-6b476d9448- calico-apiserver 181cc807-8e75-4c0c-9bab-b609021ab74e 769 0 2025-03-17 18:50:10 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6b476d9448 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-3510.3.7-a-b312ad98ee calico-apiserver-6b476d9448-c9626 eth0 calico-apiserver [] [] 
[kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] caliebc6c73aca4 [] []}} ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-c9626" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.708 [INFO][4214] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-c9626" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.765 [INFO][4248] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" HandleID="k8s-pod-network.b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.782 [INFO][4248] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" HandleID="k8s-pod-network.b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000310790), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-3510.3.7-a-b312ad98ee", "pod":"calico-apiserver-6b476d9448-c9626", "timestamp":"2025-03-17 18:50:39.765229616 +0000 UTC"}, Hostname:"ci-3510.3.7-a-b312ad98ee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.782 [INFO][4248] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.782 [INFO][4248] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.782 [INFO][4248] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.7-a-b312ad98ee' Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.785 [INFO][4248] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.790 [INFO][4248] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.794 [INFO][4248] ipam/ipam.go 489: Trying affinity for 192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.810 [INFO][4248] ipam/ipam.go 155: Attempting to load block cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.819 [INFO][4248] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.819 [INFO][4248] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.103.64/26 handle="k8s-pod-network.b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.823 [INFO][4248] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5 Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.831 [INFO][4248] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.103.64/26 handle="k8s-pod-network.b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.841 [INFO][4248] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.103.66/26] block=192.168.103.64/26 handle="k8s-pod-network.b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.841 [INFO][4248] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.103.66/26] handle="k8s-pod-network.b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.841 [INFO][4248] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
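Each assignment in these records is bracketed by "About to acquire host-wide IPAM lock" / "Acquired" / "Released": the three pods being networked at the same time are serialized on the node so two CNI invocations cannot claim the same address. A small self-contained sketch of that pattern (the lock object and the in-memory set are stand-ins; Calico itself coordinates claims through its datastore):

    import threading

    ipam_lock = threading.Lock()   # stand-in for the host-wide IPAM lock
    allocated = set()              # addresses already claimed on this node

    def assign(candidates):
        # Serialize claims the way the acquire/release pairs in the log do.
        with ipam_lock:            # "Acquired host-wide IPAM lock."
            for ip in candidates:
                if ip not in allocated:
                    allocated.add(ip)   # claim is recorded while the lock is held
                    return ip           # lock released on exit from the with-block
            raise RuntimeError("block exhausted")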
Mar 17 18:50:39.906230 env[1524]: 2025-03-17 18:50:39.841 [INFO][4248] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.103.66/26] IPv6=[] ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" HandleID="k8s-pod-network.b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.906953 env[1524]: 2025-03-17 18:50:39.845 [INFO][4214] cni-plugin/k8s.go 386: Populated endpoint ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-c9626" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0", GenerateName:"calico-apiserver-6b476d9448-", Namespace:"calico-apiserver", SelfLink:"", UID:"181cc807-8e75-4c0c-9bab-b609021ab74e", ResourceVersion:"769", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b476d9448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"", Pod:"calico-apiserver-6b476d9448-c9626", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.103.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliebc6c73aca4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:39.906953 env[1524]: 2025-03-17 18:50:39.846 [INFO][4214] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.103.66/32] ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-c9626" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.906953 env[1524]: 2025-03-17 18:50:39.847 [INFO][4214] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliebc6c73aca4 ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-c9626" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.906953 env[1524]: 2025-03-17 18:50:39.869 [INFO][4214] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-c9626" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.906953 env[1524]: 2025-03-17 18:50:39.869 [INFO][4214] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID 
to endpoint ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-c9626" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0", GenerateName:"calico-apiserver-6b476d9448-", Namespace:"calico-apiserver", SelfLink:"", UID:"181cc807-8e75-4c0c-9bab-b609021ab74e", ResourceVersion:"769", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b476d9448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5", Pod:"calico-apiserver-6b476d9448-c9626", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.103.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliebc6c73aca4", MAC:"2e:b6:6b:08:d8:bc", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:39.906953 env[1524]: 2025-03-17 18:50:39.885 [INFO][4214] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5" Namespace="calico-apiserver" Pod="calico-apiserver-6b476d9448-c9626" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:39.924745 env[1524]: time="2025-03-17T18:50:39.924043726Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:39.924745 env[1524]: time="2025-03-17T18:50:39.924099527Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:39.924745 env[1524]: time="2025-03-17T18:50:39.924114527Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:39.924745 env[1524]: time="2025-03-17T18:50:39.924282828Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9 pid=4295 runtime=io.containerd.runc.v2 Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.680 [INFO][4226] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.698 [INFO][4226] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0 csi-node-driver- calico-system eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce 770 0 2025-03-17 18:50:11 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:65bf684474 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-3510.3.7-a-b312ad98ee csi-node-driver-tt74p eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calib10a649341f [] []}} ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Namespace="calico-system" Pod="csi-node-driver-tt74p" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.698 [INFO][4226] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Namespace="calico-system" Pod="csi-node-driver-tt74p" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.782 [INFO][4244] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" HandleID="k8s-pod-network.486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.797 [INFO][4244] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" HandleID="k8s-pod-network.486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000310af0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-3510.3.7-a-b312ad98ee", "pod":"csi-node-driver-tt74p", "timestamp":"2025-03-17 18:50:39.782018923 +0000 UTC"}, Hostname:"ci-3510.3.7-a-b312ad98ee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.797 [INFO][4244] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.841 [INFO][4244] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.841 [INFO][4244] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.7-a-b312ad98ee' Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.843 [INFO][4244] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.847 [INFO][4244] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.851 [INFO][4244] ipam/ipam.go 489: Trying affinity for 192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.853 [INFO][4244] ipam/ipam.go 155: Attempting to load block cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.855 [INFO][4244] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.855 [INFO][4244] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.103.64/26 handle="k8s-pod-network.486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.856 [INFO][4244] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7 Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.870 [INFO][4244] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.103.64/26 handle="k8s-pod-network.486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.883 [INFO][4244] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.103.67/26] block=192.168.103.64/26 handle="k8s-pod-network.486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.883 [INFO][4244] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.103.67/26] handle="k8s-pod-network.486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.883 [INFO][4244] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Mar 17 18:50:39.926321 env[1524]: 2025-03-17 18:50:39.883 [INFO][4244] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.103.67/26] IPv6=[] ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" HandleID="k8s-pod-network.486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.927326 env[1524]: 2025-03-17 18:50:39.887 [INFO][4226] cni-plugin/k8s.go 386: Populated endpoint ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Namespace="calico-system" Pod="csi-node-driver-tt74p" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce", ResourceVersion:"770", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"", Pod:"csi-node-driver-tt74p", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.103.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib10a649341f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:39.927326 env[1524]: 2025-03-17 18:50:39.887 [INFO][4226] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.103.67/32] ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Namespace="calico-system" Pod="csi-node-driver-tt74p" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.927326 env[1524]: 2025-03-17 18:50:39.887 [INFO][4226] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib10a649341f ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Namespace="calico-system" Pod="csi-node-driver-tt74p" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.927326 env[1524]: 2025-03-17 18:50:39.907 [INFO][4226] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Namespace="calico-system" Pod="csi-node-driver-tt74p" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.927326 env[1524]: 2025-03-17 18:50:39.908 [INFO][4226] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Namespace="calico-system" 
Pod="csi-node-driver-tt74p" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce", ResourceVersion:"770", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7", Pod:"csi-node-driver-tt74p", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.103.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib10a649341f", MAC:"d6:fc:7f:a4:0f:48", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:39.927326 env[1524]: 2025-03-17 18:50:39.923 [INFO][4226] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7" Namespace="calico-system" Pod="csi-node-driver-tt74p" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:39.947829 env[1524]: time="2025-03-17T18:50:39.947755477Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:39.948081 env[1524]: time="2025-03-17T18:50:39.948038279Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:39.948244 env[1524]: time="2025-03-17T18:50:39.948220080Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:39.948675 env[1524]: time="2025-03-17T18:50:39.948625783Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5 pid=4321 runtime=io.containerd.runc.v2 Mar 17 18:50:39.976593 env[1524]: time="2025-03-17T18:50:39.976499360Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:39.976765 env[1524]: time="2025-03-17T18:50:39.976604461Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:39.976765 env[1524]: time="2025-03-17T18:50:39.976632761Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:39.976904 env[1524]: time="2025-03-17T18:50:39.976807962Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7 pid=4362 runtime=io.containerd.runc.v2 Mar 17 18:50:40.055365 env[1524]: time="2025-03-17T18:50:40.055207957Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tt74p,Uid:eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce,Namespace:calico-system,Attempt:1,} returns sandbox id \"486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7\"" Mar 17 18:50:40.067723 env[1524]: time="2025-03-17T18:50:40.066074125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\"" Mar 17 18:50:40.069096 env[1524]: time="2025-03-17T18:50:40.069042844Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b476d9448-hjl2z,Uid:7db7f05a-c0bd-436a-b834-87825c403071,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9\"" Mar 17 18:50:40.104025 env[1524]: time="2025-03-17T18:50:40.103975064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6b476d9448-c9626,Uid:181cc807-8e75-4c0c-9bab-b609021ab74e,Namespace:calico-apiserver,Attempt:1,} returns sandbox id \"b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5\"" Mar 17 18:50:40.226885 env[1524]: time="2025-03-17T18:50:40.224515422Z" level=info msg="StopPodSandbox for \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\"" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.281 [INFO][4432] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.281 [INFO][4432] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" iface="eth0" netns="/var/run/netns/cni-a82e4fc3-3e21-012a-442b-098adea90639" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.281 [INFO][4432] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" iface="eth0" netns="/var/run/netns/cni-a82e4fc3-3e21-012a-442b-098adea90639" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.282 [INFO][4432] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" iface="eth0" netns="/var/run/netns/cni-a82e4fc3-3e21-012a-442b-098adea90639" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.282 [INFO][4432] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.282 [INFO][4432] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.315 [INFO][4445] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" HandleID="k8s-pod-network.420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.315 [INFO][4445] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.315 [INFO][4445] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.321 [WARNING][4445] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" HandleID="k8s-pod-network.420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.321 [INFO][4445] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" HandleID="k8s-pod-network.420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.322 [INFO][4445] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:40.325278 env[1524]: 2025-03-17 18:50:40.323 [INFO][4432] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:40.325278 env[1524]: time="2025-03-17T18:50:40.325024455Z" level=info msg="TearDown network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\" successfully" Mar 17 18:50:40.325278 env[1524]: time="2025-03-17T18:50:40.325063055Z" level=info msg="StopPodSandbox for \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\" returns successfully" Mar 17 18:50:40.326248 env[1524]: time="2025-03-17T18:50:40.326212662Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-58lvk,Uid:7349d13c-6ed5-461e-80f1-d59a9207db7f,Namespace:kube-system,Attempt:1,}" Mar 17 18:50:40.470558 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): calia35f6408891: link becomes ready Mar 17 18:50:40.474247 systemd-networkd[1687]: calia35f6408891: Link UP Mar 17 18:50:40.474483 systemd-networkd[1687]: calia35f6408891: Gained carrier Mar 17 18:50:40.484923 systemd[1]: run-netns-cni\x2da82e4fc3\x2d3e21\x2d012a\x2d442b\x2d098adea90639.mount: Deactivated successfully. 
Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.383 [INFO][4453] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.393 [INFO][4453] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0 coredns-7db6d8ff4d- kube-system 7349d13c-6ed5-461e-80f1-d59a9207db7f 786 0 2025-03-17 18:50:04 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-3510.3.7-a-b312ad98ee coredns-7db6d8ff4d-58lvk eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calia35f6408891 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Namespace="kube-system" Pod="coredns-7db6d8ff4d-58lvk" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.393 [INFO][4453] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Namespace="kube-system" Pod="coredns-7db6d8ff4d-58lvk" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.419 [INFO][4464] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" HandleID="k8s-pod-network.a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.428 [INFO][4464] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" HandleID="k8s-pod-network.a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0003116c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-3510.3.7-a-b312ad98ee", "pod":"coredns-7db6d8ff4d-58lvk", "timestamp":"2025-03-17 18:50:40.41969985 +0000 UTC"}, Hostname:"ci-3510.3.7-a-b312ad98ee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.428 [INFO][4464] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.428 [INFO][4464] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.428 [INFO][4464] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.7-a-b312ad98ee' Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.430 [INFO][4464] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.433 [INFO][4464] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.436 [INFO][4464] ipam/ipam.go 489: Trying affinity for 192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.437 [INFO][4464] ipam/ipam.go 155: Attempting to load block cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.439 [INFO][4464] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.439 [INFO][4464] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.103.64/26 handle="k8s-pod-network.a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.440 [INFO][4464] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65 Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.446 [INFO][4464] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.103.64/26 handle="k8s-pod-network.a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.452 [INFO][4464] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.103.68/26] block=192.168.103.64/26 handle="k8s-pod-network.a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.452 [INFO][4464] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.103.68/26] handle="k8s-pod-network.a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.452 [INFO][4464] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
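By this point the log records four assignments from the same /26: .65 and .66 for the two calico-apiserver pods, .67 for csi-node-driver-tt74p, and .68 for coredns-7db6d8ff4d-58lvk. A short parser that recovers the pod-to-address mapping from journal lines like the "Calico CNI using IPs" records above (the regex is an assumption tuned to this log's formatting):

    import re
    import sys

    USING_IPS = re.compile(
        r'Calico CNI using IPs: \[(?P<ip>[0-9.]+)/\d+\].*?Pod="(?P<pod>[^"]+)"')

    def pod_ips(lines):
        # Collect pod -> IP from the cni-plugin "using IPs" records shown above.
        out = {}
        for line in lines:
            m = USING_IPS.search(line)
            if m:
                out[m.group("pod")] = m.group("ip")
        return out

    if __name__ == "__main__":
        print(pod_ips(sys.stdin))  # e.g. feed journal output on stdin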
Mar 17 18:50:40.492137 env[1524]: 2025-03-17 18:50:40.452 [INFO][4464] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.103.68/26] IPv6=[] ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" HandleID="k8s-pod-network.a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.493181 env[1524]: 2025-03-17 18:50:40.455 [INFO][4453] cni-plugin/k8s.go 386: Populated endpoint ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Namespace="kube-system" Pod="coredns-7db6d8ff4d-58lvk" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7349d13c-6ed5-461e-80f1-d59a9207db7f", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"", Pod:"coredns-7db6d8ff4d-58lvk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.103.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia35f6408891", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:40.493181 env[1524]: 2025-03-17 18:50:40.455 [INFO][4453] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.103.68/32] ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Namespace="kube-system" Pod="coredns-7db6d8ff4d-58lvk" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.493181 env[1524]: 2025-03-17 18:50:40.455 [INFO][4453] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia35f6408891 ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Namespace="kube-system" Pod="coredns-7db6d8ff4d-58lvk" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.493181 env[1524]: 2025-03-17 18:50:40.462 [INFO][4453] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Namespace="kube-system" Pod="coredns-7db6d8ff4d-58lvk" 
WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.493181 env[1524]: 2025-03-17 18:50:40.462 [INFO][4453] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Namespace="kube-system" Pod="coredns-7db6d8ff4d-58lvk" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7349d13c-6ed5-461e-80f1-d59a9207db7f", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65", Pod:"coredns-7db6d8ff4d-58lvk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.103.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia35f6408891", MAC:"de:e0:04:79:37:36", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:40.493181 env[1524]: 2025-03-17 18:50:40.486 [INFO][4453] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65" Namespace="kube-system" Pod="coredns-7db6d8ff4d-58lvk" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:40.512101 env[1524]: time="2025-03-17T18:50:40.512039131Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:40.512290 env[1524]: time="2025-03-17T18:50:40.512107132Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:40.512290 env[1524]: time="2025-03-17T18:50:40.512135132Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:40.512446 env[1524]: time="2025-03-17T18:50:40.512297533Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65 pid=4490 runtime=io.containerd.runc.v2 Mar 17 18:50:40.548315 systemd[1]: run-containerd-runc-k8s.io-a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65-runc.BQsbFh.mount: Deactivated successfully. Mar 17 18:50:40.579663 env[1524]: time="2025-03-17T18:50:40.577637644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-58lvk,Uid:7349d13c-6ed5-461e-80f1-d59a9207db7f,Namespace:kube-system,Attempt:1,} returns sandbox id \"a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65\"" Mar 17 18:50:40.581994 env[1524]: time="2025-03-17T18:50:40.581950271Z" level=info msg="CreateContainer within sandbox \"a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 18:50:40.618473 env[1524]: time="2025-03-17T18:50:40.618422401Z" level=info msg="CreateContainer within sandbox \"a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"637d052c940c7b7fbb780e6ec1d07abfa5e5d6e509af99688223462f27e5eb81\"" Mar 17 18:50:40.620510 env[1524]: time="2025-03-17T18:50:40.619338706Z" level=info msg="StartContainer for \"637d052c940c7b7fbb780e6ec1d07abfa5e5d6e509af99688223462f27e5eb81\"" Mar 17 18:50:40.687501 env[1524]: time="2025-03-17T18:50:40.687448535Z" level=info msg="StartContainer for \"637d052c940c7b7fbb780e6ec1d07abfa5e5d6e509af99688223462f27e5eb81\" returns successfully" Mar 17 18:50:41.224548 env[1524]: time="2025-03-17T18:50:41.224493299Z" level=info msg="StopPodSandbox for \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\"" Mar 17 18:50:41.284956 systemd-networkd[1687]: caliebc6c73aca4: Gained IPv6LL Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.309 [INFO][4586] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.309 [INFO][4586] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" iface="eth0" netns="/var/run/netns/cni-afbbd5b7-ae31-7e87-3185-1f226fd01184" Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.309 [INFO][4586] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" iface="eth0" netns="/var/run/netns/cni-afbbd5b7-ae31-7e87-3185-1f226fd01184" Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.309 [INFO][4586] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. 
ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" iface="eth0" netns="/var/run/netns/cni-afbbd5b7-ae31-7e87-3185-1f226fd01184" Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.310 [INFO][4586] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.310 [INFO][4586] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.364 [INFO][4593] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" HandleID="k8s-pod-network.3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.364 [INFO][4593] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.365 [INFO][4593] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.371 [WARNING][4593] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" HandleID="k8s-pod-network.3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.371 [INFO][4593] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" HandleID="k8s-pod-network.3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.372 [INFO][4593] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:41.375468 env[1524]: 2025-03-17 18:50:41.374 [INFO][4586] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:41.376306 env[1524]: time="2025-03-17T18:50:41.376263043Z" level=info msg="TearDown network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\" successfully" Mar 17 18:50:41.376392 env[1524]: time="2025-03-17T18:50:41.376379544Z" level=info msg="StopPodSandbox for \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\" returns successfully" Mar 17 18:50:41.377087 env[1524]: time="2025-03-17T18:50:41.377060748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-hvsfz,Uid:309db727-4be2-4bfe-b325-49e81b2078ad,Namespace:kube-system,Attempt:1,}" Mar 17 18:50:41.472394 systemd[1]: run-netns-cni\x2dafbbd5b7\x2dae31\x2d7e87\x2d3185\x2d1f226fd01184.mount: Deactivated successfully. 
Mar 17 18:50:41.475871 systemd-networkd[1687]: cali2f2537c9527: Gained IPv6LL Mar 17 18:50:41.502946 kubelet[2704]: I0317 18:50:41.502836 2704 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 17 18:50:41.510689 kubelet[2704]: I0317 18:50:41.504411 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-58lvk" podStartSLOduration=37.50430064 podStartE2EDuration="37.50430064s" podCreationTimestamp="2025-03-17 18:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:50:41.50111222 +0000 UTC m=+53.749212418" watchObservedRunningTime="2025-03-17 18:50:41.50430064 +0000 UTC m=+53.752400838" Mar 17 18:50:41.523724 kernel: kauditd_printk_skb: 25 callbacks suppressed Mar 17 18:50:41.524741 kernel: audit: type=1325 audit(1742237441.510:303): table=filter:98 family=2 entries=18 op=nft_register_rule pid=4605 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:41.510000 audit[4605]: NETFILTER_CFG table=filter:98 family=2 entries=18 op=nft_register_rule pid=4605 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:41.546118 kernel: audit: type=1300 audit(1742237441.510:303): arch=c000003e syscall=46 success=yes exit=6652 a0=3 a1=7ffc798ac8c0 a2=0 a3=7ffc798ac8ac items=0 ppid=2885 pid=4605 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:41.510000 audit[4605]: SYSCALL arch=c000003e syscall=46 success=yes exit=6652 a0=3 a1=7ffc798ac8c0 a2=0 a3=7ffc798ac8ac items=0 ppid=2885 pid=4605 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:41.510000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:41.568775 kernel: audit: type=1327 audit(1742237441.510:303): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:41.568879 kernel: audit: type=1325 audit(1742237441.552:304): table=nat:99 family=2 entries=12 op=nft_register_rule pid=4605 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:41.552000 audit[4605]: NETFILTER_CFG table=nat:99 family=2 entries=12 op=nft_register_rule pid=4605 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:41.552000 audit[4605]: SYSCALL arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc798ac8c0 a2=0 a3=0 items=0 ppid=2885 pid=4605 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:41.588693 kernel: audit: type=1300 audit(1742237441.552:304): arch=c000003e syscall=46 success=yes exit=2700 a0=3 a1=7ffc798ac8c0 a2=0 a3=0 items=0 ppid=2885 pid=4605 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:41.552000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:41.621727 kernel: audit: type=1327 audit(1742237441.552:304): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:41.630000 audit[4617]: NETFILTER_CFG table=filter:100 family=2 entries=15 op=nft_register_rule pid=4617 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:41.641713 kernel: audit: type=1325 audit(1742237441.630:305): table=filter:100 family=2 entries=15 op=nft_register_rule pid=4617 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:41.666049 kernel: audit: type=1300 audit(1742237441.630:305): arch=c000003e syscall=46 success=yes exit=4420 a0=3 a1=7ffd6c068380 a2=0 a3=7ffd6c06836c items=0 ppid=2885 pid=4617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:41.630000 audit[4617]: SYSCALL arch=c000003e syscall=46 success=yes exit=4420 a0=3 a1=7ffd6c068380 a2=0 a3=7ffd6c06836c items=0 ppid=2885 pid=4617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:41.630000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:41.700608 kernel: audit: type=1327 audit(1742237441.630:305): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:41.700799 kernel: audit: type=1325 audit(1742237441.642:306): table=nat:101 family=2 entries=33 op=nft_register_chain pid=4617 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:41.642000 audit[4617]: NETFILTER_CFG table=nat:101 family=2 entries=33 op=nft_register_chain pid=4617 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:41.642000 audit[4617]: SYSCALL arch=c000003e syscall=46 success=yes exit=13428 a0=3 a1=7ffd6c068380 a2=0 a3=7ffd6c06836c items=0 ppid=2885 pid=4617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:41.642000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:41.868651 systemd-networkd[1687]: calib10a649341f: Gained IPv6LL Mar 17 18:50:41.885118 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready Mar 17 18:50:41.885217 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali4bd3043fa88: link becomes ready Mar 17 18:50:41.875094 systemd-networkd[1687]: cali4bd3043fa88: Link UP Mar 17 18:50:41.896723 systemd-networkd[1687]: cali4bd3043fa88: Gained carrier Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.680 [INFO][4606] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.702 [INFO][4606] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0 coredns-7db6d8ff4d- kube-system 309db727-4be2-4bfe-b325-49e81b2078ad 796 0 2025-03-17 18:50:04 
+0000 UTC map[k8s-app:kube-dns pod-template-hash:7db6d8ff4d projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-3510.3.7-a-b312ad98ee coredns-7db6d8ff4d-hvsfz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4bd3043fa88 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] []}} ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Namespace="kube-system" Pod="coredns-7db6d8ff4d-hvsfz" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.702 [INFO][4606] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Namespace="kube-system" Pod="coredns-7db6d8ff4d-hvsfz" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.804 [INFO][4622] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" HandleID="k8s-pod-network.45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.819 [INFO][4622] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" HandleID="k8s-pod-network.45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000311450), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-3510.3.7-a-b312ad98ee", "pod":"coredns-7db6d8ff4d-hvsfz", "timestamp":"2025-03-17 18:50:41.804182807 +0000 UTC"}, Hostname:"ci-3510.3.7-a-b312ad98ee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.819 [INFO][4622] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.819 [INFO][4622] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.819 [INFO][4622] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.7-a-b312ad98ee' Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.820 [INFO][4622] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.824 [INFO][4622] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.828 [INFO][4622] ipam/ipam.go 489: Trying affinity for 192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.830 [INFO][4622] ipam/ipam.go 155: Attempting to load block cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.832 [INFO][4622] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.832 [INFO][4622] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.103.64/26 handle="k8s-pod-network.45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.833 [INFO][4622] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219 Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.846 [INFO][4622] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.103.64/26 handle="k8s-pod-network.45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.870 [INFO][4622] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.103.69/26] block=192.168.103.64/26 handle="k8s-pod-network.45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.870 [INFO][4622] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.103.69/26] handle="k8s-pod-network.45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.870 [INFO][4622] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
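The IPAM sequence above is Calico's per-node block affinity at work: the node ci-3510.3.7-a-b312ad98ee holds the block 192.168.103.64/26, and the pod is handed a single address from it (written to the endpoint as a /32 below). A quick standard-library check with the logged values, included only as an illustration:

```python
# Sanity-check the logged IPAM result; both values are taken from the entries above.
import ipaddress

block = ipaddress.ip_network("192.168.103.64/26")   # block affine to this node
claimed = ipaddress.ip_address("192.168.103.69")    # from "Successfully claimed IPs"

assert claimed in block
print(f"{claimed} is 1 of {block.num_addresses} addresses in {block}")
# -> 192.168.103.69 is 1 of 64 addresses in 192.168.103.64/26
```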
Mar 17 18:50:41.916104 env[1524]: 2025-03-17 18:50:41.870 [INFO][4622] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.103.69/26] IPv6=[] ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" HandleID="k8s-pod-network.45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.917036 env[1524]: 2025-03-17 18:50:41.872 [INFO][4606] cni-plugin/k8s.go 386: Populated endpoint ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Namespace="kube-system" Pod="coredns-7db6d8ff4d-hvsfz" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"309db727-4be2-4bfe-b325-49e81b2078ad", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"", Pod:"coredns-7db6d8ff4d-hvsfz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.103.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4bd3043fa88", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:41.917036 env[1524]: 2025-03-17 18:50:41.872 [INFO][4606] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.103.69/32] ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Namespace="kube-system" Pod="coredns-7db6d8ff4d-hvsfz" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.917036 env[1524]: 2025-03-17 18:50:41.872 [INFO][4606] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4bd3043fa88 ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Namespace="kube-system" Pod="coredns-7db6d8ff4d-hvsfz" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.917036 env[1524]: 2025-03-17 18:50:41.892 [INFO][4606] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Namespace="kube-system" Pod="coredns-7db6d8ff4d-hvsfz" 
WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.917036 env[1524]: 2025-03-17 18:50:41.900 [INFO][4606] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Namespace="kube-system" Pod="coredns-7db6d8ff4d-hvsfz" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"309db727-4be2-4bfe-b325-49e81b2078ad", ResourceVersion:"796", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219", Pod:"coredns-7db6d8ff4d-hvsfz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.103.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4bd3043fa88", MAC:"3a:ea:a8:2d:f8:bd", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:41.917036 env[1524]: 2025-03-17 18:50:41.913 [INFO][4606] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219" Namespace="kube-system" Pod="coredns-7db6d8ff4d-hvsfz" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:41.973601 env[1524]: time="2025-03-17T18:50:41.973556761Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:41.977202 env[1524]: time="2025-03-17T18:50:41.970649543Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:41.977202 env[1524]: time="2025-03-17T18:50:41.970699443Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:41.977202 env[1524]: time="2025-03-17T18:50:41.970713043Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:41.977202 env[1524]: time="2025-03-17T18:50:41.972407054Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219 pid=4677 runtime=io.containerd.runc.v2 Mar 17 18:50:41.988303 env[1524]: time="2025-03-17T18:50:41.988262153Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:41.994490 env[1524]: time="2025-03-17T18:50:41.994451491Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/csi:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:42.009784 env[1524]: time="2025-03-17T18:50:42.003577848Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/csi@sha256:eaa7e01fb16b603c155a67b81f16992281db7f831684c7b2081d3434587a7ff3,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:42.009784 env[1524]: time="2025-03-17T18:50:42.004041451Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.29.1\" returns image reference \"sha256:bda8c42e04758c4f061339e213f50ccdc7502c4176fbf631aa12357e62b63540\"" Mar 17 18:50:42.009784 env[1524]: time="2025-03-17T18:50:42.008993381Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Mar 17 18:50:42.011743 env[1524]: time="2025-03-17T18:50:42.011709398Z" level=info msg="CreateContainer within sandbox \"486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Mar 17 18:50:42.053780 env[1524]: time="2025-03-17T18:50:42.052972352Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7db6d8ff4d-hvsfz,Uid:309db727-4be2-4bfe-b325-49e81b2078ad,Namespace:kube-system,Attempt:1,} returns sandbox id \"45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219\"" Mar 17 18:50:42.056051 env[1524]: time="2025-03-17T18:50:42.056013371Z" level=info msg="CreateContainer within sandbox \"45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Mar 17 18:50:42.059159 env[1524]: time="2025-03-17T18:50:42.059126990Z" level=info msg="CreateContainer within sandbox \"486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"870e6696e9f2b4c7743db354048b65b9a63081a297f3e8e300f5e3554d1c6c54\"" Mar 17 18:50:42.070585 env[1524]: time="2025-03-17T18:50:42.070546960Z" level=info msg="StartContainer for \"870e6696e9f2b4c7743db354048b65b9a63081a297f3e8e300f5e3554d1c6c54\"" Mar 17 18:50:42.088564 env[1524]: time="2025-03-17T18:50:42.088517971Z" level=info msg="CreateContainer within sandbox \"45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"49f5cb0f367ea679ea0ac2ace4c7fb310aeb5f4c3d53f01d9886b85f1d222bfd\"" Mar 17 18:50:42.093279 env[1524]: time="2025-03-17T18:50:42.093243300Z" level=info msg="StartContainer for \"49f5cb0f367ea679ea0ac2ace4c7fb310aeb5f4c3d53f01d9886b85f1d222bfd\"" Mar 17 18:50:42.155137 env[1524]: time="2025-03-17T18:50:42.152533265Z" level=info msg="StartContainer for 
\"870e6696e9f2b4c7743db354048b65b9a63081a297f3e8e300f5e3554d1c6c54\" returns successfully" Mar 17 18:50:42.169714 env[1524]: time="2025-03-17T18:50:42.169513470Z" level=info msg="StartContainer for \"49f5cb0f367ea679ea0ac2ace4c7fb310aeb5f4c3d53f01d9886b85f1d222bfd\" returns successfully" Mar 17 18:50:42.225077 env[1524]: time="2025-03-17T18:50:42.224505208Z" level=info msg="StopPodSandbox for \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\"" Mar 17 18:50:42.244819 systemd-networkd[1687]: calia35f6408891: Gained IPv6LL Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.279 [INFO][4798] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.279 [INFO][4798] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" iface="eth0" netns="/var/run/netns/cni-e4d660cd-97e4-7b59-2b97-ca49d00248b6" Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.279 [INFO][4798] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" iface="eth0" netns="/var/run/netns/cni-e4d660cd-97e4-7b59-2b97-ca49d00248b6" Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.280 [INFO][4798] cni-plugin/dataplane_linux.go 597: Workload's veth was already gone. Nothing to do. ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" iface="eth0" netns="/var/run/netns/cni-e4d660cd-97e4-7b59-2b97-ca49d00248b6" Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.280 [INFO][4798] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.280 [INFO][4798] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.328 [INFO][4805] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" HandleID="k8s-pod-network.81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.329 [INFO][4805] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.329 [INFO][4805] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.337 [WARNING][4805] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" HandleID="k8s-pod-network.81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.337 [INFO][4805] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" HandleID="k8s-pod-network.81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.338 [INFO][4805] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:42.342841 env[1524]: 2025-03-17 18:50:42.341 [INFO][4798] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:42.345339 env[1524]: time="2025-03-17T18:50:42.343049539Z" level=info msg="TearDown network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\" successfully" Mar 17 18:50:42.345339 env[1524]: time="2025-03-17T18:50:42.343092139Z" level=info msg="StopPodSandbox for \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\" returns successfully" Mar 17 18:50:42.345339 env[1524]: time="2025-03-17T18:50:42.343904144Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-ff8db6ccc-8lpcj,Uid:9b7bbe37-740f-49ee-a599-03cedfded2d6,Namespace:calico-system,Attempt:1,}" Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit: BPF prog-id=10 op=LOAD Mar 17 18:50:42.375000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffee13450f0 a2=98 a3=3 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.375000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.375000 audit: BPF prog-id=10 op=UNLOAD Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit: BPF prog-id=11 op=LOAD Mar 17 18:50:42.375000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffee1344ed0 a2=74 a3=540051 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.375000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.375000 audit: BPF prog-id=11 op=UNLOAD Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" 
capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.375000 audit: BPF prog-id=12 op=LOAD Mar 17 18:50:42.375000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffee1344f00 a2=94 a3=2 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.375000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.375000 audit: BPF prog-id=12 op=UNLOAD Mar 17 18:50:42.475560 systemd[1]: run-netns-cni\x2de4d660cd\x2d97e4\x2d7b59\x2d2b97\x2dca49d00248b6.mount: Deactivated successfully. 
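The PROCTITLE fields in the audit records above carry the audited process's full command line, hex-encoded with NUL bytes between arguments. A short decoding sketch for the two values that recur throughout this log:

```python
# Decode audit PROCTITLE values: hex string -> NUL-separated argv.
proctitles = [
    # from the NETFILTER_CFG records (comm="iptables-restor")
    "69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273",
    # from the BPF/AVC records (comm="bpftool")
    "627066746F6F6C006D6170006C697374002D2D6A736F6E",
]
for hexstr in proctitles:
    argv = bytes.fromhex(hexstr).split(b"\x00")
    print(" ".join(arg.decode() for arg in argv))
# -> iptables-restore -w 5 -W 100000 --noflush --counters
# -> bpftool map list --json
```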
Mar 17 18:50:42.545104 kubelet[2704]: I0317 18:50:42.545041 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7db6d8ff4d-hvsfz" podStartSLOduration=38.545019083 podStartE2EDuration="38.545019083s" podCreationTimestamp="2025-03-17 18:50:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-03-17 18:50:42.525877965 +0000 UTC m=+54.773978063" watchObservedRunningTime="2025-03-17 18:50:42.545019083 +0000 UTC m=+54.793119281" Mar 17 18:50:42.558000 audit[4847]: NETFILTER_CFG table=filter:102 family=2 entries=11 op=nft_register_rule pid=4847 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:42.558000 audit[4847]: SYSCALL arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7ffc38077320 a2=0 a3=7ffc3807730c items=0 ppid=2885 pid=4847 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.558000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:42.564000 audit[4847]: NETFILTER_CFG table=nat:103 family=2 entries=49 op=nft_register_chain pid=4847 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:42.564000 audit[4847]: SYSCALL arch=c000003e syscall=46 success=yes exit=17004 a0=3 a1=7ffc38077320 a2=0 a3=7ffc3807730c items=0 ppid=2885 pid=4847 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.564000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:42.676835 kernel: IPv6: ADDRCONF(NETDEV_CHANGE): cali015a9146b1d: link becomes ready Mar 17 18:50:42.682503 systemd-networkd[1687]: cali015a9146b1d: Link UP Mar 17 18:50:42.683261 systemd-networkd[1687]: cali015a9146b1d: Gained carrier Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.446 [INFO][4825] cni-plugin/plugin.go 325: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0 calico-kube-controllers-ff8db6ccc- calico-system 9b7bbe37-740f-49ee-a599-03cedfded2d6 823 0 2025-03-17 18:50:11 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:ff8db6ccc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-3510.3.7-a-b312ad98ee calico-kube-controllers-ff8db6ccc-8lpcj eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali015a9146b1d [] []}} ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Namespace="calico-system" Pod="calico-kube-controllers-ff8db6ccc-8lpcj" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.447 [INFO][4825] cni-plugin/k8s.go 77: Extracted identifiers for CmdAddK8s ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Namespace="calico-system" Pod="calico-kube-controllers-ff8db6ccc-8lpcj" 
WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.572 [INFO][4841] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" HandleID="k8s-pod-network.1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.593 [INFO][4841] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" HandleID="k8s-pod-network.1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002907a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-3510.3.7-a-b312ad98ee", "pod":"calico-kube-controllers-ff8db6ccc-8lpcj", "timestamp":"2025-03-17 18:50:42.568430327 +0000 UTC"}, Hostname:"ci-3510.3.7-a-b312ad98ee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.594 [INFO][4841] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.594 [INFO][4841] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.594 [INFO][4841] ipam/ipam.go 107: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-3510.3.7-a-b312ad98ee' Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.597 [INFO][4841] ipam/ipam.go 660: Looking up existing affinities for host handle="k8s-pod-network.1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.601 [INFO][4841] ipam/ipam.go 372: Looking up existing affinities for host host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.614 [INFO][4841] ipam/ipam.go 489: Trying affinity for 192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.619 [INFO][4841] ipam/ipam.go 155: Attempting to load block cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.625 [INFO][4841] ipam/ipam.go 232: Affinity is confirmed and block has been loaded cidr=192.168.103.64/26 host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.625 [INFO][4841] ipam/ipam.go 1180: Attempting to assign 1 addresses from block block=192.168.103.64/26 handle="k8s-pod-network.1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.629 [INFO][4841] ipam/ipam.go 1685: Creating new handle: k8s-pod-network.1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599 Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.651 [INFO][4841] ipam/ipam.go 1203: Writing block in order to claim IPs block=192.168.103.64/26 handle="k8s-pod-network.1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.665 
[INFO][4841] ipam/ipam.go 1216: Successfully claimed IPs: [192.168.103.70/26] block=192.168.103.64/26 handle="k8s-pod-network.1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.665 [INFO][4841] ipam/ipam.go 847: Auto-assigned 1 out of 1 IPv4s: [192.168.103.70/26] handle="k8s-pod-network.1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" host="ci-3510.3.7-a-b312ad98ee" Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.665 [INFO][4841] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:42.699963 env[1524]: 2025-03-17 18:50:42.665 [INFO][4841] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.103.70/26] IPv6=[] ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" HandleID="k8s-pod-network.1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.701278 env[1524]: 2025-03-17 18:50:42.667 [INFO][4825] cni-plugin/k8s.go 386: Populated endpoint ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Namespace="calico-system" Pod="calico-kube-controllers-ff8db6ccc-8lpcj" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0", GenerateName:"calico-kube-controllers-ff8db6ccc-", Namespace:"calico-system", SelfLink:"", UID:"9b7bbe37-740f-49ee-a599-03cedfded2d6", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"ff8db6ccc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"", Pod:"calico-kube-controllers-ff8db6ccc-8lpcj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.103.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali015a9146b1d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:42.701278 env[1524]: 2025-03-17 18:50:42.667 [INFO][4825] cni-plugin/k8s.go 387: Calico CNI using IPs: [192.168.103.70/32] ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Namespace="calico-system" Pod="calico-kube-controllers-ff8db6ccc-8lpcj" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.701278 env[1524]: 2025-03-17 18:50:42.668 [INFO][4825] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali015a9146b1d ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" 
Namespace="calico-system" Pod="calico-kube-controllers-ff8db6ccc-8lpcj" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.701278 env[1524]: 2025-03-17 18:50:42.682 [INFO][4825] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Namespace="calico-system" Pod="calico-kube-controllers-ff8db6ccc-8lpcj" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.701278 env[1524]: 2025-03-17 18:50:42.682 [INFO][4825] cni-plugin/k8s.go 414: Added Mac, interface name, and active container ID to endpoint ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Namespace="calico-system" Pod="calico-kube-controllers-ff8db6ccc-8lpcj" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0", GenerateName:"calico-kube-controllers-ff8db6ccc-", Namespace:"calico-system", SelfLink:"", UID:"9b7bbe37-740f-49ee-a599-03cedfded2d6", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"ff8db6ccc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599", Pod:"calico-kube-controllers-ff8db6ccc-8lpcj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.103.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali015a9146b1d", MAC:"9e:f4:47:56:e0:88", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:42.701278 env[1524]: 2025-03-17 18:50:42.698 [INFO][4825] cni-plugin/k8s.go 500: Wrote updated endpoint to datastore ContainerID="1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599" Namespace="calico-system" Pod="calico-kube-controllers-ff8db6ccc-8lpcj" WorkloadEndpoint="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:42.723920 env[1524]: time="2025-03-17T18:50:42.723856884Z" level=info msg="loading plugin \"io.containerd.event.v1.publisher\"..." runtime=io.containerd.runc.v2 type=io.containerd.event.v1 Mar 17 18:50:42.724199 env[1524]: time="2025-03-17T18:50:42.724165886Z" level=info msg="loading plugin \"io.containerd.internal.v1.shutdown\"..." runtime=io.containerd.runc.v2 type=io.containerd.internal.v1 Mar 17 18:50:42.724338 env[1524]: time="2025-03-17T18:50:42.724316587Z" level=info msg="loading plugin \"io.containerd.ttrpc.v1.task\"..." 
runtime=io.containerd.runc.v2 type=io.containerd.ttrpc.v1 Mar 17 18:50:42.724562 env[1524]: time="2025-03-17T18:50:42.724536088Z" level=info msg="starting signal loop" namespace=k8s.io path=/run/containerd/io.containerd.runtime.v2.task/k8s.io/1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599 pid=4870 runtime=io.containerd.runc.v2 Mar 17 18:50:42.783808 systemd[1]: run-containerd-runc-k8s.io-1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599-runc.QF6wps.mount: Deactivated successfully. Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit: BPF prog-id=13 op=LOAD Mar 17 18:50:42.807000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7ffee1344dc0 a2=40 a3=1 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.807000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.807000 audit: BPF prog-id=13 op=UNLOAD Mar 17 18:50:42.807000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.807000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=0 a1=7ffee1344e90 a2=50 a3=7ffee1344f70 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 
18:50:42.807000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.821000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.821000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffee1344dd0 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.821000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.822000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.822000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffee1344e00 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.822000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.822000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.822000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffee1344d10 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.822000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.822000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.822000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffee1344e20 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.822000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.822000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.822000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffee1344e00 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.822000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.822000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.822000 audit[4824]: SYSCALL 
arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffee1344df0 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.822000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffee1344e20 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffee1344e00 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffee1344e20 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffee1344df0 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7ffee1344e60 a2=28 a3=0 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE 
proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffee1344c10 a2=50 a3=1 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit: BPF prog-id=14 op=LOAD Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffee1344c10 a2=94 a3=5 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit: BPF prog-id=14 op=UNLOAD Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7ffee1344cc0 a2=50 a3=1 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=16 a1=7ffee1344de0 a2=4 a3=38 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.823000 audit[4824]: AVC avc: denied { confidentiality } for pid=4824 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:50:42.823000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffee1344e30 a2=94 a3=6 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.823000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.826000 audit[4824]: AVC avc: denied { confidentiality } for pid=4824 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:50:42.826000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffee13445e0 a2=94 a3=83 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.826000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { perfmon } for pid=4824 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { bpf } for pid=4824 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.827000 audit[4824]: AVC avc: denied { confidentiality } for pid=4824 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:50:42.827000 audit[4824]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7ffee13445e0 a2=94 a3=83 items=0 ppid=4804 pid=4824 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.827000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit: BPF prog-id=15 op=LOAD Mar 17 18:50:42.838000 audit[4904]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdc288af90 a2=98 a3=1999999999999999 items=0 ppid=4804 pid=4904 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.838000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:50:42.838000 audit: BPF prog-id=15 op=UNLOAD Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit: BPF prog-id=16 op=LOAD Mar 17 18:50:42.838000 audit[4904]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdc288ae70 a2=74 a3=ffff items=0 ppid=4804 
pid=4904 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.838000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:50:42.838000 audit: BPF prog-id=16 op=UNLOAD Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { perfmon } for pid=4904 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit[4904]: AVC avc: denied { bpf } for pid=4904 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.838000 audit: BPF prog-id=17 op=LOAD Mar 17 18:50:42.838000 audit[4904]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffdc288aeb0 a2=40 a3=7ffdc288b090 items=0 ppid=4804 pid=4904 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.838000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Mar 17 18:50:42.839000 audit: BPF prog-id=17 op=UNLOAD Mar 17 18:50:42.860590 env[1524]: time="2025-03-17T18:50:42.860547726Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-ff8db6ccc-8lpcj,Uid:9b7bbe37-740f-49ee-a599-03cedfded2d6,Namespace:calico-system,Attempt:1,} returns sandbox id 
\"1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599\"" Mar 17 18:50:42.961587 systemd-networkd[1687]: vxlan.calico: Link UP Mar 17 18:50:42.961598 systemd-networkd[1687]: vxlan.calico: Gained carrier Mar 17 18:50:42.988000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.988000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.988000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.988000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.988000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.988000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.988000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.988000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.988000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.988000 audit: BPF prog-id=18 op=LOAD Mar 17 18:50:42.988000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffebb23b330 a2=98 a3=ffffffff items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.988000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.989000 audit: BPF prog-id=18 op=UNLOAD Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for 
pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit: BPF prog-id=19 op=LOAD Mar 17 18:50:42.989000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffebb23b140 a2=74 a3=540051 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.989000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.989000 audit: BPF prog-id=19 op=UNLOAD Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.989000 audit: BPF prog-id=20 op=LOAD Mar 17 18:50:42.989000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7ffebb23b170 a2=94 a3=2 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.989000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.990000 audit: BPF prog-id=20 op=UNLOAD Mar 17 18:50:42.990000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.990000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffebb23b040 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.990000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.990000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.990000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffebb23b070 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.990000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.990000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.990000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffebb23af80 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.990000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.990000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.990000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffebb23b090 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.990000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.990000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.990000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffebb23b070 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.990000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.990000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.990000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffebb23b060 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.990000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.990000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.990000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffebb23b090 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.990000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffebb23b070 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.991000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffebb23b090 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.991000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7ffebb23b060 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.991000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=12 a1=7ffebb23b0d0 a2=28 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.991000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.991000 audit: BPF prog-id=21 op=LOAD Mar 17 18:50:42.991000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffebb23af40 a2=40 a3=0 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.991000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.991000 audit: BPF prog-id=21 op=UNLOAD Mar 17 18:50:42.992000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.992000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=0 a1=7ffebb23af30 a2=50 a3=2800 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.992000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=0 a1=7ffebb23af30 a2=50 a3=2800 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.994000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 
scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit: BPF prog-id=22 op=LOAD Mar 17 18:50:42.994000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffebb23a750 a2=94 a3=2 items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.994000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:42.994000 audit: BPF prog-id=22 op=UNLOAD Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 
tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { perfmon } for pid=4935 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit[4935]: AVC avc: denied { bpf } for pid=4935 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:42.994000 audit: BPF prog-id=23 op=LOAD Mar 17 18:50:42.994000 audit[4935]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7ffebb23a850 a2=94 a3=2d items=0 ppid=4804 pid=4935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:42.994000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" 
capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit: BPF prog-id=24 op=LOAD Mar 17 18:50:43.001000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=3 a0=5 a1=7fff7c07a660 a2=98 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.001000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.001000 audit: BPF prog-id=24 op=UNLOAD Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.001000 audit: BPF prog-id=25 op=LOAD Mar 17 18:50:43.001000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff7c07a440 a2=74 a3=540051 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.001000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 
18:50:43.002000 audit: BPF prog-id=25 op=UNLOAD Mar 17 18:50:43.002000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.002000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.002000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.002000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.002000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.002000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.002000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.002000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.002000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.002000 audit: BPF prog-id=26 op=LOAD Mar 17 18:50:43.002000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff7c07a470 a2=94 a3=2 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.002000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.002000 audit: BPF prog-id=26 op=UNLOAD Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { 
perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit: BPF prog-id=27 op=LOAD Mar 17 18:50:43.132000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=5 a1=7fff7c07a330 a2=40 a3=1 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.132000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.132000 audit: BPF prog-id=27 op=UNLOAD Mar 17 18:50:43.132000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.132000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=0 a1=7fff7c07a400 a2=50 a3=7fff7c07a4e0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.132000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.146000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.146000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fff7c07a340 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.146000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.146000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.146000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 
a1=7fff7c07a370 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.146000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.146000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.146000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7fff7c07a280 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.146000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.146000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.146000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fff7c07a390 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.146000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fff7c07a370 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fff7c07a360 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 
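Note on reading the records above: the PROCTITLE field in a Linux audit record is the audited process's command line, hex-encoded because it contains NUL separators between argv elements. As a small illustration only (not part of the captured log), the Python sketch below decodes such strings; the two samples are copied verbatim from records above and decode to the bpftool invocations that appear to be Calico setting up its BPF maps and XDP prefilter programs (the map-create proctitle earlier in this log is truncated by the audit subsystem itself, so it decodes to a cut-off command line).

#!/usr/bin/env python3
# Decode hex-encoded PROCTITLE fields from Linux audit records.
# argv elements are separated by NUL (0x00) bytes in the raw data.

def decode_proctitle(hex_str: str) -> str:
    """Turn an audit PROCTITLE hex string into a readable command line."""
    raw = bytes.fromhex(hex_str)
    # Split on NUL separators and join with spaces for display.
    return " ".join(part.decode("utf-8", errors="replace")
                    for part in raw.split(b"\x00") if part)

if __name__ == "__main__":
    samples = [
        # "bpftool map list --json"
        "627066746F6F6C006D6170006C697374002D2D6A736F6E",
        # "bpftool prog load /usr/lib/calico/bpf/filter.o
        #  /sys/fs/bpf/calico/xdp/prefilter_v1_calico_tmp_A type xdp"
        "627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F"
        "6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864"
        "702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470",
    ]
    for s in samples:
        print(decode_proctitle(s))

Running the sketch prints "bpftool map list --json" and "bpftool prog load /usr/lib/calico/bpf/filter.o /sys/fs/bpf/calico/xdp/prefilter_v1_calico_tmp_A type xdp", matching the AVC/SYSCALL context (comm="bpftool", ppid=4804) recorded around each PROCTITLE line.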
Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fff7c07a390 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7fff7c07a370 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7fff7c07a390 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=12 a1=7fff7c07a360 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=4 a0=12 a1=7fff7c07a3d0 a2=28 a3=0 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7fff7c07a180 a2=50 a3=1 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit: BPF prog-id=28 op=LOAD Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=6 a0=5 a1=7fff7c07a180 a2=94 a3=5 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit: BPF prog-id=28 op=UNLOAD Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=5 a0=0 a1=7fff7c07a230 a2=50 a3=1 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=16 a1=7fff7c07a350 a2=4 a3=38 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: 
denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { confidentiality } for pid=4940 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7fff7c07a3a0 a2=94 a3=6 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.147000 audit[4940]: AVC avc: denied { confidentiality } for pid=4940 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 
tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:50:43.147000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7fff7c079b50 a2=94 a3=83 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.147000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { perfmon } for pid=4940 comm="bpftool" capability=38 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { confidentiality } for pid=4940 comm="bpftool" lockdown_reason="use of bpf to read kernel RAM" scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=lockdown permissive=0 Mar 17 18:50:43.148000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=no exit=-22 a0=5 a1=7fff7c079b50 a2=94 a3=83 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.148000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7fff7c07b590 a2=10 a3=f1f00800 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.148000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7fff7c07b430 a2=10 a3=3 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.148000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7fff7c07b3d0 a2=10 a3=3 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.148000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.148000 audit[4940]: AVC avc: denied { bpf } for pid=4940 comm="bpftool" capability=39 scontext=system_u:system_r:kernel_t:s0 tcontext=system_u:system_r:kernel_t:s0 tclass=capability2 permissive=0 Mar 17 18:50:43.148000 audit[4940]: SYSCALL arch=c000003e syscall=321 success=yes exit=0 a0=f a1=7fff7c07b3d0 a2=10 a3=7 items=0 ppid=4804 pid=4940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.148000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Mar 17 18:50:43.155000 audit: BPF prog-id=23 op=UNLOAD Mar 17 18:50:43.267853 systemd-networkd[1687]: cali4bd3043fa88: Gained IPv6LL Mar 17 18:50:43.309000 audit[4979]: NETFILTER_CFG table=mangle:104 family=2 entries=16 op=nft_register_chain pid=4979 
subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:50:43.309000 audit[4979]: SYSCALL arch=c000003e syscall=46 success=yes exit=6868 a0=3 a1=7ffe620ea790 a2=0 a3=7ffe620ea77c items=0 ppid=4804 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.309000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:50:43.328000 audit[4982]: NETFILTER_CFG table=nat:105 family=2 entries=15 op=nft_register_chain pid=4982 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:50:43.328000 audit[4982]: SYSCALL arch=c000003e syscall=46 success=yes exit=5084 a0=3 a1=7ffc33def730 a2=0 a3=7ffc33def71c items=0 ppid=4804 pid=4982 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.328000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:50:43.345000 audit[4978]: NETFILTER_CFG table=raw:106 family=2 entries=21 op=nft_register_chain pid=4978 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:50:43.345000 audit[4978]: SYSCALL arch=c000003e syscall=46 success=yes exit=8452 a0=3 a1=7fff9c809f60 a2=0 a3=7fff9c809f4c items=0 ppid=4804 pid=4978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.345000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:50:43.358000 audit[4980]: NETFILTER_CFG table=filter:107 family=2 entries=209 op=nft_register_chain pid=4980 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Mar 17 18:50:43.358000 audit[4980]: SYSCALL arch=c000003e syscall=46 success=yes exit=122920 a0=3 a1=7ffdeac041b0 a2=0 a3=7ffdeac0419c items=0 ppid=4804 pid=4980 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.358000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Mar 17 18:50:43.473061 systemd[1]: run-containerd-runc-k8s.io-7724dc134a0c6248f84c12c30eed51e14a73f0ffa47fe0423eec6266d70ad547-runc.Qe7fiX.mount: Deactivated successfully. 
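(Annotation, not part of the captured log.) The NETFILTER_CFG records above use the same hex PROCTITLE convention; a small sketch reusing the decoding shown earlier (the helper name is illustrative only):

    # Same decoding applied to the netfilter-restore PROCTITLE seen above.
    def decode_proctitle(hex_value: str) -> str:
        return " ".join(a.decode() for a in bytes.fromhex(hex_value).split(b"\x00"))

    print(decode_proctitle(
        "69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368"
        "002D2D766572626F7365002D2D77616974003130"
        "002D2D776169742D696E74657276616C003530303030"
    ))
    # -> iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000

This matches the comm="iptables-nft-re" (the kernel truncates comm to 15 characters) and exe="/usr/sbin/xtables-nft-multi" fields; syscall=46 in those records is sendmsg(2), the netlink call that submits the ruleset.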
Mar 17 18:50:43.581000 audit[5000]: NETFILTER_CFG table=filter:108 family=2 entries=10 op=nft_register_rule pid=5000 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:43.581000 audit[5000]: SYSCALL arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7ffdd9030300 a2=0 a3=7ffdd90302ec items=0 ppid=2885 pid=5000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.581000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:43.651000 audit[5000]: NETFILTER_CFG table=nat:109 family=2 entries=56 op=nft_register_chain pid=5000 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:43.651000 audit[5000]: SYSCALL arch=c000003e syscall=46 success=yes exit=19860 a0=3 a1=7ffdd9030300 a2=0 a3=7ffdd90302ec items=0 ppid=2885 pid=5000 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:43.651000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:43.971807 systemd-networkd[1687]: cali015a9146b1d: Gained IPv6LL Mar 17 18:50:44.484071 systemd-networkd[1687]: vxlan.calico: Gained IPv6LL Mar 17 18:50:44.855878 env[1524]: time="2025-03-17T18:50:44.855830845Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:44.861766 env[1524]: time="2025-03-17T18:50:44.861726881Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:44.865174 env[1524]: time="2025-03-17T18:50:44.865136201Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:44.868478 env[1524]: time="2025-03-17T18:50:44.868445721Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:44.868993 env[1524]: time="2025-03-17T18:50:44.868962024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Mar 17 18:50:44.870444 env[1524]: time="2025-03-17T18:50:44.870414733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\"" Mar 17 18:50:44.872366 env[1524]: time="2025-03-17T18:50:44.872045543Z" level=info msg="CreateContainer within sandbox \"20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 18:50:44.911296 env[1524]: time="2025-03-17T18:50:44.911245879Z" level=info msg="CreateContainer within sandbox \"20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id 
\"eab1d46559f3ea1f9ba7308e9e748a7fb972f90b98181a54e036e75bd720ac14\"" Mar 17 18:50:44.913249 env[1524]: time="2025-03-17T18:50:44.912860389Z" level=info msg="StartContainer for \"eab1d46559f3ea1f9ba7308e9e748a7fb972f90b98181a54e036e75bd720ac14\"" Mar 17 18:50:44.959104 systemd[1]: run-containerd-runc-k8s.io-eab1d46559f3ea1f9ba7308e9e748a7fb972f90b98181a54e036e75bd720ac14-runc.xwe7gS.mount: Deactivated successfully. Mar 17 18:50:45.024239 env[1524]: time="2025-03-17T18:50:45.024174160Z" level=info msg="StartContainer for \"eab1d46559f3ea1f9ba7308e9e748a7fb972f90b98181a54e036e75bd720ac14\" returns successfully" Mar 17 18:50:45.248616 env[1524]: time="2025-03-17T18:50:45.248552800Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:45.255716 env[1524]: time="2025-03-17T18:50:45.255658543Z" level=info msg="ImageUpdate event &ImageUpdate{Name:sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:45.261080 env[1524]: time="2025-03-17T18:50:45.261037075Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:45.265940 env[1524]: time="2025-03-17T18:50:45.265898604Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/apiserver@sha256:b8c43e264fe52e0c327b0bf3ac882a0224b33bdd7f4ff58a74242da7d9b00486,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:45.266661 env[1524]: time="2025-03-17T18:50:45.266630308Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.29.1\" returns image reference \"sha256:421726ace5ed13894f7edf594dd3a462947aedc13d0f69d08525d7369477fb70\"" Mar 17 18:50:45.269178 env[1524]: time="2025-03-17T18:50:45.269145023Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\"" Mar 17 18:50:45.270546 env[1524]: time="2025-03-17T18:50:45.270464831Z" level=info msg="CreateContainer within sandbox \"b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Mar 17 18:50:45.309907 env[1524]: time="2025-03-17T18:50:45.309853167Z" level=info msg="CreateContainer within sandbox \"b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"24ea6057abbff307334f9d303c5f0f2d8f6be4ebc313a58b12ebd1d193ca15db\"" Mar 17 18:50:45.310927 env[1524]: time="2025-03-17T18:50:45.310892273Z" level=info msg="StartContainer for \"24ea6057abbff307334f9d303c5f0f2d8f6be4ebc313a58b12ebd1d193ca15db\"" Mar 17 18:50:45.405489 env[1524]: time="2025-03-17T18:50:45.405443238Z" level=info msg="StartContainer for \"24ea6057abbff307334f9d303c5f0f2d8f6be4ebc313a58b12ebd1d193ca15db\" returns successfully" Mar 17 18:50:45.576882 kubelet[2704]: I0317 18:50:45.576735 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b476d9448-c9626" podStartSLOduration=30.417060336 podStartE2EDuration="35.576711661s" podCreationTimestamp="2025-03-17 18:50:10 +0000 UTC" firstStartedPulling="2025-03-17 18:50:40.108769794 +0000 UTC m=+52.356869892" lastFinishedPulling="2025-03-17 18:50:45.268421119 +0000 UTC m=+57.516521217" 
observedRunningTime="2025-03-17 18:50:45.541950553 +0000 UTC m=+57.790050651" watchObservedRunningTime="2025-03-17 18:50:45.576711661 +0000 UTC m=+57.824811859" Mar 17 18:50:45.592000 audit[5076]: NETFILTER_CFG table=filter:110 family=2 entries=10 op=nft_register_rule pid=5076 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:45.592000 audit[5076]: SYSCALL arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7ffe8a00c000 a2=0 a3=7ffe8a00bfec items=0 ppid=2885 pid=5076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:45.592000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:45.597000 audit[5076]: NETFILTER_CFG table=nat:111 family=2 entries=20 op=nft_register_rule pid=5076 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:45.597000 audit[5076]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7ffe8a00c000 a2=0 a3=7ffe8a00bfec items=0 ppid=2885 pid=5076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:45.597000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:45.623000 audit[5078]: NETFILTER_CFG table=filter:112 family=2 entries=10 op=nft_register_rule pid=5078 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:45.623000 audit[5078]: SYSCALL arch=c000003e syscall=46 success=yes exit=3676 a0=3 a1=7fffac1c92a0 a2=0 a3=7fffac1c928c items=0 ppid=2885 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:45.623000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:45.630000 audit[5078]: NETFILTER_CFG table=nat:113 family=2 entries=20 op=nft_register_rule pid=5078 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:45.630000 audit[5078]: SYSCALL arch=c000003e syscall=46 success=yes exit=5772 a0=3 a1=7fffac1c92a0 a2=0 a3=7fffac1c928c items=0 ppid=2885 pid=5078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:45.630000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:46.270197 kubelet[2704]: I0317 18:50:46.270129 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-6b476d9448-hjl2z" podStartSLOduration=31.470976314 podStartE2EDuration="36.270105489s" podCreationTimestamp="2025-03-17 18:50:10 +0000 UTC" firstStartedPulling="2025-03-17 18:50:40.071125857 +0000 UTC m=+52.319225955" lastFinishedPulling="2025-03-17 18:50:44.870255032 +0000 UTC m=+57.118355130" observedRunningTime="2025-03-17 18:50:45.579781079 +0000 UTC m=+57.827881177" watchObservedRunningTime="2025-03-17 18:50:46.270105489 +0000 UTC m=+58.518205587" Mar 17 
18:50:46.658398 kernel: kauditd_printk_skb: 506 callbacks suppressed Mar 17 18:50:46.658539 kernel: audit: type=1325 audit(1742237446.642:410): table=filter:114 family=2 entries=9 op=nft_register_rule pid=5080 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:46.642000 audit[5080]: NETFILTER_CFG table=filter:114 family=2 entries=9 op=nft_register_rule pid=5080 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:46.642000 audit[5080]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffdb916c570 a2=0 a3=7ffdb916c55c items=0 ppid=2885 pid=5080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:46.678894 kernel: audit: type=1300 audit(1742237446.642:410): arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7ffdb916c570 a2=0 a3=7ffdb916c55c items=0 ppid=2885 pid=5080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:46.642000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:46.691288 kernel: audit: type=1327 audit(1742237446.642:410): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:46.690000 audit[5080]: NETFILTER_CFG table=nat:115 family=2 entries=27 op=nft_register_chain pid=5080 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:46.703161 kernel: audit: type=1325 audit(1742237446.690:411): table=nat:115 family=2 entries=27 op=nft_register_chain pid=5080 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:46.690000 audit[5080]: SYSCALL arch=c000003e syscall=46 success=yes exit=9348 a0=3 a1=7ffdb916c570 a2=0 a3=7ffdb916c55c items=0 ppid=2885 pid=5080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:46.722719 kernel: audit: type=1300 audit(1742237446.690:411): arch=c000003e syscall=46 success=yes exit=9348 a0=3 a1=7ffdb916c570 a2=0 a3=7ffdb916c55c items=0 ppid=2885 pid=5080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:46.690000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:46.757705 kernel: audit: type=1327 audit(1742237446.690:411): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:47.057077 env[1524]: time="2025-03-17T18:50:47.057030842Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:47.066660 env[1524]: time="2025-03-17T18:50:47.066604298Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 
18:50:47.073029 env[1524]: time="2025-03-17T18:50:47.072994936Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:47.076902 env[1524]: time="2025-03-17T18:50:47.076865659Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/node-driver-registrar@sha256:a338da9488cbaa83c78457c3d7354d84149969c0480e88dd768e036632ff5b76,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:47.077396 env[1524]: time="2025-03-17T18:50:47.077364561Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.29.1\" returns image reference \"sha256:8b7d18f262d5cf6a6343578ad0db68a140c4c9989d9e02c58c27cb5d2c70320f\"" Mar 17 18:50:47.087804 env[1524]: time="2025-03-17T18:50:47.087768422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\"" Mar 17 18:50:47.088873 env[1524]: time="2025-03-17T18:50:47.088837729Z" level=info msg="CreateContainer within sandbox \"486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Mar 17 18:50:47.135905 env[1524]: time="2025-03-17T18:50:47.135857604Z" level=info msg="CreateContainer within sandbox \"486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"9bd80b71a4c472672f889ea5e2c4b1e37bb399b04f36d2346896a77544c642a2\"" Mar 17 18:50:47.137892 env[1524]: time="2025-03-17T18:50:47.136651509Z" level=info msg="StartContainer for \"9bd80b71a4c472672f889ea5e2c4b1e37bb399b04f36d2346896a77544c642a2\"" Mar 17 18:50:47.204991 env[1524]: time="2025-03-17T18:50:47.204951909Z" level=info msg="StartContainer for \"9bd80b71a4c472672f889ea5e2c4b1e37bb399b04f36d2346896a77544c642a2\" returns successfully" Mar 17 18:50:47.414034 kubelet[2704]: I0317 18:50:47.413903 2704 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Mar 17 18:50:47.414034 kubelet[2704]: I0317 18:50:47.413947 2704 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Mar 17 18:50:47.734000 audit[5119]: NETFILTER_CFG table=filter:116 family=2 entries=8 op=nft_register_rule pid=5119 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:47.734000 audit[5119]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7fff9fca7320 a2=0 a3=7fff9fca730c items=0 ppid=2885 pid=5119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:47.763704 kernel: audit: type=1325 audit(1742237447.734:412): table=filter:116 family=2 entries=8 op=nft_register_rule pid=5119 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:47.763803 kernel: audit: type=1300 audit(1742237447.734:412): arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7fff9fca7320 a2=0 a3=7fff9fca730c items=0 ppid=2885 pid=5119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 
18:50:47.734000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:47.772998 kernel: audit: type=1327 audit(1742237447.734:412): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:47.747000 audit[5119]: NETFILTER_CFG table=nat:117 family=2 entries=34 op=nft_register_chain pid=5119 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:47.782503 kernel: audit: type=1325 audit(1742237447.747:413): table=nat:117 family=2 entries=34 op=nft_register_chain pid=5119 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:50:47.747000 audit[5119]: SYSCALL arch=c000003e syscall=46 success=yes exit=11236 a0=3 a1=7fff9fca7320 a2=0 a3=7fff9fca730c items=0 ppid=2885 pid=5119 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:50:47.747000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:50:48.281467 env[1524]: time="2025-03-17T18:50:48.281406404Z" level=info msg="StopPodSandbox for \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\"" Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.317 [WARNING][5135] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0", GenerateName:"calico-apiserver-6b476d9448-", Namespace:"calico-apiserver", SelfLink:"", UID:"181cc807-8e75-4c0c-9bab-b609021ab74e", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b476d9448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5", Pod:"calico-apiserver-6b476d9448-c9626", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.103.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliebc6c73aca4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.317 [INFO][5135] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.317 [INFO][5135] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns 
name, ignoring. ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" iface="eth0" netns="" Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.317 [INFO][5135] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.317 [INFO][5135] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.339 [INFO][5141] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" HandleID="k8s-pod-network.ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.340 [INFO][5141] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.343 [INFO][5141] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.350 [WARNING][5141] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" HandleID="k8s-pod-network.ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.351 [INFO][5141] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" HandleID="k8s-pod-network.ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.360 [INFO][5141] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:48.363531 env[1524]: 2025-03-17 18:50:48.361 [INFO][5135] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:48.364553 env[1524]: time="2025-03-17T18:50:48.364511787Z" level=info msg="TearDown network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\" successfully" Mar 17 18:50:48.364658 env[1524]: time="2025-03-17T18:50:48.364636688Z" level=info msg="StopPodSandbox for \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\" returns successfully" Mar 17 18:50:48.366748 env[1524]: time="2025-03-17T18:50:48.366639199Z" level=info msg="RemovePodSandbox for \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\"" Mar 17 18:50:48.366958 env[1524]: time="2025-03-17T18:50:48.366907201Z" level=info msg="Forcibly stopping sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\"" Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.417 [WARNING][5165] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0", GenerateName:"calico-apiserver-6b476d9448-", Namespace:"calico-apiserver", SelfLink:"", UID:"181cc807-8e75-4c0c-9bab-b609021ab74e", ResourceVersion:"864", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b476d9448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"b7fa7ffd4600abe5c5b1e13e826b625191137fafae90417036af38cfa69d9cb5", Pod:"calico-apiserver-6b476d9448-c9626", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.103.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"caliebc6c73aca4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.417 [INFO][5165] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.417 [INFO][5165] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" iface="eth0" netns="" Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.417 [INFO][5165] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.417 [INFO][5165] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.463 [INFO][5173] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" HandleID="k8s-pod-network.ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.463 [INFO][5173] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.463 [INFO][5173] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.471 [WARNING][5173] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" HandleID="k8s-pod-network.ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.471 [INFO][5173] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" HandleID="k8s-pod-network.ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--c9626-eth0" Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.472 [INFO][5173] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:48.475467 env[1524]: 2025-03-17 18:50:48.474 [INFO][5165] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0" Mar 17 18:50:48.475467 env[1524]: time="2025-03-17T18:50:48.475421231Z" level=info msg="TearDown network for sandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\" successfully" Mar 17 18:50:48.484917 env[1524]: time="2025-03-17T18:50:48.484824886Z" level=info msg="RemovePodSandbox \"ad745b3ef552a077199ebf7074a83d096aa9b28152f43047d8f8d586b8dd5cc0\" returns successfully" Mar 17 18:50:48.485699 env[1524]: time="2025-03-17T18:50:48.485629590Z" level=info msg="StopPodSandbox for \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\"" Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.550 [WARNING][5195] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0", GenerateName:"calico-kube-controllers-ff8db6ccc-", Namespace:"calico-system", SelfLink:"", UID:"9b7bbe37-740f-49ee-a599-03cedfded2d6", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"ff8db6ccc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599", Pod:"calico-kube-controllers-ff8db6ccc-8lpcj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.103.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali015a9146b1d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.550 [INFO][5195] cni-plugin/k8s.go 608: Cleaning up netns 
ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.550 [INFO][5195] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" iface="eth0" netns="" Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.550 [INFO][5195] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.550 [INFO][5195] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.587 [INFO][5201] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" HandleID="k8s-pod-network.81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.587 [INFO][5201] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.587 [INFO][5201] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.594 [WARNING][5201] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" HandleID="k8s-pod-network.81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.595 [INFO][5201] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" HandleID="k8s-pod-network.81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.596 [INFO][5201] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:48.598481 env[1524]: 2025-03-17 18:50:48.597 [INFO][5195] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:48.600267 env[1524]: time="2025-03-17T18:50:48.598444146Z" level=info msg="TearDown network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\" successfully" Mar 17 18:50:48.600267 env[1524]: time="2025-03-17T18:50:48.599802754Z" level=info msg="StopPodSandbox for \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\" returns successfully" Mar 17 18:50:48.600377 env[1524]: time="2025-03-17T18:50:48.600346357Z" level=info msg="RemovePodSandbox for \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\"" Mar 17 18:50:48.600432 env[1524]: time="2025-03-17T18:50:48.600387757Z" level=info msg="Forcibly stopping sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\"" Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.654 [WARNING][5222] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0", GenerateName:"calico-kube-controllers-ff8db6ccc-", Namespace:"calico-system", SelfLink:"", UID:"9b7bbe37-740f-49ee-a599-03cedfded2d6", ResourceVersion:"834", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"ff8db6ccc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599", Pod:"calico-kube-controllers-ff8db6ccc-8lpcj", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.103.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali015a9146b1d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.654 [INFO][5222] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.654 [INFO][5222] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" iface="eth0" netns="" Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.654 [INFO][5222] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.654 [INFO][5222] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.673 [INFO][5228] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" HandleID="k8s-pod-network.81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.673 [INFO][5228] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.673 [INFO][5228] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.680 [WARNING][5228] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" HandleID="k8s-pod-network.81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.680 [INFO][5228] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" HandleID="k8s-pod-network.81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--kube--controllers--ff8db6ccc--8lpcj-eth0" Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.681 [INFO][5228] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:48.683463 env[1524]: 2025-03-17 18:50:48.682 [INFO][5222] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b" Mar 17 18:50:48.684184 env[1524]: time="2025-03-17T18:50:48.683516640Z" level=info msg="TearDown network for sandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\" successfully" Mar 17 18:50:48.873586 env[1524]: time="2025-03-17T18:50:48.873463843Z" level=info msg="RemovePodSandbox \"81b84efd493623d2b47a86758e3aa9155762fdcb6e310744bd20573de6a8d69b\" returns successfully" Mar 17 18:50:48.874894 env[1524]: time="2025-03-17T18:50:48.874860451Z" level=info msg="StopPodSandbox for \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\"" Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:48.958 [WARNING][5248] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7349d13c-6ed5-461e-80f1-d59a9207db7f", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65", Pod:"coredns-7db6d8ff4d-58lvk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.103.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia35f6408891", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, 
NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:48.958 [INFO][5248] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:48.958 [INFO][5248] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" iface="eth0" netns="" Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:48.958 [INFO][5248] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:48.958 [INFO][5248] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:48.990 [INFO][5254] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" HandleID="k8s-pod-network.420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:48.990 [INFO][5254] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:48.990 [INFO][5254] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:49.004 [WARNING][5254] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" HandleID="k8s-pod-network.420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:49.004 [INFO][5254] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" HandleID="k8s-pod-network.420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:49.006 [INFO][5254] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:49.009135 env[1524]: 2025-03-17 18:50:49.007 [INFO][5248] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:49.010085 env[1524]: time="2025-03-17T18:50:49.009174131Z" level=info msg="TearDown network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\" successfully" Mar 17 18:50:49.010085 env[1524]: time="2025-03-17T18:50:49.009211431Z" level=info msg="StopPodSandbox for \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\" returns successfully" Mar 17 18:50:49.010407 env[1524]: time="2025-03-17T18:50:49.010374438Z" level=info msg="RemovePodSandbox for \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\"" Mar 17 18:50:49.010465 env[1524]: time="2025-03-17T18:50:49.010420838Z" level=info msg="Forcibly stopping sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\"" Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.088 [WARNING][5276] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"7349d13c-6ed5-461e-80f1-d59a9207db7f", ResourceVersion:"800", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"a1ccfa4462a1284c85c15c406b4164480fcd390e8a350cc89b152fee46505b65", Pod:"coredns-7db6d8ff4d-58lvk", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.103.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calia35f6408891", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.088 [INFO][5276] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.088 [INFO][5276] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. 
ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" iface="eth0" netns="" Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.088 [INFO][5276] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.088 [INFO][5276] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.122 [INFO][5283] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" HandleID="k8s-pod-network.420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.122 [INFO][5283] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.122 [INFO][5283] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.132 [WARNING][5283] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" HandleID="k8s-pod-network.420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.132 [INFO][5283] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" HandleID="k8s-pod-network.420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--58lvk-eth0" Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.133 [INFO][5283] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:49.136832 env[1524]: 2025-03-17 18:50:49.135 [INFO][5276] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e" Mar 17 18:50:49.137793 env[1524]: time="2025-03-17T18:50:49.137754271Z" level=info msg="TearDown network for sandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\" successfully" Mar 17 18:50:49.146999 env[1524]: time="2025-03-17T18:50:49.146956424Z" level=info msg="RemovePodSandbox \"420ad9eba2a143efc389f297676619a6f7ff3fc97d42ae38583b5c0d51c1e40e\" returns successfully" Mar 17 18:50:49.147828 env[1524]: time="2025-03-17T18:50:49.147799929Z" level=info msg="StopPodSandbox for \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\"" Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.234 [WARNING][5303] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"309db727-4be2-4bfe-b325-49e81b2078ad", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219", Pod:"coredns-7db6d8ff4d-hvsfz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.103.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4bd3043fa88", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.235 [INFO][5303] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.235 [INFO][5303] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" iface="eth0" netns="" Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.235 [INFO][5303] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.235 [INFO][5303] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.272 [INFO][5309] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" HandleID="k8s-pod-network.3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.272 [INFO][5309] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.272 [INFO][5309] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.281 [WARNING][5309] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" HandleID="k8s-pod-network.3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.281 [INFO][5309] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" HandleID="k8s-pod-network.3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.283 [INFO][5309] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:49.286445 env[1524]: 2025-03-17 18:50:49.284 [INFO][5303] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:49.287523 env[1524]: time="2025-03-17T18:50:49.287483933Z" level=info msg="TearDown network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\" successfully" Mar 17 18:50:49.287601 env[1524]: time="2025-03-17T18:50:49.287585933Z" level=info msg="StopPodSandbox for \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\" returns successfully" Mar 17 18:50:49.288203 env[1524]: time="2025-03-17T18:50:49.288174137Z" level=info msg="RemovePodSandbox for \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\"" Mar 17 18:50:49.288394 env[1524]: time="2025-03-17T18:50:49.288352938Z" level=info msg="Forcibly stopping sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\"" Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.397 [WARNING][5329] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0", GenerateName:"coredns-7db6d8ff4d-", Namespace:"kube-system", SelfLink:"", UID:"309db727-4be2-4bfe-b325-49e81b2078ad", ResourceVersion:"827", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7db6d8ff4d", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"45d686273d57c54889790886d42e27e2a60a8d182a138f43a048546e2d12b219", Pod:"coredns-7db6d8ff4d-hvsfz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.103.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4bd3043fa88", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.408 [INFO][5329] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.408 [INFO][5329] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" iface="eth0" netns="" Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.408 [INFO][5329] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.408 [INFO][5329] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.456 [INFO][5335] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" HandleID="k8s-pod-network.3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.456 [INFO][5335] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.456 [INFO][5335] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.463 [WARNING][5335] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" HandleID="k8s-pod-network.3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.463 [INFO][5335] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" HandleID="k8s-pod-network.3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Workload="ci--3510.3.7--a--b312ad98ee-k8s-coredns--7db6d8ff4d--hvsfz-eth0" Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.464 [INFO][5335] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:49.467319 env[1524]: 2025-03-17 18:50:49.466 [INFO][5329] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f" Mar 17 18:50:49.468723 env[1524]: time="2025-03-17T18:50:49.467288067Z" level=info msg="TearDown network for sandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\" successfully" Mar 17 18:50:49.480747 env[1524]: time="2025-03-17T18:50:49.480704745Z" level=info msg="RemovePodSandbox \"3254367b49f438f35ab8a07b772bc498092da458d191ce4a8f968575da907c9f\" returns successfully" Mar 17 18:50:49.481284 env[1524]: time="2025-03-17T18:50:49.481251248Z" level=info msg="StopPodSandbox for \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\"" Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.567 [WARNING][5354] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0", GenerateName:"calico-apiserver-6b476d9448-", Namespace:"calico-apiserver", SelfLink:"", UID:"7db7f05a-c0bd-436a-b834-87825c403071", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b476d9448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9", Pod:"calico-apiserver-6b476d9448-hjl2z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.103.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2f2537c9527", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.567 [INFO][5354] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 
18:50:49.613860 env[1524]: 2025-03-17 18:50:49.567 [INFO][5354] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" iface="eth0" netns="" Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.567 [INFO][5354] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.567 [INFO][5354] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.602 [INFO][5360] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" HandleID="k8s-pod-network.84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.602 [INFO][5360] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.602 [INFO][5360] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.609 [WARNING][5360] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" HandleID="k8s-pod-network.84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.609 [INFO][5360] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" HandleID="k8s-pod-network.84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.611 [INFO][5360] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:49.613860 env[1524]: 2025-03-17 18:50:49.612 [INFO][5354] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:49.614614 env[1524]: time="2025-03-17T18:50:49.614577815Z" level=info msg="TearDown network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\" successfully" Mar 17 18:50:49.614722 env[1524]: time="2025-03-17T18:50:49.614702616Z" level=info msg="StopPodSandbox for \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\" returns successfully" Mar 17 18:50:49.615344 env[1524]: time="2025-03-17T18:50:49.615311719Z" level=info msg="RemovePodSandbox for \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\"" Mar 17 18:50:49.615508 env[1524]: time="2025-03-17T18:50:49.615470220Z" level=info msg="Forcibly stopping sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\"" Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.710 [WARNING][5380] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0", GenerateName:"calico-apiserver-6b476d9448-", Namespace:"calico-apiserver", SelfLink:"", UID:"7db7f05a-c0bd-436a-b834-87825c403071", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6b476d9448", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"20c7149a49341464a348d14be453efa7b31e225d0eab85ac99268ce4119037f9", Pod:"calico-apiserver-6b476d9448-hjl2z", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.103.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2f2537c9527", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.710 [INFO][5380] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.710 [INFO][5380] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" iface="eth0" netns="" Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.711 [INFO][5380] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.711 [INFO][5380] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.774 [INFO][5386] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" HandleID="k8s-pod-network.84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.774 [INFO][5386] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.774 [INFO][5386] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.783 [WARNING][5386] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" HandleID="k8s-pod-network.84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.784 [INFO][5386] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" HandleID="k8s-pod-network.84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Workload="ci--3510.3.7--a--b312ad98ee-k8s-calico--apiserver--6b476d9448--hjl2z-eth0" Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.787 [INFO][5386] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:49.800565 env[1524]: 2025-03-17 18:50:49.795 [INFO][5380] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b" Mar 17 18:50:49.801735 env[1524]: time="2025-03-17T18:50:49.801232190Z" level=info msg="TearDown network for sandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\" successfully" Mar 17 18:50:49.814094 env[1524]: time="2025-03-17T18:50:49.814050063Z" level=info msg="RemovePodSandbox \"84af66cd97cb58a80f2073751406b6b97d911f1f3d04684a2833b2d62ef1b00b\" returns successfully" Mar 17 18:50:49.814699 env[1524]: time="2025-03-17T18:50:49.814654267Z" level=info msg="StopPodSandbox for \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\"" Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:49.953 [WARNING][5405] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7", Pod:"csi-node-driver-tt74p", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.103.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib10a649341f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:49.953 [INFO][5405] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" 
Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:49.953 [INFO][5405] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" iface="eth0" netns="" Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:49.953 [INFO][5405] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:49.953 [INFO][5405] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:50.039 [INFO][5411] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" HandleID="k8s-pod-network.cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:50.039 [INFO][5411] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:50.040 [INFO][5411] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:50.047 [WARNING][5411] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" HandleID="k8s-pod-network.cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:50.047 [INFO][5411] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" HandleID="k8s-pod-network.cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:50.048 [INFO][5411] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:50.055687 env[1524]: 2025-03-17 18:50:50.053 [INFO][5405] cni-plugin/k8s.go 621: Teardown processing complete. 
ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:50.056682 env[1524]: time="2025-03-17T18:50:50.056631457Z" level=info msg="TearDown network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\" successfully" Mar 17 18:50:50.056796 env[1524]: time="2025-03-17T18:50:50.056772257Z" level=info msg="StopPodSandbox for \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\" returns successfully" Mar 17 18:50:50.057386 env[1524]: time="2025-03-17T18:50:50.057346961Z" level=info msg="RemovePodSandbox for \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\"" Mar 17 18:50:50.057583 env[1524]: time="2025-03-17T18:50:50.057519162Z" level=info msg="Forcibly stopping sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\"" Mar 17 18:50:50.163002 env[1524]: time="2025-03-17T18:50:50.162564861Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:50.174133 env[1524]: time="2025-03-17T18:50:50.174089527Z" level=info msg="ImageCreate event &ImageCreate{Name:sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:50.179489 env[1524]: time="2025-03-17T18:50:50.178250551Z" level=info msg="ImageUpdate event &ImageUpdate{Name:ghcr.io/flatcar/calico/kube-controllers:v3.29.1,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:50.181790 env[1524]: time="2025-03-17T18:50:50.181754270Z" level=info msg="ImageCreate event &ImageCreate{Name:ghcr.io/flatcar/calico/kube-controllers@sha256:1072d6a98167a14ca361e9ce757733f9bae36d1f1c6a9621ea10934b6b1e10d9,Labels:map[string]string{io.cri-containerd.image: managed,},XXX_unrecognized:[],}" Mar 17 18:50:50.182074 env[1524]: time="2025-03-17T18:50:50.182039872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.29.1\" returns image reference \"sha256:6331715a2ae96b18a770a395cac108321d108e445e08b616e5bc9fbd1f9c21da\"" Mar 17 18:50:50.204808 env[1524]: time="2025-03-17T18:50:50.204112998Z" level=info msg="CreateContainer within sandbox \"1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.139 [WARNING][5432] cni-plugin/k8s.go 572: CNI_CONTAINERID does not match WorkloadEndpoint ContainerID, don't delete WEP. 
ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" WorkloadEndpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"eef2eaf0-c357-4fb2-81b7-bc23a7b1f2ce", ResourceVersion:"879", Generation:0, CreationTimestamp:time.Date(2025, time.March, 17, 18, 50, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"65bf684474", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-3510.3.7-a-b312ad98ee", ContainerID:"486c536203e927ee6c1213bbe59b4c61e8911b378b9e24ec8f16a034aab6fec7", Pod:"csi-node-driver-tt74p", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.103.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calib10a649341f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil)}} Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.139 [INFO][5432] cni-plugin/k8s.go 608: Cleaning up netns ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.139 [INFO][5432] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" iface="eth0" netns="" Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.139 [INFO][5432] cni-plugin/k8s.go 615: Releasing IP address(es) ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.139 [INFO][5432] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.173 [INFO][5438] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" HandleID="k8s-pod-network.cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.175 [INFO][5438] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.176 [INFO][5438] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.181 [WARNING][5438] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" HandleID="k8s-pod-network.cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.201 [INFO][5438] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" HandleID="k8s-pod-network.cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Workload="ci--3510.3.7--a--b312ad98ee-k8s-csi--node--driver--tt74p-eth0" Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.206 [INFO][5438] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Mar 17 18:50:50.208657 env[1524]: 2025-03-17 18:50:50.207 [INFO][5432] cni-plugin/k8s.go 621: Teardown processing complete. ContainerID="cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e" Mar 17 18:50:50.209280 env[1524]: time="2025-03-17T18:50:50.208714724Z" level=info msg="TearDown network for sandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\" successfully" Mar 17 18:50:50.233981 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3122852353.mount: Deactivated successfully. Mar 17 18:50:50.237663 env[1524]: time="2025-03-17T18:50:50.237618789Z" level=info msg="RemovePodSandbox \"cdf02da6c4d5b0b30a3123a27ee56d9ed156598de22f827ed2ecda664e95e15e\" returns successfully" Mar 17 18:50:50.251221 env[1524]: time="2025-03-17T18:50:50.251183767Z" level=info msg="CreateContainer within sandbox \"1006f6c0fbaaa74e73e28eecb81d5b2a8f592563782a992247f239d589a88599\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29\"" Mar 17 18:50:50.252001 env[1524]: time="2025-03-17T18:50:50.251969871Z" level=info msg="StartContainer for \"2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29\"" Mar 17 18:50:50.324919 env[1524]: time="2025-03-17T18:50:50.324820187Z" level=info msg="StartContainer for \"2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29\" returns successfully" Mar 17 18:50:50.567990 kubelet[2704]: I0317 18:50:50.567099 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-tt74p" podStartSLOduration=32.554290924 podStartE2EDuration="39.567079869s" podCreationTimestamp="2025-03-17 18:50:11 +0000 UTC" firstStartedPulling="2025-03-17 18:50:40.065781424 +0000 UTC m=+52.313881522" lastFinishedPulling="2025-03-17 18:50:47.078570269 +0000 UTC m=+59.326670467" observedRunningTime="2025-03-17 18:50:47.546617712 +0000 UTC m=+59.794717910" watchObservedRunningTime="2025-03-17 18:50:50.567079869 +0000 UTC m=+62.815179967" Mar 17 18:50:50.567990 kubelet[2704]: I0317 18:50:50.567790 2704 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-ff8db6ccc-8lpcj" podStartSLOduration=32.246103927 podStartE2EDuration="39.567773173s" podCreationTimestamp="2025-03-17 18:50:11 +0000 UTC" firstStartedPulling="2025-03-17 18:50:42.861787734 +0000 UTC m=+55.109887932" lastFinishedPulling="2025-03-17 18:50:50.18345708 +0000 UTC m=+62.431557178" observedRunningTime="2025-03-17 18:50:50.564984657 +0000 UTC m=+62.813084855" watchObservedRunningTime="2025-03-17 18:50:50.567773173 +0000 UTC m=+62.815873271" Mar 17 18:50:55.764629 systemd[1]: 
run-containerd-runc-k8s.io-2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29-runc.AUO41u.mount: Deactivated successfully. Mar 17 18:51:09.477618 systemd[1]: run-containerd-runc-k8s.io-2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29-runc.lFo0ag.mount: Deactivated successfully. Mar 17 18:51:25.762997 systemd[1]: run-containerd-runc-k8s.io-2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29-runc.UQV6MA.mount: Deactivated successfully. Mar 17 18:51:43.294346 systemd[1]: run-containerd-runc-k8s.io-7724dc134a0c6248f84c12c30eed51e14a73f0ffa47fe0423eec6266d70ad547-runc.gsVhjy.mount: Deactivated successfully. Mar 17 18:51:55.763050 systemd[1]: run-containerd-runc-k8s.io-2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29-runc.3pTI8O.mount: Deactivated successfully. Mar 17 18:52:09.477908 systemd[1]: run-containerd-runc-k8s.io-2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29-runc.0J6x9l.mount: Deactivated successfully. Mar 17 18:52:13.295033 systemd[1]: run-containerd-runc-k8s.io-7724dc134a0c6248f84c12c30eed51e14a73f0ffa47fe0423eec6266d70ad547-runc.e7dARl.mount: Deactivated successfully. Mar 17 18:52:22.344771 systemd[1]: Started sshd@7-10.200.8.36:22-10.200.16.10:51618.service. Mar 17 18:52:22.365687 kernel: kauditd_printk_skb: 2 callbacks suppressed Mar 17 18:52:22.365840 kernel: audit: type=1130 audit(1742237542.345:414): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.200.8.36:22-10.200.16.10:51618 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:22.345000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.200.8.36:22-10.200.16.10:51618 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:22.973000 audit[5726]: USER_ACCT pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:22.993020 sshd[5726]: Accepted publickey for core from 10.200.16.10 port 51618 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:52:22.993459 sshd[5726]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:52:22.993757 kernel: audit: type=1101 audit(1742237542.973:415): pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:22.991000 audit[5726]: CRED_ACQ pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:23.003925 systemd[1]: Started session-10.scope. Mar 17 18:52:23.004966 systemd-logind[1500]: New session 10 of user core. 
Mar 17 18:52:23.010930 kernel: audit: type=1103 audit(1742237542.991:416): pid=5726 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:22.991000 audit[5726]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0e6a1bb0 a2=3 a3=0 items=0 ppid=1 pid=5726 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:23.038226 kernel: audit: type=1006 audit(1742237542.991:417): pid=5726 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Mar 17 18:52:23.038305 kernel: audit: type=1300 audit(1742237542.991:417): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd0e6a1bb0 a2=3 a3=0 items=0 ppid=1 pid=5726 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:22.991000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:23.043385 kernel: audit: type=1327 audit(1742237542.991:417): proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:23.043452 kernel: audit: type=1105 audit(1742237543.009:418): pid=5726 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:23.009000 audit[5726]: USER_START pid=5726 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:23.011000 audit[5729]: CRED_ACQ pid=5729 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:23.074107 kernel: audit: type=1103 audit(1742237543.011:419): pid=5729 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:23.487834 sshd[5726]: pam_unix(sshd:session): session closed for user core Mar 17 18:52:23.487000 audit[5726]: USER_END pid=5726 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:23.490853 systemd[1]: sshd@7-10.200.8.36:22-10.200.16.10:51618.service: Deactivated successfully. Mar 17 18:52:23.491788 systemd[1]: session-10.scope: Deactivated successfully. Mar 17 18:52:23.498781 systemd-logind[1500]: Session 10 logged out. Waiting for processes to exit. Mar 17 18:52:23.499709 systemd-logind[1500]: Removed session 10. 
Mar 17 18:52:23.487000 audit[5726]: CRED_DISP pid=5726 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:23.520784 kernel: audit: type=1106 audit(1742237543.487:420): pid=5726 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:23.520982 kernel: audit: type=1104 audit(1742237543.487:421): pid=5726 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:23.489000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.200.8.36:22-10.200.16.10:51618 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:25.769885 systemd[1]: run-containerd-runc-k8s.io-2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29-runc.vIMUIw.mount: Deactivated successfully. Mar 17 18:52:28.613598 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:52:28.613764 kernel: audit: type=1130 audit(1742237548.591:423): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.200.8.36:22-10.200.16.10:40146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:28.591000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.200.8.36:22-10.200.16.10:40146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:28.592517 systemd[1]: Started sshd@8-10.200.8.36:22-10.200.16.10:40146.service. Mar 17 18:52:29.213000 audit[5758]: USER_ACCT pid=5758 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.232548 sshd[5758]: Accepted publickey for core from 10.200.16.10 port 40146 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:52:29.232933 kernel: audit: type=1101 audit(1742237549.213:424): pid=5758 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.231000 audit[5758]: CRED_ACQ pid=5758 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.238424 systemd[1]: Started session-11.scope. 
Mar 17 18:52:29.233391 sshd[5758]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:52:29.248942 kernel: audit: type=1103 audit(1742237549.231:425): pid=5758 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.239441 systemd-logind[1500]: New session 11 of user core. Mar 17 18:52:29.231000 audit[5758]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd9190dca0 a2=3 a3=0 items=0 ppid=1 pid=5758 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:29.279367 kernel: audit: type=1006 audit(1742237549.231:426): pid=5758 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Mar 17 18:52:29.279521 kernel: audit: type=1300 audit(1742237549.231:426): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd9190dca0 a2=3 a3=0 items=0 ppid=1 pid=5758 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:29.231000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:29.247000 audit[5758]: USER_START pid=5758 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.285684 kernel: audit: type=1327 audit(1742237549.231:426): proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:29.285775 kernel: audit: type=1105 audit(1742237549.247:427): pid=5758 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.249000 audit[5761]: CRED_ACQ pid=5761 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.302688 kernel: audit: type=1103 audit(1742237549.249:428): pid=5761 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.714249 sshd[5758]: pam_unix(sshd:session): session closed for user core Mar 17 18:52:29.714000 audit[5758]: USER_END pid=5758 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.718093 systemd-logind[1500]: Session 11 logged out. Waiting for processes to exit. Mar 17 18:52:29.719542 systemd[1]: sshd@8-10.200.8.36:22-10.200.16.10:40146.service: Deactivated successfully. Mar 17 18:52:29.720513 systemd[1]: session-11.scope: Deactivated successfully. 
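Every audit record above carries an audit(SECONDS.MILLIS:SERIAL) identifier such as audit(1742237549.231:426): a Unix timestamp plus a per-boot serial shared by all records of the same event, which is how the kernel console copies and the journal copies line up. A small sketch for turning it into a readable UTC time; the regex and helper name are mine, not from any tool in this log:

    import re
    from datetime import datetime, timedelta, timezone

    AUDIT_ID = re.compile(r"audit\((\d+)\.(\d+):(\d+)\)")

    def parse_audit_id(text: str):
        """Return (UTC datetime, serial) for the first audit(...) identifier in text."""
        sec, ms, serial = AUDIT_ID.search(text).groups()
        when = datetime.fromtimestamp(int(sec), tz=timezone.utc) + timedelta(milliseconds=int(ms))
        return when, int(serial)

    print(*parse_audit_id("audit(1742237549.231:426)"))
    # 2025-03-17 18:52:29.231000+00:00 426

The result matches the Mar 17 18:52:29 wall-clock prefix on those same entries.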
Mar 17 18:52:29.722053 systemd-logind[1500]: Removed session 11. Mar 17 18:52:29.714000 audit[5758]: CRED_DISP pid=5758 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.748424 kernel: audit: type=1106 audit(1742237549.714:429): pid=5758 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.748535 kernel: audit: type=1104 audit(1742237549.714:430): pid=5758 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:29.714000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.200.8.36:22-10.200.16.10:40146 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:34.829976 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:52:34.830097 kernel: audit: type=1130 audit(1742237554.816:432): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.200.8.36:22-10.200.16.10:40160 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:34.816000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.200.8.36:22-10.200.16.10:40160 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:34.818294 systemd[1]: Started sshd@9-10.200.8.36:22-10.200.16.10:40160.service. Mar 17 18:52:35.440000 audit[5771]: USER_ACCT pid=5771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.444072 sshd[5771]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:52:35.460954 kernel: audit: type=1101 audit(1742237555.440:433): pid=5771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.460991 sshd[5771]: Accepted publickey for core from 10.200.16.10 port 40160 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:52:35.442000 audit[5771]: CRED_ACQ pid=5771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.465736 systemd[1]: Started session-12.scope. Mar 17 18:52:35.466737 systemd-logind[1500]: New session 12 of user core. 
Mar 17 18:52:35.478708 kernel: audit: type=1103 audit(1742237555.442:434): pid=5771 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.489700 kernel: audit: type=1006 audit(1742237555.442:435): pid=5771 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Mar 17 18:52:35.442000 audit[5771]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffffb0871e0 a2=3 a3=0 items=0 ppid=1 pid=5771 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:35.506687 kernel: audit: type=1300 audit(1742237555.442:435): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffffb0871e0 a2=3 a3=0 items=0 ppid=1 pid=5771 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:35.506764 kernel: audit: type=1327 audit(1742237555.442:435): proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:35.442000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:35.469000 audit[5771]: USER_START pid=5771 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.526849 kernel: audit: type=1105 audit(1742237555.469:436): pid=5771 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.526927 kernel: audit: type=1103 audit(1742237555.488:437): pid=5776 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.488000 audit[5776]: CRED_ACQ pid=5776 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.948560 sshd[5771]: pam_unix(sshd:session): session closed for user core Mar 17 18:52:35.948000 audit[5771]: USER_END pid=5771 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.958072 systemd[1]: sshd@9-10.200.8.36:22-10.200.16.10:40160.service: Deactivated successfully. Mar 17 18:52:35.959028 systemd[1]: session-12.scope: Deactivated successfully. Mar 17 18:52:35.960433 systemd-logind[1500]: Session 12 logged out. Waiting for processes to exit. Mar 17 18:52:35.961384 systemd-logind[1500]: Removed session 12. 
Mar 17 18:52:35.969687 kernel: audit: type=1106 audit(1742237555.948:438): pid=5771 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.949000 audit[5771]: CRED_DISP pid=5771 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:35.956000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.200.8.36:22-10.200.16.10:40160 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:35.984768 kernel: audit: type=1104 audit(1742237555.949:439): pid=5771 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:41.073473 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:52:41.073642 kernel: audit: type=1130 audit(1742237561.051:441): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.200.8.36:22-10.200.16.10:37220 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:41.051000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.200.8.36:22-10.200.16.10:37220 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:41.052497 systemd[1]: Started sshd@10-10.200.8.36:22-10.200.16.10:37220.service. Mar 17 18:52:41.674000 audit[5787]: USER_ACCT pid=5787 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:41.676835 sshd[5787]: Accepted publickey for core from 10.200.16.10 port 37220 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:52:41.693694 kernel: audit: type=1101 audit(1742237561.674:442): pid=5787 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:41.692000 audit[5787]: CRED_ACQ pid=5787 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:41.694623 sshd[5787]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:52:41.699961 systemd[1]: Started session-13.scope. Mar 17 18:52:41.701084 systemd-logind[1500]: New session 13 of user core. 
Mar 17 18:52:41.720156 kernel: audit: type=1103 audit(1742237561.692:443): pid=5787 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:41.720261 kernel: audit: type=1006 audit(1742237561.692:444): pid=5787 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Mar 17 18:52:41.720292 kernel: audit: type=1300 audit(1742237561.692:444): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc802e3920 a2=3 a3=0 items=0 ppid=1 pid=5787 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:41.692000 audit[5787]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc802e3920 a2=3 a3=0 items=0 ppid=1 pid=5787 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:41.692000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:41.704000 audit[5787]: USER_START pid=5787 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:41.741693 kernel: audit: type=1327 audit(1742237561.692:444): proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:41.741743 kernel: audit: type=1105 audit(1742237561.704:445): pid=5787 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:41.706000 audit[5790]: CRED_ACQ pid=5790 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:41.757688 kernel: audit: type=1103 audit(1742237561.706:446): pid=5790 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:42.177181 sshd[5787]: pam_unix(sshd:session): session closed for user core Mar 17 18:52:42.177000 audit[5787]: USER_END pid=5787 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:42.181262 systemd-logind[1500]: Session 13 logged out. Waiting for processes to exit. Mar 17 18:52:42.183097 systemd[1]: sshd@10-10.200.8.36:22-10.200.16.10:37220.service: Deactivated successfully. Mar 17 18:52:42.184103 systemd[1]: session-13.scope: Deactivated successfully. Mar 17 18:52:42.186096 systemd-logind[1500]: Removed session 13. 
Mar 17 18:52:42.177000 audit[5787]: CRED_DISP pid=5787 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:42.211689 kernel: audit: type=1106 audit(1742237562.177:447): pid=5787 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:42.211788 kernel: audit: type=1104 audit(1742237562.177:448): pid=5787 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:42.181000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.200.8.36:22-10.200.16.10:37220 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:42.281014 systemd[1]: Started sshd@11-10.200.8.36:22-10.200.16.10:37232.service. Mar 17 18:52:42.279000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.200.8.36:22-10.200.16.10:37232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:42.900000 audit[5801]: USER_ACCT pid=5801 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:42.902550 sshd[5801]: Accepted publickey for core from 10.200.16.10 port 37232 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:52:42.902000 audit[5801]: CRED_ACQ pid=5801 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:42.902000 audit[5801]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff3063a200 a2=3 a3=0 items=0 ppid=1 pid=5801 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:42.902000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:42.904171 sshd[5801]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:52:42.909609 systemd[1]: Started session-14.scope. Mar 17 18:52:42.910422 systemd-logind[1500]: New session 14 of user core. 
Mar 17 18:52:42.914000 audit[5801]: USER_START pid=5801 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:42.916000 audit[5804]: CRED_ACQ pid=5804 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:43.488802 sshd[5801]: pam_unix(sshd:session): session closed for user core Mar 17 18:52:43.488000 audit[5801]: USER_END pid=5801 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:43.488000 audit[5801]: CRED_DISP pid=5801 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:43.492385 systemd-logind[1500]: Session 14 logged out. Waiting for processes to exit. Mar 17 18:52:43.492000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.200.8.36:22-10.200.16.10:37232 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:43.493503 systemd[1]: sshd@11-10.200.8.36:22-10.200.16.10:37232.service: Deactivated successfully. Mar 17 18:52:43.494588 systemd[1]: session-14.scope: Deactivated successfully. Mar 17 18:52:43.496520 systemd-logind[1500]: Removed session 14. Mar 17 18:52:43.593000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.200.8.36:22-10.200.16.10:37246 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:43.594764 systemd[1]: Started sshd@12-10.200.8.36:22-10.200.16.10:37246.service. 
Mar 17 18:52:44.224000 audit[5832]: USER_ACCT pid=5832 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:44.225336 sshd[5832]: Accepted publickey for core from 10.200.16.10 port 37246 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:52:44.226000 audit[5832]: CRED_ACQ pid=5832 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:44.226000 audit[5832]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeaa868950 a2=3 a3=0 items=0 ppid=1 pid=5832 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:44.226000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:44.227506 sshd[5832]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:52:44.232738 systemd-logind[1500]: New session 15 of user core. Mar 17 18:52:44.232999 systemd[1]: Started session-15.scope. Mar 17 18:52:44.239000 audit[5832]: USER_START pid=5832 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:44.240000 audit[5835]: CRED_ACQ pid=5835 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:44.722853 sshd[5832]: pam_unix(sshd:session): session closed for user core Mar 17 18:52:44.723000 audit[5832]: USER_END pid=5832 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:44.723000 audit[5832]: CRED_DISP pid=5832 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:44.726401 systemd[1]: sshd@12-10.200.8.36:22-10.200.16.10:37246.service: Deactivated successfully. Mar 17 18:52:44.726000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.200.8.36:22-10.200.16.10:37246 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:44.728841 systemd[1]: session-15.scope: Deactivated successfully. Mar 17 18:52:44.729714 systemd-logind[1500]: Session 15 logged out. Waiting for processes to exit. Mar 17 18:52:44.730702 systemd-logind[1500]: Removed session 15. 
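The same audit events show up twice in this excerpt: as kernel console lines ("audit: type=NNNN ...") and as journal entries with symbolic names (USER_ACCT, CRED_ACQ, USER_START, ...). Pairing them through the shared audit(time:serial) identifier, and falling back to the kernel's standard audit type numbering where only one copy is printed, gives a small lookup table; this is a reference sketch, not configuration taken from the host:

    # Numeric audit record types appearing in this log and their symbolic names.
    AUDIT_TYPES = {
        1006: "LOGIN",           # login uid/session assigned (old-auid -> auid)
        1101: "USER_ACCT",       # PAM accounting
        1103: "CRED_ACQ",        # PAM setcred, credentials acquired
        1104: "CRED_DISP",       # PAM setcred, credentials disposed
        1105: "USER_START",      # PAM session_open
        1106: "USER_END",        # PAM session_close
        1130: "SERVICE_START",   # systemd unit started
        1131: "SERVICE_STOP",    # systemd unit stopped (numeric form not printed above)
        1300: "SYSCALL",
        1325: "NETFILTER_CFG",   # iptables rule reloads, seen further below
        1327: "PROCTITLE",
    }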
Mar 17 18:52:49.848091 kernel: kauditd_printk_skb: 23 callbacks suppressed Mar 17 18:52:49.848277 kernel: audit: type=1130 audit(1742237569.826:468): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.200.8.36:22-10.200.16.10:58036 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:49.826000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.200.8.36:22-10.200.16.10:58036 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:49.827185 systemd[1]: Started sshd@13-10.200.8.36:22-10.200.16.10:58036.service. Mar 17 18:52:50.449000 audit[5851]: USER_ACCT pid=5851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.469560 sshd[5851]: Accepted publickey for core from 10.200.16.10 port 58036 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:52:50.469973 kernel: audit: type=1101 audit(1742237570.449:469): pid=5851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.468000 audit[5851]: CRED_ACQ pid=5851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.470083 sshd[5851]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:52:50.488278 systemd[1]: Started session-16.scope. Mar 17 18:52:50.489389 systemd-logind[1500]: New session 16 of user core. 
Mar 17 18:52:50.498421 kernel: audit: type=1103 audit(1742237570.468:470): pid=5851 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.498586 kernel: audit: type=1006 audit(1742237570.468:471): pid=5851 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Mar 17 18:52:50.468000 audit[5851]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdbb2e3cf0 a2=3 a3=0 items=0 ppid=1 pid=5851 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:50.523210 kernel: audit: type=1300 audit(1742237570.468:471): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdbb2e3cf0 a2=3 a3=0 items=0 ppid=1 pid=5851 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:50.523314 kernel: audit: type=1327 audit(1742237570.468:471): proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:50.468000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:50.498000 audit[5851]: USER_START pid=5851 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.544785 kernel: audit: type=1105 audit(1742237570.498:472): pid=5851 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.544904 kernel: audit: type=1103 audit(1742237570.501:473): pid=5854 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.501000 audit[5854]: CRED_ACQ pid=5854 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.969459 sshd[5851]: pam_unix(sshd:session): session closed for user core Mar 17 18:52:50.970000 audit[5851]: USER_END pid=5851 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.978139 systemd[1]: sshd@13-10.200.8.36:22-10.200.16.10:58036.service: Deactivated successfully. Mar 17 18:52:50.978937 systemd[1]: session-16.scope: Deactivated successfully. Mar 17 18:52:50.980413 systemd-logind[1500]: Session 16 logged out. Waiting for processes to exit. Mar 17 18:52:50.981336 systemd-logind[1500]: Removed session 16. 
Mar 17 18:52:50.970000 audit[5851]: CRED_DISP pid=5851 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:51.003327 kernel: audit: type=1106 audit(1742237570.970:474): pid=5851 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:51.003404 kernel: audit: type=1104 audit(1742237570.970:475): pid=5851 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:50.975000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.200.8.36:22-10.200.16.10:58036 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:55.760298 systemd[1]: run-containerd-runc-k8s.io-2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29-runc.wQCSlx.mount: Deactivated successfully. Mar 17 18:52:56.085024 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:52:56.085161 kernel: audit: type=1130 audit(1742237576.072:477): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.200.8.36:22-10.200.16.10:58048 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:56.072000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.200.8.36:22-10.200.16.10:58048 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:52:56.073320 systemd[1]: Started sshd@14-10.200.8.36:22-10.200.16.10:58048.service. Mar 17 18:52:56.696000 audit[5881]: USER_ACCT pid=5881 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:56.697000 audit[5881]: CRED_ACQ pid=5881 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:56.698315 sshd[5881]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:52:56.715045 sshd[5881]: Accepted publickey for core from 10.200.16.10 port 58048 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:52:56.720418 systemd[1]: Started session-17.scope. Mar 17 18:52:56.721692 systemd-logind[1500]: New session 17 of user core. 
Mar 17 18:52:56.730856 kernel: audit: type=1101 audit(1742237576.696:478): pid=5881 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:56.730965 kernel: audit: type=1103 audit(1742237576.697:479): pid=5881 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:56.741682 kernel: audit: type=1006 audit(1742237576.697:480): pid=5881 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Mar 17 18:52:56.697000 audit[5881]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe97e59110 a2=3 a3=0 items=0 ppid=1 pid=5881 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:56.697000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:56.758688 kernel: audit: type=1300 audit(1742237576.697:480): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe97e59110 a2=3 a3=0 items=0 ppid=1 pid=5881 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:52:56.758739 kernel: audit: type=1327 audit(1742237576.697:480): proctitle=737368643A20636F7265205B707269765D Mar 17 18:52:56.726000 audit[5881]: USER_START pid=5881 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:56.764723 kernel: audit: type=1105 audit(1742237576.726:481): pid=5881 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:56.733000 audit[5884]: CRED_ACQ pid=5884 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:56.794965 kernel: audit: type=1103 audit(1742237576.733:482): pid=5884 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:57.197620 sshd[5881]: pam_unix(sshd:session): session closed for user core Mar 17 18:52:57.198000 audit[5881]: USER_END pid=5881 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:57.201072 systemd[1]: sshd@14-10.200.8.36:22-10.200.16.10:58048.service: Deactivated successfully. 
Mar 17 18:52:57.202037 systemd[1]: session-17.scope: Deactivated successfully. Mar 17 18:52:57.209087 systemd-logind[1500]: Session 17 logged out. Waiting for processes to exit. Mar 17 18:52:57.210057 systemd-logind[1500]: Removed session 17. Mar 17 18:52:57.198000 audit[5881]: CRED_DISP pid=5881 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:57.231007 kernel: audit: type=1106 audit(1742237577.198:483): pid=5881 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:57.231097 kernel: audit: type=1104 audit(1742237577.198:484): pid=5881 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:52:57.198000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.200.8.36:22-10.200.16.10:58048 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:02.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.200.8.36:22-10.200.16.10:48430 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:02.302284 systemd[1]: Started sshd@15-10.200.8.36:22-10.200.16.10:48430.service. Mar 17 18:53:02.307148 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:53:02.307231 kernel: audit: type=1130 audit(1742237582.301:486): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.200.8.36:22-10.200.16.10:48430 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:02.933000 audit[5894]: USER_ACCT pid=5894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:02.934687 sshd[5894]: Accepted publickey for core from 10.200.16.10 port 48430 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:02.950000 audit[5894]: CRED_ACQ pid=5894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:02.951487 sshd[5894]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:02.957150 systemd[1]: Started session-18.scope. Mar 17 18:53:02.958315 systemd-logind[1500]: New session 18 of user core. 
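Each "Accepted publickey ... RSA SHA256:Id7fTtJ..." line identifies the client key by its OpenSSH fingerprint: the SHA-256 digest of the raw key blob, base64-encoded with the trailing padding removed. A generic sketch of how such a fingerprint is computed from a public key line (assuming the usual "ssh-rsa AAAA... comment" layout; this is not a utility from this host):

    import base64, hashlib

    def openssh_fingerprint(pubkey_line: str) -> str:
        """SHA256 fingerprint of an OpenSSH public key line, in the form sshd logs."""
        blob = base64.b64decode(pubkey_line.split()[1])   # field 2 is the base64 key blob
        digest = hashlib.sha256(blob).digest()
        return "SHA256:" + base64.b64encode(digest).decode().rstrip("=")

ssh-keygen -lf on the corresponding key file prints the same value, which is how the fingerprint in these lines can be matched back to an authorized_keys entry.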
Mar 17 18:53:02.966897 kernel: audit: type=1101 audit(1742237582.933:487): pid=5894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:02.966992 kernel: audit: type=1103 audit(1742237582.950:488): pid=5894 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:02.982688 kernel: audit: type=1006 audit(1742237582.950:489): pid=5894 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Mar 17 18:53:02.982793 kernel: audit: type=1300 audit(1742237582.950:489): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcede43340 a2=3 a3=0 items=0 ppid=1 pid=5894 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:02.950000 audit[5894]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcede43340 a2=3 a3=0 items=0 ppid=1 pid=5894 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:02.950000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:02.962000 audit[5894]: USER_START pid=5894 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:03.015921 kernel: audit: type=1327 audit(1742237582.950:489): proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:03.016036 kernel: audit: type=1105 audit(1742237582.962:490): pid=5894 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:03.016066 kernel: audit: type=1103 audit(1742237582.967:491): pid=5897 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:02.967000 audit[5897]: CRED_ACQ pid=5897 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:03.436236 sshd[5894]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:03.437000 audit[5894]: USER_END pid=5894 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:03.443433 systemd[1]: sshd@15-10.200.8.36:22-10.200.16.10:48430.service: Deactivated successfully. 
Mar 17 18:53:03.444472 systemd[1]: session-18.scope: Deactivated successfully. Mar 17 18:53:03.445976 systemd-logind[1500]: Session 18 logged out. Waiting for processes to exit. Mar 17 18:53:03.446907 systemd-logind[1500]: Removed session 18. Mar 17 18:53:03.437000 audit[5894]: CRED_DISP pid=5894 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:03.470618 kernel: audit: type=1106 audit(1742237583.437:492): pid=5894 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:03.470762 kernel: audit: type=1104 audit(1742237583.437:493): pid=5894 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:03.443000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.200.8.36:22-10.200.16.10:48430 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:03.539757 systemd[1]: Started sshd@16-10.200.8.36:22-10.200.16.10:48436.service. Mar 17 18:53:03.539000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.200.8.36:22-10.200.16.10:48436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:04.161000 audit[5907]: USER_ACCT pid=5907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:04.162427 sshd[5907]: Accepted publickey for core from 10.200.16.10 port 48436 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:04.163000 audit[5907]: CRED_ACQ pid=5907 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:04.163000 audit[5907]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffd3cc0eab0 a2=3 a3=0 items=0 ppid=1 pid=5907 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:04.163000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:04.165060 sshd[5907]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:04.170252 systemd[1]: Started session-19.scope. Mar 17 18:53:04.170509 systemd-logind[1500]: New session 19 of user core. 
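Every connection in this excerpt follows the same shape: systemd starts a per-connection sshd@... unit, systemd-logind reports "New session N of user core", and shortly afterwards the session is removed and the unit deactivated. A rough sketch for pairing those logind lines and measuring session lifetimes; the regexes, the assumed year (the syslog prefix carries none), and the one-entry-per-line assumption are all mine:

    import re
    from datetime import datetime

    STAMP = r"(?P<ts>\w{3}\s+\d{1,2} \d{2}:\d{2}:\d{2}\.\d{6})"
    NEW  = re.compile(STAMP + r".*systemd-logind\[\d+\]: New session (?P<sid>\d+) of user")
    GONE = re.compile(STAMP + r".*systemd-logind\[\d+\]: Removed session (?P<sid>\d+)\.")

    def session_durations(lines, year=2025):
        """Map session id -> lifetime by pairing 'New session' with 'Removed session'."""
        parse = lambda ts: datetime.strptime(f"{year} {ts}", "%Y %b %d %H:%M:%S.%f")
        opened, closed = {}, {}
        for line in lines:
            if m := NEW.search(line):
                opened[m["sid"]] = parse(m["ts"])
            elif m := GONE.search(line):
                closed[m["sid"]] = parse(m["ts"])
        return {sid: closed[sid] - opened[sid] for sid in opened.keys() & closed.keys()}

Session 11 above, for instance, is opened at 18:52:29.239 and removed at 18:52:29.722, roughly half a second later.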
Mar 17 18:53:04.175000 audit[5907]: USER_START pid=5907 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:04.177000 audit[5910]: CRED_ACQ pid=5910 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:04.728759 sshd[5907]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:04.729000 audit[5907]: USER_END pid=5907 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:04.730000 audit[5907]: CRED_DISP pid=5907 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:04.732520 systemd[1]: sshd@16-10.200.8.36:22-10.200.16.10:48436.service: Deactivated successfully. Mar 17 18:53:04.733652 systemd[1]: session-19.scope: Deactivated successfully. Mar 17 18:53:04.732000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.200.8.36:22-10.200.16.10:48436 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:04.735219 systemd-logind[1500]: Session 19 logged out. Waiting for processes to exit. Mar 17 18:53:04.736264 systemd-logind[1500]: Removed session 19. Mar 17 18:53:04.833243 systemd[1]: Started sshd@17-10.200.8.36:22-10.200.16.10:48452.service. Mar 17 18:53:04.832000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.200.8.36:22-10.200.16.10:48452 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:53:05.475000 audit[5917]: USER_ACCT pid=5917 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:05.476000 audit[5917]: CRED_ACQ pid=5917 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:05.476000 audit[5917]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb2c59590 a2=3 a3=0 items=0 ppid=1 pid=5917 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:05.476000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:05.477617 sshd[5917]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:05.480904 sshd[5917]: Accepted publickey for core from 10.200.16.10 port 48452 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:05.482760 systemd-logind[1500]: New session 20 of user core. Mar 17 18:53:05.483124 systemd[1]: Started session-20.scope. Mar 17 18:53:05.488000 audit[5917]: USER_START pid=5917 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:05.490000 audit[5923]: CRED_ACQ pid=5923 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:08.065743 kernel: kauditd_printk_skb: 20 callbacks suppressed Mar 17 18:53:08.065922 kernel: audit: type=1325 audit(1742237588.056:510): table=filter:118 family=2 entries=20 op=nft_register_rule pid=5933 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:08.056000 audit[5933]: NETFILTER_CFG table=filter:118 family=2 entries=20 op=nft_register_rule pid=5933 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:08.056000 audit[5933]: SYSCALL arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffc59773de0 a2=0 a3=7ffc59773dcc items=0 ppid=2885 pid=5933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:08.091946 kernel: audit: type=1300 audit(1742237588.056:510): arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffc59773de0 a2=0 a3=7ffc59773dcc items=0 ppid=2885 pid=5933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:08.056000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:08.101206 kernel: audit: type=1327 audit(1742237588.056:510): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 
18:53:08.112000 audit[5933]: NETFILTER_CFG table=nat:119 family=2 entries=22 op=nft_register_rule pid=5933 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:08.112000 audit[5933]: SYSCALL arch=c000003e syscall=46 success=yes exit=6540 a0=3 a1=7ffc59773de0 a2=0 a3=0 items=0 ppid=2885 pid=5933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:08.144807 kernel: audit: type=1325 audit(1742237588.112:511): table=nat:119 family=2 entries=22 op=nft_register_rule pid=5933 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:08.144948 kernel: audit: type=1300 audit(1742237588.112:511): arch=c000003e syscall=46 success=yes exit=6540 a0=3 a1=7ffc59773de0 a2=0 a3=0 items=0 ppid=2885 pid=5933 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:08.144995 kernel: audit: type=1327 audit(1742237588.112:511): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:08.112000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:08.125000 audit[5935]: NETFILTER_CFG table=filter:120 family=2 entries=32 op=nft_register_rule pid=5935 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:08.163610 kernel: audit: type=1325 audit(1742237588.125:512): table=filter:120 family=2 entries=32 op=nft_register_rule pid=5935 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:08.163733 kernel: audit: type=1300 audit(1742237588.125:512): arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffda809a290 a2=0 a3=7ffda809a27c items=0 ppid=2885 pid=5935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:08.125000 audit[5935]: SYSCALL arch=c000003e syscall=46 success=yes exit=11860 a0=3 a1=7ffda809a290 a2=0 a3=7ffda809a27c items=0 ppid=2885 pid=5935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:08.189772 kernel: audit: type=1327 audit(1742237588.125:512): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:08.189886 kernel: audit: type=1325 audit(1742237588.156:513): table=nat:121 family=2 entries=22 op=nft_register_rule pid=5935 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:08.125000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:08.156000 audit[5935]: NETFILTER_CFG table=nat:121 family=2 entries=22 op=nft_register_rule pid=5935 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:08.156000 audit[5935]: SYSCALL arch=c000003e syscall=46 success=yes exit=6540 a0=3 a1=7ffda809a290 a2=0 a3=0 items=0 ppid=2885 pid=5935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:08.156000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:08.209195 sshd[5917]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:08.210000 audit[5917]: USER_END pid=5917 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:08.210000 audit[5917]: CRED_DISP pid=5917 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:08.212976 systemd[1]: sshd@17-10.200.8.36:22-10.200.16.10:48452.service: Deactivated successfully. Mar 17 18:53:08.214067 systemd[1]: session-20.scope: Deactivated successfully. Mar 17 18:53:08.215133 systemd-logind[1500]: Session 20 logged out. Waiting for processes to exit. Mar 17 18:53:08.216189 systemd-logind[1500]: Removed session 20. Mar 17 18:53:08.212000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.200.8.36:22-10.200.16.10:48452 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:08.312000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.200.8.36:22-10.200.16.10:48456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:08.312999 systemd[1]: Started sshd@18-10.200.8.36:22-10.200.16.10:48456.service. Mar 17 18:53:08.934000 audit[5938]: USER_ACCT pid=5938 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:08.935096 sshd[5938]: Accepted publickey for core from 10.200.16.10 port 48456 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:08.935000 audit[5938]: CRED_ACQ pid=5938 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:08.935000 audit[5938]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffe2f6af840 a2=3 a3=0 items=0 ppid=1 pid=5938 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:08.935000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:08.937055 sshd[5938]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:08.943152 systemd[1]: Started session-21.scope. Mar 17 18:53:08.943768 systemd-logind[1500]: New session 21 of user core. 
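The comm="iptables-restor" in the NETFILTER_CFG records above is not a typo in the tooling: the kernel's comm field holds at most 15 characters, so "iptables-restore" is cut short, while the full command line survives in the accompanying PROCTITLE record. Decoding that hex value with the same scheme as before (NUL bytes separate the arguments) recovers the invocation; a short sketch using the value copied from above:

    raw = bytes.fromhex(
        "69707461626C65732D726573746F7265002D770035002D5700313030303030"
        "002D2D6E6F666C757368002D2D636F756E74657273"
    )
    print([arg.decode() for arg in raw.split(b"\x00") if arg])
    # ['iptables-restore', '-w', '5', '-W', '100000', '--noflush', '--counters']

So the filter and nat rules registered at 18:53:08 came from iptables-restore running with --noflush and --counters, that is, incremental updates to the existing tables rather than a full reload.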
Mar 17 18:53:08.950000 audit[5938]: USER_START pid=5938 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:08.953000 audit[5944]: CRED_ACQ pid=5944 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:09.584911 sshd[5938]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:09.585000 audit[5938]: USER_END pid=5938 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:09.586000 audit[5938]: CRED_DISP pid=5938 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:09.588652 systemd[1]: sshd@18-10.200.8.36:22-10.200.16.10:48456.service: Deactivated successfully. Mar 17 18:53:09.588000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.200.8.36:22-10.200.16.10:48456 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:09.590803 systemd[1]: session-21.scope: Deactivated successfully. Mar 17 18:53:09.591530 systemd-logind[1500]: Session 21 logged out. Waiting for processes to exit. Mar 17 18:53:09.593384 systemd-logind[1500]: Removed session 21. Mar 17 18:53:09.689626 systemd[1]: Started sshd@19-10.200.8.36:22-10.200.16.10:52594.service. Mar 17 18:53:09.689000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.200.8.36:22-10.200.16.10:52594 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:53:10.310000 audit[5970]: USER_ACCT pid=5970 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:10.311590 sshd[5970]: Accepted publickey for core from 10.200.16.10 port 52594 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:10.311000 audit[5970]: CRED_ACQ pid=5970 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:10.312000 audit[5970]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fffaba378f0 a2=3 a3=0 items=0 ppid=1 pid=5970 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:10.312000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:10.313010 sshd[5970]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:10.318511 systemd-logind[1500]: New session 22 of user core. Mar 17 18:53:10.318740 systemd[1]: Started session-22.scope. Mar 17 18:53:10.323000 audit[5970]: USER_START pid=5970 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:10.325000 audit[5973]: CRED_ACQ pid=5973 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:10.819067 sshd[5970]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:10.819000 audit[5970]: USER_END pid=5970 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:10.819000 audit[5970]: CRED_DISP pid=5970 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:10.822123 systemd[1]: sshd@19-10.200.8.36:22-10.200.16.10:52594.service: Deactivated successfully. Mar 17 18:53:10.821000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.200.8.36:22-10.200.16.10:52594 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:10.823571 systemd[1]: session-22.scope: Deactivated successfully. Mar 17 18:53:10.824417 systemd-logind[1500]: Session 22 logged out. Waiting for processes to exit. Mar 17 18:53:10.826290 systemd-logind[1500]: Removed session 22. 
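
Sessions 20 through 22 above all follow the same pattern: USER_ACCT and CRED_ACQ for the accepted public key, pam_unix opening the session, systemd starting session-N.scope, then USER_END, CRED_DISP and SERVICE_STOP when the connection closes. To summarise which sessions open and close in an excerpt like this, a small pairing script could be used (an illustrative sketch; the regular expressions only assume the systemd-logind message formats visible in this log):

    import re

    # Match systemd-logind's open/close messages as they appear above.
    OPEN = re.compile(r"New session (\d+) of user (\w+)")
    CLOSE = re.compile(r"Removed session (\d+)\.")

    def session_summary(log_text: str) -> dict:
        sessions = {}
        for m in OPEN.finditer(log_text):
            sessions[m.group(1)] = {"user": m.group(2), "closed": False}
        for m in CLOSE.finditer(log_text):
            sessions.setdefault(m.group(1), {"user": None})["closed"] = True
        return sessions

    # Usage (hypothetical file name):
    #   session_summary(open("journal-excerpt.txt").read())
    #   -> {'21': {'user': 'core', 'closed': True}, '22': {'user': 'core', 'closed': True}, ...}

Matching on the logind messages rather than on timestamps keeps the pairing simple, since the entries in this excerpt are not strictly in timestamp order (the kernel audit lines are flushed separately).
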
Mar 17 18:53:14.815707 kernel: kauditd_printk_skb: 27 callbacks suppressed Mar 17 18:53:14.815858 kernel: audit: type=1325 audit(1742237594.809:535): table=filter:122 family=2 entries=20 op=nft_register_rule pid=6006 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:14.809000 audit[6006]: NETFILTER_CFG table=filter:122 family=2 entries=20 op=nft_register_rule pid=6006 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:14.809000 audit[6006]: SYSCALL arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7fff41ed6d20 a2=0 a3=7fff41ed6d0c items=0 ppid=2885 pid=6006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:14.842780 kernel: audit: type=1300 audit(1742237594.809:535): arch=c000003e syscall=46 success=yes exit=2932 a0=3 a1=7fff41ed6d20 a2=0 a3=7fff41ed6d0c items=0 ppid=2885 pid=6006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:14.809000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:14.852203 kernel: audit: type=1327 audit(1742237594.809:535): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:14.824000 audit[6006]: NETFILTER_CFG table=nat:123 family=2 entries=106 op=nft_register_chain pid=6006 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:14.861722 kernel: audit: type=1325 audit(1742237594.824:536): table=nat:123 family=2 entries=106 op=nft_register_chain pid=6006 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Mar 17 18:53:14.869691 kernel: audit: type=1300 audit(1742237594.824:536): arch=c000003e syscall=46 success=yes exit=49452 a0=3 a1=7fff41ed6d20 a2=0 a3=7fff41ed6d0c items=0 ppid=2885 pid=6006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:14.824000 audit[6006]: SYSCALL arch=c000003e syscall=46 success=yes exit=49452 a0=3 a1=7fff41ed6d20 a2=0 a3=7fff41ed6d0c items=0 ppid=2885 pid=6006 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:14.824000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:14.880687 kernel: audit: type=1327 audit(1742237594.824:536): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Mar 17 18:53:15.923400 systemd[1]: Started sshd@20-10.200.8.36:22-10.200.16.10:52606.service. Mar 17 18:53:15.923000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.200.8.36:22-10.200.16.10:52606 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:53:15.941702 kernel: audit: type=1130 audit(1742237595.923:537): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.200.8.36:22-10.200.16.10:52606 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:16.545000 audit[6008]: USER_ACCT pid=6008 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:16.546378 sshd[6008]: Accepted publickey for core from 10.200.16.10 port 52606 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:16.563698 kernel: audit: type=1101 audit(1742237596.545:538): pid=6008 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:16.580974 kernel: audit: type=1103 audit(1742237596.563:539): pid=6008 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:16.563000 audit[6008]: CRED_ACQ pid=6008 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:16.569463 systemd[1]: Started session-23.scope. Mar 17 18:53:16.564329 sshd[6008]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:16.569967 systemd-logind[1500]: New session 23 of user core. 
Mar 17 18:53:16.563000 audit[6008]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffdbfb5dd00 a2=3 a3=0 items=0 ppid=1 pid=6008 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:16.591777 kernel: audit: type=1006 audit(1742237596.563:540): pid=6008 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Mar 17 18:53:16.563000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:16.581000 audit[6008]: USER_START pid=6008 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:16.581000 audit[6011]: CRED_ACQ pid=6011 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:17.056462 sshd[6008]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:17.056000 audit[6008]: USER_END pid=6008 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:17.057000 audit[6008]: CRED_DISP pid=6008 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:17.059000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.200.8.36:22-10.200.16.10:52606 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:17.059573 systemd[1]: sshd@20-10.200.8.36:22-10.200.16.10:52606.service: Deactivated successfully. Mar 17 18:53:17.061918 systemd[1]: session-23.scope: Deactivated successfully. Mar 17 18:53:17.062451 systemd-logind[1500]: Session 23 logged out. Waiting for processes to exit. Mar 17 18:53:17.063584 systemd-logind[1500]: Removed session 23. Mar 17 18:53:22.182829 kernel: kauditd_printk_skb: 7 callbacks suppressed Mar 17 18:53:22.182999 kernel: audit: type=1130 audit(1742237602.162:546): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.200.8.36:22-10.200.16.10:42938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:22.162000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.200.8.36:22-10.200.16.10:42938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:22.162744 systemd[1]: Started sshd@21-10.200.8.36:22-10.200.16.10:42938.service. 
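
The NETFILTER_CFG records earlier in this excerpt (the nat:119, filter:120/122 and nat:121/123 tables) report which table was rewritten, the address family (family=2 is AF_INET), how many entries were registered and the nftables operation. Pulling those fields out of text like this could be done with a short extractor (again an illustrative sketch, not a tool the log refers to):

    import re

    # Fields of interest in the NETFILTER_CFG audit records seen above.
    NETFILTER = re.compile(
        r"NETFILTER_CFG table=(?P<table>\S+) family=(?P<family>\d+) "
        r"entries=(?P<entries>\d+) op=(?P<op>\S+)"
    )

    def netfilter_events(log_text: str) -> list:
        events = []
        for m in NETFILTER.finditer(log_text):
            event = m.groupdict()
            event["entries"] = int(event["entries"])
            events.append(event)
        return events

    # Against the records above this yields entries such as
    # {'table': 'nat:119', 'family': '2', 'entries': 22, 'op': 'nft_register_rule'}.

Keying on the literal NETFILTER_CFG keyword also avoids double-counting the kernel "audit: type=1325" duplicates of the same records, which carry the fields but not that keyword.
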
Mar 17 18:53:22.787000 audit[6021]: USER_ACCT pid=6021 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:22.788920 sshd[6021]: Accepted publickey for core from 10.200.16.10 port 42938 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:22.806693 kernel: audit: type=1101 audit(1742237602.787:547): pid=6021 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:22.806000 audit[6021]: CRED_ACQ pid=6021 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:22.807507 sshd[6021]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:22.812810 systemd[1]: Started session-24.scope. Mar 17 18:53:22.813938 systemd-logind[1500]: New session 24 of user core. Mar 17 18:53:22.824690 kernel: audit: type=1103 audit(1742237602.806:548): pid=6021 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:22.806000 audit[6021]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeae3d9cc0 a2=3 a3=0 items=0 ppid=1 pid=6021 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:22.835810 kernel: audit: type=1006 audit(1742237602.806:549): pid=6021 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Mar 17 18:53:22.835850 kernel: audit: type=1300 audit(1742237602.806:549): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffeae3d9cc0 a2=3 a3=0 items=0 ppid=1 pid=6021 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:22.806000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:22.851692 kernel: audit: type=1327 audit(1742237602.806:549): proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:22.817000 audit[6021]: USER_START pid=6021 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:22.874272 kernel: audit: type=1105 audit(1742237602.817:550): pid=6021 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:22.874733 kernel: audit: type=1103 audit(1742237602.825:551): pid=6024 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:22.825000 audit[6024]: CRED_ACQ pid=6024 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:23.294940 sshd[6021]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:23.295000 audit[6021]: USER_END pid=6021 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:23.303972 systemd[1]: sshd@21-10.200.8.36:22-10.200.16.10:42938.service: Deactivated successfully. Mar 17 18:53:23.305543 systemd[1]: session-24.scope: Deactivated successfully. Mar 17 18:53:23.306725 systemd-logind[1500]: Session 24 logged out. Waiting for processes to exit. Mar 17 18:53:23.307724 systemd-logind[1500]: Removed session 24. Mar 17 18:53:23.296000 audit[6021]: CRED_DISP pid=6021 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:23.329382 kernel: audit: type=1106 audit(1742237603.295:552): pid=6021 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:23.329477 kernel: audit: type=1104 audit(1742237603.296:553): pid=6021 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:23.303000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.200.8.36:22-10.200.16.10:42938 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:25.762424 systemd[1]: run-containerd-runc-k8s.io-2eaa9c6a3598f1feed380b184f5535f8c24ec0ccd2d85b9d6b9c2aa6f66cde29-runc.Np7O9U.mount: Deactivated successfully. Mar 17 18:53:28.403750 systemd[1]: Started sshd@22-10.200.8.36:22-10.200.16.10:60472.service. Mar 17 18:53:28.404000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.200.8.36:22-10.200.16.10:60472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:28.409481 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:53:28.409562 kernel: audit: type=1130 audit(1742237608.404:555): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.200.8.36:22-10.200.16.10:60472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:53:29.027000 audit[6061]: USER_ACCT pid=6061 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.045794 kernel: audit: type=1101 audit(1742237609.027:556): pid=6061 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.030301 sshd[6061]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:29.046203 sshd[6061]: Accepted publickey for core from 10.200.16.10 port 60472 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:29.029000 audit[6061]: CRED_ACQ pid=6061 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.051430 systemd[1]: Started session-25.scope. Mar 17 18:53:29.051949 systemd-logind[1500]: New session 25 of user core. Mar 17 18:53:29.076234 kernel: audit: type=1103 audit(1742237609.029:557): pid=6061 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.076364 kernel: audit: type=1006 audit(1742237609.029:558): pid=6061 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Mar 17 18:53:29.076398 kernel: audit: type=1300 audit(1742237609.029:558): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc824b0ae0 a2=3 a3=0 items=0 ppid=1 pid=6061 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:29.029000 audit[6061]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc824b0ae0 a2=3 a3=0 items=0 ppid=1 pid=6061 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:29.029000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:29.096862 kernel: audit: type=1327 audit(1742237609.029:558): proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:29.062000 audit[6061]: USER_START pid=6061 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.114399 kernel: audit: type=1105 audit(1742237609.062:559): pid=6061 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.114520 kernel: audit: type=1103 audit(1742237609.064:560): pid=6064 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.064000 audit[6064]: CRED_ACQ pid=6064 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.544625 sshd[6061]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:29.545000 audit[6061]: USER_END pid=6061 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.548635 systemd-logind[1500]: Session 25 logged out. Waiting for processes to exit. Mar 17 18:53:29.550174 systemd[1]: sshd@22-10.200.8.36:22-10.200.16.10:60472.service: Deactivated successfully. Mar 17 18:53:29.551174 systemd[1]: session-25.scope: Deactivated successfully. Mar 17 18:53:29.552813 systemd-logind[1500]: Removed session 25. Mar 17 18:53:29.545000 audit[6061]: CRED_DISP pid=6061 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.579147 kernel: audit: type=1106 audit(1742237609.545:561): pid=6061 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.579269 kernel: audit: type=1104 audit(1742237609.545:562): pid=6061 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:29.545000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.200.8.36:22-10.200.16.10:60472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:34.649135 systemd[1]: Started sshd@23-10.200.8.36:22-10.200.16.10:60476.service. Mar 17 18:53:34.668699 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:53:34.668841 kernel: audit: type=1130 audit(1742237614.649:564): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.200.8.36:22-10.200.16.10:60476 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:34.649000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.200.8.36:22-10.200.16.10:60476 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:53:35.276000 audit[6077]: USER_ACCT pid=6077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.279173 sshd[6077]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:35.295559 sshd[6077]: Accepted publickey for core from 10.200.16.10 port 60476 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:35.295804 kernel: audit: type=1101 audit(1742237615.276:565): pid=6077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.278000 audit[6077]: CRED_ACQ pid=6077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.300884 systemd[1]: Started session-26.scope. Mar 17 18:53:35.301719 systemd-logind[1500]: New session 26 of user core. Mar 17 18:53:35.312729 kernel: audit: type=1103 audit(1742237615.278:566): pid=6077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.324690 kernel: audit: type=1006 audit(1742237615.278:567): pid=6077 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Mar 17 18:53:35.324775 kernel: audit: type=1300 audit(1742237615.278:567): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7599faf0 a2=3 a3=0 items=0 ppid=1 pid=6077 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:35.278000 audit[6077]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7fff7599faf0 a2=3 a3=0 items=0 ppid=1 pid=6077 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:35.340319 kernel: audit: type=1327 audit(1742237615.278:567): proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:35.278000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:35.306000 audit[6077]: USER_START pid=6077 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.360222 kernel: audit: type=1105 audit(1742237615.306:568): pid=6077 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.360330 kernel: audit: type=1103 audit(1742237615.313:569): pid=6080 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.313000 audit[6080]: CRED_ACQ pid=6080 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.785854 sshd[6077]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:35.786000 audit[6077]: USER_END pid=6077 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.790435 systemd-logind[1500]: Session 26 logged out. Waiting for processes to exit. Mar 17 18:53:35.791971 systemd[1]: sshd@23-10.200.8.36:22-10.200.16.10:60476.service: Deactivated successfully. Mar 17 18:53:35.793288 systemd[1]: session-26.scope: Deactivated successfully. Mar 17 18:53:35.794580 systemd-logind[1500]: Removed session 26. Mar 17 18:53:35.786000 audit[6077]: CRED_DISP pid=6077 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.819588 kernel: audit: type=1106 audit(1742237615.786:570): pid=6077 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.819786 kernel: audit: type=1104 audit(1742237615.786:571): pid=6077 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:35.791000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.200.8.36:22-10.200.16.10:60476 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:40.890000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.200.8.36:22-10.200.16.10:39174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:40.890631 systemd[1]: Started sshd@24-10.200.8.36:22-10.200.16.10:39174.service. Mar 17 18:53:40.895690 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:53:40.895765 kernel: audit: type=1130 audit(1742237620.890:573): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.200.8.36:22-10.200.16.10:39174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:53:41.513000 audit[6093]: USER_ACCT pid=6093 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:41.533312 sshd[6093]: Accepted publickey for core from 10.200.16.10 port 39174 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:41.533694 kernel: audit: type=1101 audit(1742237621.513:574): pid=6093 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:41.533824 sshd[6093]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:41.532000 audit[6093]: CRED_ACQ pid=6093 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:41.540475 systemd[1]: Started session-27.scope. Mar 17 18:53:41.541102 systemd-logind[1500]: New session 27 of user core. Mar 17 18:53:41.560849 kernel: audit: type=1103 audit(1742237621.532:575): pid=6093 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:41.560934 kernel: audit: type=1006 audit(1742237621.532:576): pid=6093 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Mar 17 18:53:41.560958 kernel: audit: type=1300 audit(1742237621.532:576): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc9b573320 a2=3 a3=0 items=0 ppid=1 pid=6093 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:41.532000 audit[6093]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffc9b573320 a2=3 a3=0 items=0 ppid=1 pid=6093 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:41.532000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:41.581625 kernel: audit: type=1327 audit(1742237621.532:576): proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:41.541000 audit[6093]: USER_START pid=6093 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:41.546000 audit[6096]: CRED_ACQ pid=6096 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:41.612595 kernel: audit: type=1105 audit(1742237621.541:577): pid=6093 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:41.612730 kernel: audit: type=1103 audit(1742237621.546:578): pid=6096 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:42.009938 sshd[6093]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:42.009000 audit[6093]: USER_END pid=6093 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:42.019374 systemd[1]: sshd@24-10.200.8.36:22-10.200.16.10:39174.service: Deactivated successfully. Mar 17 18:53:42.020630 systemd[1]: session-27.scope: Deactivated successfully. Mar 17 18:53:42.021836 systemd-logind[1500]: Session 27 logged out. Waiting for processes to exit. Mar 17 18:53:42.022783 systemd-logind[1500]: Removed session 27. Mar 17 18:53:42.009000 audit[6093]: CRED_DISP pid=6093 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:42.044404 kernel: audit: type=1106 audit(1742237622.009:579): pid=6093 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:42.044490 kernel: audit: type=1104 audit(1742237622.009:580): pid=6093 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:42.015000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.200.8.36:22-10.200.16.10:39174 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:43.293890 systemd[1]: run-containerd-runc-k8s.io-7724dc134a0c6248f84c12c30eed51e14a73f0ffa47fe0423eec6266d70ad547-runc.bwHmgK.mount: Deactivated successfully. Mar 17 18:53:47.115390 systemd[1]: Started sshd@25-10.200.8.36:22-10.200.16.10:39184.service. Mar 17 18:53:47.137780 kernel: kauditd_printk_skb: 1 callbacks suppressed Mar 17 18:53:47.137920 kernel: audit: type=1130 audit(1742237627.114:582): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.200.8.36:22-10.200.16.10:39184 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Mar 17 18:53:47.114000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.200.8.36:22-10.200.16.10:39184 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Mar 17 18:53:47.736000 audit[6128]: USER_ACCT pid=6128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:47.756849 kernel: audit: type=1101 audit(1742237627.736:583): pid=6128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:47.739844 sshd[6128]: pam_unix(sshd:session): session opened for user core(uid=500) by (uid=0) Mar 17 18:53:47.757275 sshd[6128]: Accepted publickey for core from 10.200.16.10 port 39184 ssh2: RSA SHA256:Id7fTtJmja0nOLdf0IQA3jnxxJrUKKdGU1UW83zjTQg Mar 17 18:53:47.737000 audit[6128]: CRED_ACQ pid=6128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:47.761994 systemd[1]: Started session-28.scope. Mar 17 18:53:47.763138 systemd-logind[1500]: New session 28 of user core. Mar 17 18:53:47.774810 kernel: audit: type=1103 audit(1742237627.737:584): pid=6128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:47.774918 kernel: audit: type=1006 audit(1742237627.737:585): pid=6128 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1 Mar 17 18:53:47.737000 audit[6128]: SYSCALL arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb3d25ca0 a2=3 a3=0 items=0 ppid=1 pid=6128 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:47.799883 kernel: audit: type=1300 audit(1742237627.737:585): arch=c000003e syscall=1 success=yes exit=3 a0=5 a1=7ffcb3d25ca0 a2=3 a3=0 items=0 ppid=1 pid=6128 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd" exe="/usr/sbin/sshd" subj=system_u:system_r:kernel_t:s0 key=(null) Mar 17 18:53:47.737000 audit: PROCTITLE proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:47.805201 kernel: audit: type=1327 audit(1742237627.737:585): proctitle=737368643A20636F7265205B707269765D Mar 17 18:53:47.773000 audit[6128]: USER_START pid=6128 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:47.832853 kernel: audit: type=1105 audit(1742237627.773:586): pid=6128 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:47.832966 kernel: audit: type=1103 audit(1742237627.775:587): pid=6131 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:47.775000 audit[6131]: CRED_ACQ pid=6131 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:48.234546 sshd[6128]: pam_unix(sshd:session): session closed for user core Mar 17 18:53:48.235000 audit[6128]: USER_END pid=6128 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:48.242971 systemd[1]: sshd@25-10.200.8.36:22-10.200.16.10:39184.service: Deactivated successfully. Mar 17 18:53:48.243860 systemd[1]: session-28.scope: Deactivated successfully. Mar 17 18:53:48.245378 systemd-logind[1500]: Session 28 logged out. Waiting for processes to exit. Mar 17 18:53:48.246233 systemd-logind[1500]: Removed session 28. Mar 17 18:53:48.235000 audit[6128]: CRED_DISP pid=6128 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:48.268766 kernel: audit: type=1106 audit(1742237628.235:588): pid=6128 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:48.268847 kernel: audit: type=1104 audit(1742237628.235:589): pid=6128 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/sbin/sshd" hostname=10.200.16.10 addr=10.200.16.10 terminal=ssh res=success' Mar 17 18:53:48.241000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-10.200.8.36:22-10.200.16.10:39184 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
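
For reference when reading the kernel "audit: type=NNNN" duplicates above, the numeric types appearing in this excerpt map to the named records as follows. Most pairs can be read off directly from the matching audit(timestamp:serial) IDs in the log itself; the values also agree with the standard Linux audit record types. The dict below is only a convenience for annotating excerpts like this one:

    # Audit record types seen in this excerpt (numeric type -> record name).
    AUDIT_TYPES = {
        1006: "LOGIN",          # auid/ses assignment for the new SSH sessions
        1101: "USER_ACCT",
        1103: "CRED_ACQ",
        1104: "CRED_DISP",
        1105: "USER_START",
        1106: "USER_END",
        1130: "SERVICE_START",
        1131: "SERVICE_STOP",   # standard value; no kernel duplicate appears in this excerpt
        1300: "SYSCALL",
        1325: "NETFILTER_CFG",
        1327: "PROCTITLE",
    }

With this mapping, each kauditd line can be matched to the corresponding named record above by its audit ID and read without consulting the kernel headers.
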