Jan 14 00:06:32.423882 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Jan 14 00:06:32.423905 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Jan 13 22:00:26 -00 2026 Jan 14 00:06:32.423916 kernel: KASLR enabled Jan 14 00:06:32.423922 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Jan 14 00:06:32.423927 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218 Jan 14 00:06:32.423933 kernel: random: crng init done Jan 14 00:06:32.423940 kernel: secureboot: Secure boot disabled Jan 14 00:06:32.423946 kernel: ACPI: Early table checksum verification disabled Jan 14 00:06:32.423952 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) Jan 14 00:06:32.423960 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) Jan 14 00:06:32.423966 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:06:32.423972 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:06:32.423978 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:06:32.423984 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:06:32.423993 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:06:32.423999 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:06:32.424006 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:06:32.424013 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:06:32.424019 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 14 00:06:32.424026 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) Jan 14 00:06:32.424032 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Jan 14 00:06:32.424038 kernel: ACPI: Use ACPI SPCR as default console: Yes Jan 14 00:06:32.424045 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] Jan 14 00:06:32.424053 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff] Jan 14 00:06:32.424059 kernel: Zone ranges: Jan 14 00:06:32.424066 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Jan 14 00:06:32.424072 kernel: DMA32 empty Jan 14 00:06:32.424078 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] Jan 14 00:06:32.424085 kernel: Device empty Jan 14 00:06:32.424091 kernel: Movable zone start for each node Jan 14 00:06:32.424097 kernel: Early memory node ranges Jan 14 00:06:32.424104 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff] Jan 14 00:06:32.424110 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff] Jan 14 00:06:32.424117 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff] Jan 14 00:06:32.424123 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] Jan 14 00:06:32.424131 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] Jan 14 00:06:32.424137 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] Jan 14 00:06:32.424144 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] Jan 14 00:06:32.424150 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] Jan 14 00:06:32.424157 kernel: node 0: [mem 
0x0000000139fe0000-0x0000000139ffffff] Jan 14 00:06:32.424166 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff] Jan 14 00:06:32.424174 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Jan 14 00:06:32.424181 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1 Jan 14 00:06:32.424188 kernel: psci: probing for conduit method from ACPI. Jan 14 00:06:32.424195 kernel: psci: PSCIv1.1 detected in firmware. Jan 14 00:06:32.424202 kernel: psci: Using standard PSCI v0.2 function IDs Jan 14 00:06:32.424208 kernel: psci: Trusted OS migration not required Jan 14 00:06:32.424215 kernel: psci: SMC Calling Convention v1.1 Jan 14 00:06:32.424222 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Jan 14 00:06:32.424230 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Jan 14 00:06:32.424237 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Jan 14 00:06:32.424254 kernel: pcpu-alloc: [0] 0 [0] 1 Jan 14 00:06:32.424261 kernel: Detected PIPT I-cache on CPU0 Jan 14 00:06:32.424268 kernel: CPU features: detected: GIC system register CPU interface Jan 14 00:06:32.424275 kernel: CPU features: detected: Spectre-v4 Jan 14 00:06:32.424282 kernel: CPU features: detected: Spectre-BHB Jan 14 00:06:32.424289 kernel: CPU features: kernel page table isolation forced ON by KASLR Jan 14 00:06:32.424295 kernel: CPU features: detected: Kernel page table isolation (KPTI) Jan 14 00:06:32.424302 kernel: CPU features: detected: ARM erratum 1418040 Jan 14 00:06:32.424309 kernel: CPU features: detected: SSBS not fully self-synchronizing Jan 14 00:06:32.424318 kernel: alternatives: applying boot alternatives Jan 14 00:06:32.424326 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=3d3f73de8d2693594dfefd279d2c8d77c282a05a4cbc54177503d31784261f6b Jan 14 00:06:32.424333 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 14 00:06:32.424340 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 14 00:06:32.424346 kernel: Fallback order for Node 0: 0 Jan 14 00:06:32.424353 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000 Jan 14 00:06:32.424360 kernel: Policy zone: Normal Jan 14 00:06:32.424367 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 14 00:06:32.424374 kernel: software IO TLB: area num 2. Jan 14 00:06:32.424380 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB) Jan 14 00:06:32.424389 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 14 00:06:32.424396 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 14 00:06:32.424403 kernel: rcu: RCU event tracing is enabled. Jan 14 00:06:32.424411 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 14 00:06:32.424417 kernel: Trampoline variant of Tasks RCU enabled. Jan 14 00:06:32.424424 kernel: Tracing variant of Tasks RCU enabled. Jan 14 00:06:32.424431 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 14 00:06:32.424438 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 14 00:06:32.424445 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jan 14 00:06:32.424452 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 14 00:06:32.424459 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Jan 14 00:06:32.424467 kernel: GICv3: 256 SPIs implemented Jan 14 00:06:32.424474 kernel: GICv3: 0 Extended SPIs implemented Jan 14 00:06:32.424480 kernel: Root IRQ handler: gic_handle_irq Jan 14 00:06:32.424487 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Jan 14 00:06:32.424494 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Jan 14 00:06:32.424501 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Jan 14 00:06:32.424508 kernel: ITS [mem 0x08080000-0x0809ffff] Jan 14 00:06:32.424515 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1) Jan 14 00:06:32.424560 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1) Jan 14 00:06:32.424567 kernel: GICv3: using LPI property table @0x0000000100120000 Jan 14 00:06:32.424574 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000 Jan 14 00:06:32.424583 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 14 00:06:32.424590 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 14 00:06:32.424597 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Jan 14 00:06:32.424604 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Jan 14 00:06:32.424611 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Jan 14 00:06:32.424618 kernel: Console: colour dummy device 80x25 Jan 14 00:06:32.424625 kernel: ACPI: Core revision 20240827 Jan 14 00:06:32.424633 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Jan 14 00:06:32.424640 kernel: pid_max: default: 32768 minimum: 301 Jan 14 00:06:32.424649 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 14 00:06:32.424657 kernel: landlock: Up and running. Jan 14 00:06:32.424664 kernel: SELinux: Initializing. Jan 14 00:06:32.424672 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 14 00:06:32.424679 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 14 00:06:32.424686 kernel: rcu: Hierarchical SRCU implementation. Jan 14 00:06:32.424694 kernel: rcu: Max phase no-delay instances is 400. Jan 14 00:06:32.424701 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 14 00:06:32.424711 kernel: Remapping and enabling EFI services. Jan 14 00:06:32.424718 kernel: smp: Bringing up secondary CPUs ... Jan 14 00:06:32.424725 kernel: Detected PIPT I-cache on CPU1 Jan 14 00:06:32.424733 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Jan 14 00:06:32.424740 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000 Jan 14 00:06:32.424748 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 14 00:06:32.424755 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Jan 14 00:06:32.424763 kernel: smp: Brought up 1 node, 2 CPUs Jan 14 00:06:32.424771 kernel: SMP: Total of 2 processors activated. 
Jan 14 00:06:32.424783 kernel: CPU: All CPU(s) started at EL1 Jan 14 00:06:32.424791 kernel: CPU features: detected: 32-bit EL0 Support Jan 14 00:06:32.424799 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Jan 14 00:06:32.424807 kernel: CPU features: detected: Common not Private translations Jan 14 00:06:32.424815 kernel: CPU features: detected: CRC32 instructions Jan 14 00:06:32.424823 kernel: CPU features: detected: Enhanced Virtualization Traps Jan 14 00:06:32.424832 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Jan 14 00:06:32.424840 kernel: CPU features: detected: LSE atomic instructions Jan 14 00:06:32.424847 kernel: CPU features: detected: Privileged Access Never Jan 14 00:06:32.424855 kernel: CPU features: detected: RAS Extension Support Jan 14 00:06:32.424862 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Jan 14 00:06:32.424872 kernel: alternatives: applying system-wide alternatives Jan 14 00:06:32.424880 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1 Jan 14 00:06:32.424888 kernel: Memory: 3885924K/4096000K available (11200K kernel code, 2458K rwdata, 9088K rodata, 12480K init, 1038K bss, 188596K reserved, 16384K cma-reserved) Jan 14 00:06:32.424895 kernel: devtmpfs: initialized Jan 14 00:06:32.424903 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 14 00:06:32.424911 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 14 00:06:32.424918 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Jan 14 00:06:32.424926 kernel: 0 pages in range for non-PLT usage Jan 14 00:06:32.424935 kernel: 515168 pages in range for PLT usage Jan 14 00:06:32.424942 kernel: pinctrl core: initialized pinctrl subsystem Jan 14 00:06:32.424950 kernel: SMBIOS 3.0.0 present. Jan 14 00:06:32.424957 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 Jan 14 00:06:32.424964 kernel: DMI: Memory slots populated: 1/1 Jan 14 00:06:32.424972 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 14 00:06:32.424980 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Jan 14 00:06:32.424989 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Jan 14 00:06:32.424996 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Jan 14 00:06:32.425004 kernel: audit: initializing netlink subsys (disabled) Jan 14 00:06:32.425011 kernel: audit: type=2000 audit(0.015:1): state=initialized audit_enabled=0 res=1 Jan 14 00:06:32.425019 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 14 00:06:32.425026 kernel: cpuidle: using governor menu Jan 14 00:06:32.425034 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Jan 14 00:06:32.425043 kernel: ASID allocator initialised with 32768 entries Jan 14 00:06:32.425051 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 14 00:06:32.425059 kernel: Serial: AMBA PL011 UART driver Jan 14 00:06:32.425066 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 14 00:06:32.425074 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Jan 14 00:06:32.425081 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Jan 14 00:06:32.425089 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Jan 14 00:06:32.425098 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 14 00:06:32.425105 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Jan 14 00:06:32.425113 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Jan 14 00:06:32.425120 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Jan 14 00:06:32.425128 kernel: ACPI: Added _OSI(Module Device) Jan 14 00:06:32.425135 kernel: ACPI: Added _OSI(Processor Device) Jan 14 00:06:32.425143 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 14 00:06:32.425151 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 14 00:06:32.425159 kernel: ACPI: Interpreter enabled Jan 14 00:06:32.425166 kernel: ACPI: Using GIC for interrupt routing Jan 14 00:06:32.425174 kernel: ACPI: MCFG table detected, 1 entries Jan 14 00:06:32.425181 kernel: ACPI: CPU0 has been hot-added Jan 14 00:06:32.425188 kernel: ACPI: CPU1 has been hot-added Jan 14 00:06:32.425196 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Jan 14 00:06:32.425204 kernel: printk: legacy console [ttyAMA0] enabled Jan 14 00:06:32.425212 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 14 00:06:32.425404 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 14 00:06:32.425494 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Jan 14 00:06:32.425621 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Jan 14 00:06:32.425721 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Jan 14 00:06:32.425805 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Jan 14 00:06:32.425820 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Jan 14 00:06:32.425828 kernel: PCI host bridge to bus 0000:00 Jan 14 00:06:32.425915 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Jan 14 00:06:32.425989 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Jan 14 00:06:32.426061 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Jan 14 00:06:32.426134 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 14 00:06:32.426229 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Jan 14 00:06:32.426341 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint Jan 14 00:06:32.426433 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff] Jan 14 00:06:32.426515 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref] Jan 14 00:06:32.426618 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:06:32.426703 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff] Jan 14 00:06:32.426785 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 14 
00:06:32.426864 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 14 00:06:32.426944 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Jan 14 00:06:32.427030 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:06:32.427110 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff] Jan 14 00:06:32.427192 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 14 00:06:32.427365 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff] Jan 14 00:06:32.427461 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:06:32.427567 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff] Jan 14 00:06:32.427653 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 14 00:06:32.427738 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff] Jan 14 00:06:32.427818 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Jan 14 00:06:32.427910 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:06:32.427990 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff] Jan 14 00:06:32.428072 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 14 00:06:32.428151 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff] Jan 14 00:06:32.430629 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Jan 14 00:06:32.430762 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:06:32.430847 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff] Jan 14 00:06:32.430928 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 14 00:06:32.431079 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Jan 14 00:06:32.431170 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Jan 14 00:06:32.431289 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:06:32.431379 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff] Jan 14 00:06:32.431462 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 14 00:06:32.431570 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff] Jan 14 00:06:32.431655 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Jan 14 00:06:32.431743 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:06:32.431828 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff] Jan 14 00:06:32.431993 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 14 00:06:32.432105 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff] Jan 14 00:06:32.432195 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref] Jan 14 00:06:32.432302 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:06:32.432387 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff] Jan 14 00:06:32.432574 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 14 00:06:32.434303 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff] Jan 14 00:06:32.434421 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 14 00:06:32.434506 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff] Jan 14 00:06:32.434620 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 14 00:06:32.434713 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff] Jan 14 00:06:32.434841 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 
0x070002 conventional PCI endpoint Jan 14 00:06:32.434942 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007] Jan 14 00:06:32.435057 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 14 00:06:32.435200 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff] Jan 14 00:06:32.435320 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Jan 14 00:06:32.435441 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 14 00:06:32.436391 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jan 14 00:06:32.436501 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit] Jan 14 00:06:32.436623 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Jan 14 00:06:32.436727 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff] Jan 14 00:06:32.436835 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Jan 14 00:06:32.436933 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 14 00:06:32.437016 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Jan 14 00:06:32.437107 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 14 00:06:32.437192 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff] Jan 14 00:06:32.437308 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Jan 14 00:06:32.437404 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Jan 14 00:06:32.437973 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff] Jan 14 00:06:32.438128 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Jan 14 00:06:32.439268 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 14 00:06:32.439380 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff] Jan 14 00:06:32.440229 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref] Jan 14 00:06:32.440387 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 14 00:06:32.440479 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Jan 14 00:06:32.440867 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Jan 14 00:06:32.440972 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Jan 14 00:06:32.441644 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Jan 14 00:06:32.441798 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Jan 14 00:06:32.441903 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Jan 14 00:06:32.441990 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 14 00:06:32.442073 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Jan 14 00:06:32.442153 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Jan 14 00:06:32.442261 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 14 00:06:32.442366 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Jan 14 00:06:32.444669 kernel: pci 0000:00:02.3: bridge window [mem 
0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Jan 14 00:06:32.444770 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 14 00:06:32.444853 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Jan 14 00:06:32.444933 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Jan 14 00:06:32.445025 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 14 00:06:32.445106 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Jan 14 00:06:32.445185 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Jan 14 00:06:32.445317 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 14 00:06:32.445403 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 Jan 14 00:06:32.445579 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 Jan 14 00:06:32.445679 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 14 00:06:32.445763 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Jan 14 00:06:32.445851 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Jan 14 00:06:32.445937 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 14 00:06:32.446017 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Jan 14 00:06:32.446103 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Jan 14 00:06:32.446188 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 14 00:06:32.446289 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Jan 14 00:06:32.446411 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 14 00:06:32.446499 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 14 00:06:32.447031 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 14 00:06:32.447136 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 14 00:06:32.448780 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 14 00:06:32.448899 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 14 00:06:32.448985 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 14 00:06:32.449508 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 14 00:06:32.449633 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 14 00:06:32.449718 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 14 00:06:32.449809 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 14 00:06:32.449891 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 14 
00:06:32.449977 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 14 00:06:32.450058 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 14 00:06:32.450142 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 14 00:06:32.450224 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 14 00:06:32.450339 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned Jan 14 00:06:32.450423 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned Jan 14 00:06:32.450506 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned Jan 14 00:06:32.452048 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Jan 14 00:06:32.452144 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned Jan 14 00:06:32.452233 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Jan 14 00:06:32.452376 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned Jan 14 00:06:32.452460 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned Jan 14 00:06:32.452565 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned Jan 14 00:06:32.452670 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Jan 14 00:06:32.452776 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned Jan 14 00:06:32.452865 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Jan 14 00:06:32.452948 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned Jan 14 00:06:32.453028 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Jan 14 00:06:32.453111 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned Jan 14 00:06:32.453191 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Jan 14 00:06:32.453290 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned Jan 14 00:06:32.453375 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Jan 14 00:06:32.453459 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned Jan 14 00:06:32.455630 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned Jan 14 00:06:32.455744 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned Jan 14 00:06:32.455838 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 14 00:06:32.455993 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 14 00:06:32.456108 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 14 00:06:32.456204 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 14 00:06:32.456304 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Jan 14 00:06:32.456392 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] Jan 14 00:06:32.456473 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 14 00:06:32.456585 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 14 00:06:32.456670 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 14 00:06:32.456752 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Jan 14 00:06:32.456831 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] Jan 14 00:06:32.456911 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 14 00:06:32.456999 kernel: pci 0000:03:00.0: BAR 4 [mem 
0x8000400000-0x8000403fff 64bit pref]: assigned Jan 14 00:06:32.457084 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 14 00:06:32.457166 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 14 00:06:32.457299 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Jan 14 00:06:32.457405 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] Jan 14 00:06:32.457491 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 14 00:06:32.458287 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 14 00:06:32.458391 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 14 00:06:32.458475 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Jan 14 00:06:32.459648 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] Jan 14 00:06:32.459801 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 14 00:06:32.459904 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Jan 14 00:06:32.459997 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 14 00:06:32.460094 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 14 00:06:32.460183 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Jan 14 00:06:32.462481 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Jan 14 00:06:32.462604 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 14 00:06:32.462697 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 14 00:06:32.462781 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 14 00:06:32.462873 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 14 00:06:32.462967 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Jan 14 00:06:32.463049 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 14 00:06:32.463129 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 14 00:06:32.463216 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned Jan 14 00:06:32.463318 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned Jan 14 00:06:32.463405 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned Jan 14 00:06:32.463489 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 14 00:06:32.463624 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Jan 14 00:06:32.463711 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 14 00:06:32.463792 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 14 00:06:32.463880 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 14 00:06:32.463962 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Jan 14 00:06:32.464044 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 14 00:06:32.464125 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 14 00:06:32.464213 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 14 00:06:32.464309 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] Jan 14 00:06:32.464391 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Jan 14 00:06:32.464472 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 14 00:06:32.464573 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 14 00:06:32.464653 kernel: pci_bus 0000:00: 
resource 5 [io 0x0000-0xffff window] Jan 14 00:06:32.464728 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 14 00:06:32.464815 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Jan 14 00:06:32.464892 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 14 00:06:32.464966 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 14 00:06:32.465049 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Jan 14 00:06:32.465127 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 14 00:06:32.465201 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 14 00:06:32.465334 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Jan 14 00:06:32.465416 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 14 00:06:32.465492 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 14 00:06:32.469690 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Jan 14 00:06:32.469795 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 14 00:06:32.469873 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 14 00:06:32.469957 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Jan 14 00:06:32.470033 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 14 00:06:32.470108 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 14 00:06:32.470202 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Jan 14 00:06:32.470301 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 14 00:06:32.470379 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 14 00:06:32.470463 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Jan 14 00:06:32.470555 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 14 00:06:32.470632 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 14 00:06:32.470725 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Jan 14 00:06:32.470800 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 14 00:06:32.470874 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 14 00:06:32.470997 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Jan 14 00:06:32.471075 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 14 00:06:32.471155 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 14 00:06:32.471165 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 14 00:06:32.471174 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 14 00:06:32.471182 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 14 00:06:32.471190 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 14 00:06:32.471198 kernel: iommu: Default domain type: Translated Jan 14 00:06:32.471207 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 14 00:06:32.471216 kernel: efivars: Registered efivars operations Jan 14 00:06:32.471225 kernel: vgaarb: loaded Jan 14 00:06:32.471233 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 14 00:06:32.471241 kernel: VFS: Disk quotas dquot_6.6.0 Jan 14 00:06:32.471452 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 14 00:06:32.471461 kernel: pnp: PnP ACPI init Jan 14 00:06:32.471624 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 14 
00:06:32.471643 kernel: pnp: PnP ACPI: found 1 devices Jan 14 00:06:32.471652 kernel: NET: Registered PF_INET protocol family Jan 14 00:06:32.471661 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 14 00:06:32.471670 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 14 00:06:32.471679 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 14 00:06:32.471688 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 14 00:06:32.471697 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 14 00:06:32.471708 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 14 00:06:32.471716 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 14 00:06:32.471725 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 14 00:06:32.471736 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 14 00:06:32.471912 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 14 00:06:32.471928 kernel: PCI: CLS 0 bytes, default 64 Jan 14 00:06:32.471937 kernel: kvm [1]: HYP mode not available Jan 14 00:06:32.471950 kernel: Initialise system trusted keyrings Jan 14 00:06:32.471959 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 14 00:06:32.471968 kernel: Key type asymmetric registered Jan 14 00:06:32.471977 kernel: Asymmetric key parser 'x509' registered Jan 14 00:06:32.471985 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 14 00:06:32.471994 kernel: io scheduler mq-deadline registered Jan 14 00:06:32.472004 kernel: io scheduler kyber registered Jan 14 00:06:32.472014 kernel: io scheduler bfq registered Jan 14 00:06:32.472023 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 14 00:06:32.472123 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Jan 14 00:06:32.472215 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Jan 14 00:06:32.472360 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 00:06:32.472459 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Jan 14 00:06:32.472632 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Jan 14 00:06:32.472785 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 00:06:32.472879 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Jan 14 00:06:32.472961 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Jan 14 00:06:32.473042 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 00:06:32.473133 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Jan 14 00:06:32.473215 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Jan 14 00:06:32.473318 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 00:06:32.473406 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Jan 14 00:06:32.473487 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Jan 14 00:06:32.473599 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 00:06:32.473685 kernel: pcieport 
0000:00:02.5: PME: Signaling with IRQ 55 Jan 14 00:06:32.473766 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Jan 14 00:06:32.473850 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 00:06:32.473935 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Jan 14 00:06:32.474081 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Jan 14 00:06:32.474165 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 00:06:32.474286 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Jan 14 00:06:32.474373 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Jan 14 00:06:32.474454 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 00:06:32.474470 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 14 00:06:32.474587 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Jan 14 00:06:32.474675 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Jan 14 00:06:32.474758 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 14 00:06:32.474769 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 14 00:06:32.474777 kernel: ACPI: button: Power Button [PWRB] Jan 14 00:06:32.474789 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 14 00:06:32.474876 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 14 00:06:32.474964 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Jan 14 00:06:32.474976 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 14 00:06:32.474985 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 14 00:06:32.475072 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Jan 14 00:06:32.475083 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Jan 14 00:06:32.475093 kernel: thunder_xcv, ver 1.0 Jan 14 00:06:32.475102 kernel: thunder_bgx, ver 1.0 Jan 14 00:06:32.475110 kernel: nicpf, ver 1.0 Jan 14 00:06:32.475118 kernel: nicvf, ver 1.0 Jan 14 00:06:32.475220 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 14 00:06:32.475347 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-14T00:06:31 UTC (1768349191) Jan 14 00:06:32.475363 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 14 00:06:32.475372 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 14 00:06:32.475380 kernel: watchdog: NMI not fully supported Jan 14 00:06:32.475389 kernel: watchdog: Hard watchdog permanently disabled Jan 14 00:06:32.475397 kernel: NET: Registered PF_INET6 protocol family Jan 14 00:06:32.475405 kernel: Segment Routing with IPv6 Jan 14 00:06:32.475414 kernel: In-situ OAM (IOAM) with IPv6 Jan 14 00:06:32.475422 kernel: NET: Registered PF_PACKET protocol family Jan 14 00:06:32.475431 kernel: Key type dns_resolver registered Jan 14 00:06:32.475440 kernel: registered taskstats version 1 Jan 14 00:06:32.475448 kernel: Loading compiled-in X.509 certificates Jan 14 00:06:32.475458 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: d16d100cda59d8093883df975a5384fda36b7d35' Jan 14 00:06:32.475466 kernel: Demotion targets for Node 0: null Jan 14 00:06:32.475474 kernel: Key type .fscrypt 
registered Jan 14 00:06:32.475482 kernel: Key type fscrypt-provisioning registered Jan 14 00:06:32.475491 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 14 00:06:32.475500 kernel: ima: Allocated hash algorithm: sha1 Jan 14 00:06:32.475508 kernel: ima: No architecture policies found Jan 14 00:06:32.475527 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 14 00:06:32.475537 kernel: clk: Disabling unused clocks Jan 14 00:06:32.475545 kernel: PM: genpd: Disabling unused power domains Jan 14 00:06:32.475554 kernel: Freeing unused kernel memory: 12480K Jan 14 00:06:32.475563 kernel: Run /init as init process Jan 14 00:06:32.475572 kernel: with arguments: Jan 14 00:06:32.475580 kernel: /init Jan 14 00:06:32.475587 kernel: with environment: Jan 14 00:06:32.475595 kernel: HOME=/ Jan 14 00:06:32.475603 kernel: TERM=linux Jan 14 00:06:32.475611 kernel: ACPI: bus type USB registered Jan 14 00:06:32.475621 kernel: usbcore: registered new interface driver usbfs Jan 14 00:06:32.475629 kernel: usbcore: registered new interface driver hub Jan 14 00:06:32.475637 kernel: usbcore: registered new device driver usb Jan 14 00:06:32.475738 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 14 00:06:32.475823 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 14 00:06:32.475906 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 14 00:06:32.475988 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 14 00:06:32.476072 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 14 00:06:32.476155 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 14 00:06:32.476281 kernel: hub 1-0:1.0: USB hub found Jan 14 00:06:32.476375 kernel: hub 1-0:1.0: 4 ports detected Jan 14 00:06:32.476482 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 14 00:06:32.476625 kernel: hub 2-0:1.0: USB hub found Jan 14 00:06:32.476718 kernel: hub 2-0:1.0: 4 ports detected Jan 14 00:06:32.476742 kernel: SCSI subsystem initialized Jan 14 00:06:32.476845 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jan 14 00:06:32.476946 kernel: scsi host0: Virtio SCSI HBA Jan 14 00:06:32.477052 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Jan 14 00:06:32.477153 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jan 14 00:06:32.477279 kernel: sr 0:0:0:0: Power-on or device reset occurred Jan 14 00:06:32.477383 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Jan 14 00:06:32.477393 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 14 00:06:32.477480 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jan 14 00:06:32.477607 kernel: sd 0:0:0:1: Power-on or device reset occurred Jan 14 00:06:32.477703 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jan 14 00:06:32.477794 kernel: sd 0:0:0:1: [sda] Write Protect is off Jan 14 00:06:32.477882 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Jan 14 00:06:32.477971 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jan 14 00:06:32.477982 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 14 00:06:32.477993 kernel: GPT:25804799 != 80003071 Jan 14 00:06:32.478001 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 14 00:06:32.478010 kernel: GPT:25804799 != 80003071 Jan 14 00:06:32.478018 kernel: GPT: Use GNU Parted to correct GPT errors. 
Jan 14 00:06:32.478026 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 14 00:06:32.478115 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Jan 14 00:06:32.478126 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 14 00:06:32.478136 kernel: device-mapper: uevent: version 1.0.3 Jan 14 00:06:32.478144 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 14 00:06:32.478152 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 14 00:06:32.478160 kernel: raid6: neonx8 gen() 15348 MB/s Jan 14 00:06:32.478169 kernel: raid6: neonx4 gen() 11178 MB/s Jan 14 00:06:32.478177 kernel: raid6: neonx2 gen() 13076 MB/s Jan 14 00:06:32.478185 kernel: raid6: neonx1 gen() 10337 MB/s Jan 14 00:06:32.478194 kernel: raid6: int64x8 gen() 6741 MB/s Jan 14 00:06:32.478202 kernel: raid6: int64x4 gen() 7303 MB/s Jan 14 00:06:32.478210 kernel: raid6: int64x2 gen() 6057 MB/s Jan 14 00:06:32.478337 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 14 00:06:32.478350 kernel: raid6: int64x1 gen() 4907 MB/s Jan 14 00:06:32.478359 kernel: raid6: using algorithm neonx8 gen() 15348 MB/s Jan 14 00:06:32.478367 kernel: raid6: .... xor() 11861 MB/s, rmw enabled Jan 14 00:06:32.478377 kernel: raid6: using neon recovery algorithm Jan 14 00:06:32.478386 kernel: xor: measuring software checksum speed Jan 14 00:06:32.478393 kernel: 8regs : 19918 MB/sec Jan 14 00:06:32.478401 kernel: 32regs : 21653 MB/sec Jan 14 00:06:32.478409 kernel: arm64_neon : 28205 MB/sec Jan 14 00:06:32.478417 kernel: xor: using function: arm64_neon (28205 MB/sec) Jan 14 00:06:32.478425 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 14 00:06:32.478435 kernel: BTRFS: device fsid 68b1ce8e-a637-4e91-acf8-5a2e05e289e5 devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (213) Jan 14 00:06:32.478444 kernel: BTRFS info (device dm-0): first mount of filesystem 68b1ce8e-a637-4e91-acf8-5a2e05e289e5 Jan 14 00:06:32.478452 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 14 00:06:32.478460 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 14 00:06:32.478469 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 14 00:06:32.478477 kernel: BTRFS info (device dm-0): enabling free space tree Jan 14 00:06:32.478485 kernel: loop: module loaded Jan 14 00:06:32.478495 kernel: loop0: detected capacity change from 0 to 91832 Jan 14 00:06:32.478503 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 14 00:06:32.478640 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 14 00:06:32.478654 systemd[1]: Successfully made /usr/ read-only. Jan 14 00:06:32.478665 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 00:06:32.478677 systemd[1]: Detected virtualization kvm. Jan 14 00:06:32.478686 systemd[1]: Detected architecture arm64. Jan 14 00:06:32.478695 systemd[1]: Running in initrd. Jan 14 00:06:32.478703 systemd[1]: No hostname configured, using default hostname. Jan 14 00:06:32.478712 systemd[1]: Hostname set to <localhost>. Jan 14 00:06:32.478721 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Jan 14 00:06:32.478730 systemd[1]: Queued start job for default target initrd.target. Jan 14 00:06:32.478740 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 00:06:32.478749 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 00:06:32.478758 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 00:06:32.478767 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 14 00:06:32.478776 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 00:06:32.478785 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 14 00:06:32.478795 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 14 00:06:32.478804 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 00:06:32.478813 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 14 00:06:32.478822 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 14 00:06:32.478831 systemd[1]: Reached target paths.target - Path Units. Jan 14 00:06:32.478840 systemd[1]: Reached target slices.target - Slice Units. Jan 14 00:06:32.478850 systemd[1]: Reached target swap.target - Swaps. Jan 14 00:06:32.478858 systemd[1]: Reached target timers.target - Timer Units. Jan 14 00:06:32.478867 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 14 00:06:32.478877 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 00:06:32.478886 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 00:06:32.478894 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 14 00:06:32.478903 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 14 00:06:32.478913 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 00:06:32.478922 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 00:06:32.478931 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 00:06:32.478940 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 00:06:32.478948 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 14 00:06:32.478957 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 14 00:06:32.478966 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 14 00:06:32.478976 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 14 00:06:32.478985 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 14 00:06:32.478993 systemd[1]: Starting systemd-fsck-usr.service... Jan 14 00:06:32.479002 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 00:06:32.479011 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 14 00:06:32.479021 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:06:32.479030 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. 
Jan 14 00:06:32.479039 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 00:06:32.479086 systemd-journald[349]: Collecting audit messages is enabled. Jan 14 00:06:32.479113 systemd[1]: Finished systemd-fsck-usr.service. Jan 14 00:06:32.479123 kernel: audit: type=1130 audit(1768349192.423:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.479132 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 14 00:06:32.479141 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 14 00:06:32.479151 kernel: Bridge firewalling registered Jan 14 00:06:32.479160 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:06:32.479169 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 14 00:06:32.479178 kernel: audit: type=1130 audit(1768349192.467:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.479187 kernel: audit: type=1130 audit(1768349192.470:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.479195 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 14 00:06:32.479206 kernel: audit: type=1130 audit(1768349192.473:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.479215 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 14 00:06:32.479225 systemd-journald[349]: Journal started Jan 14 00:06:32.479254 systemd-journald[349]: Runtime Journal (/run/log/journal/f1028fbb609b41beaa27b1d33249148f) is 8M, max 76.5M, 68.5M free. Jan 14 00:06:32.423000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.467000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.470000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.473000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.465812 systemd-modules-load[351]: Inserted module 'br_netfilter' Jan 14 00:06:32.483537 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 00:06:32.486734 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Jan 14 00:06:32.487000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.489574 systemd[1]: Started systemd-journald.service - Journal Service. Jan 14 00:06:32.489619 kernel: audit: type=1130 audit(1768349192.487:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.492871 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 14 00:06:32.505681 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 00:06:32.505000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.510047 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 14 00:06:32.511129 kernel: audit: type=1130 audit(1768349192.505:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.510000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.514461 systemd-tmpfiles[371]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 14 00:06:32.515676 kernel: audit: type=1130 audit(1768349192.510:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.515000 audit: BPF prog-id=6 op=LOAD Jan 14 00:06:32.518566 kernel: audit: type=1334 audit(1768349192.515:9): prog-id=6 op=LOAD Jan 14 00:06:32.517719 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 14 00:06:32.523667 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 00:06:32.523000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.528072 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 00:06:32.531473 kernel: audit: type=1130 audit(1768349192.523:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.529000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.544150 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
Jan 14 00:06:32.577419 dracut-cmdline[389]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=3d3f73de8d2693594dfefd279d2c8d77c282a05a4cbc54177503d31784261f6b Jan 14 00:06:32.592862 systemd-resolved[383]: Positive Trust Anchors: Jan 14 00:06:32.592892 systemd-resolved[383]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 00:06:32.592895 systemd-resolved[383]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 00:06:32.592927 systemd-resolved[383]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 00:06:32.628353 systemd-resolved[383]: Defaulting to hostname 'linux'. Jan 14 00:06:32.630480 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 00:06:32.630000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.631272 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 14 00:06:32.691563 kernel: Loading iSCSI transport class v2.0-870. Jan 14 00:06:32.701544 kernel: iscsi: registered transport (tcp) Jan 14 00:06:32.715561 kernel: iscsi: registered transport (qla4xxx) Jan 14 00:06:32.715625 kernel: QLogic iSCSI HBA Driver Jan 14 00:06:32.743503 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 00:06:32.774986 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 00:06:32.777000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.779041 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 14 00:06:32.831438 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 14 00:06:32.832000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.833393 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 14 00:06:32.834747 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 14 00:06:32.878766 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 14 00:06:32.878000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:06:32.879000 audit: BPF prog-id=7 op=LOAD Jan 14 00:06:32.879000 audit: BPF prog-id=8 op=LOAD Jan 14 00:06:32.880768 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 00:06:32.918025 systemd-udevd[621]: Using default interface naming scheme 'v257'. Jan 14 00:06:32.927835 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 00:06:32.928000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.932790 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 14 00:06:32.965770 dracut-pre-trigger[679]: rd.md=0: removing MD RAID activation Jan 14 00:06:32.989681 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 14 00:06:32.991000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:32.992000 audit: BPF prog-id=9 op=LOAD Jan 14 00:06:32.994021 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 14 00:06:33.003716 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 00:06:33.004000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:33.009900 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 14 00:06:33.051196 systemd-networkd[761]: lo: Link UP Jan 14 00:06:33.051890 systemd-networkd[761]: lo: Gained carrier Jan 14 00:06:33.052000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:33.052812 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 14 00:06:33.053515 systemd[1]: Reached target network.target - Network. Jan 14 00:06:33.085000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:33.085007 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 00:06:33.089032 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 14 00:06:33.215268 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jan 14 00:06:33.247053 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jan 14 00:06:33.264175 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jan 14 00:06:33.282937 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 14 00:06:33.282271 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 14 00:06:33.285065 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... 
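The dracut-cmdline hook above echoed the full kernel command line, including Flatcar-specific keys such as flatcar.oem.id=hetzner and flatcar.first_boot=detected. As a rough illustration of how boot-time tooling consumes those space-separated key=value parameters (this is not dracut's or Ignition's actual parser), a small Python sketch using values taken from the logged command line:

    # Rough sketch of key=value kernel-parameter parsing (illustration only).
    import shlex

    def parse_cmdline(cmdline: str) -> dict:
        params = {}
        for token in shlex.split(cmdline):        # honours quoted values
            key, sep, value = token.partition("=")
            params[key] = value if sep else True  # bare tokens act as flags
        return params

    sample = ("BOOT_IMAGE=/flatcar/vmlinuz-a root=LABEL=ROOT "
              "console=ttyAMA0,115200n8 flatcar.first_boot=detected "
              "flatcar.oem.id=hetzner acpi=force")
    params = parse_cmdline(sample)
    print(params["flatcar.oem.id"])      # hetzner
    print(params["flatcar.first_boot"])  # detected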
Jan 14 00:06:33.290531 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 14 00:06:33.290790 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 14 00:06:33.307130 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 00:06:33.308082 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:06:33.310000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:33.310737 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:06:33.314700 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:06:33.316661 systemd-networkd[761]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:06:33.316665 systemd-networkd[761]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 00:06:33.329555 disk-uuid[810]: Primary Header is updated. Jan 14 00:06:33.329555 disk-uuid[810]: Secondary Entries is updated. Jan 14 00:06:33.329555 disk-uuid[810]: Secondary Header is updated. Jan 14 00:06:33.320710 systemd-networkd[761]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:06:33.320714 systemd-networkd[761]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 00:06:33.322132 systemd-networkd[761]: eth1: Link UP Jan 14 00:06:33.322229 systemd-networkd[761]: eth1: Gained carrier Jan 14 00:06:33.322241 systemd-networkd[761]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:06:33.326030 systemd-networkd[761]: eth0: Link UP Jan 14 00:06:33.328793 systemd-networkd[761]: eth0: Gained carrier Jan 14 00:06:33.328807 systemd-networkd[761]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:06:33.357177 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 14 00:06:33.357389 kernel: usbcore: registered new interface driver usbhid Jan 14 00:06:33.357401 kernel: usbhid: USB HID core driver Jan 14 00:06:33.353000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:33.350235 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:06:33.368594 systemd-networkd[761]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 14 00:06:33.388649 systemd-networkd[761]: eth0: DHCPv4 address 46.224.77.139/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 14 00:06:33.420607 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 14 00:06:33.422000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:06:33.423426 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 00:06:33.424975 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 00:06:33.425643 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 00:06:33.428748 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 14 00:06:33.455353 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 14 00:06:33.456000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:34.371176 disk-uuid[812]: Warning: The kernel is still using the old partition table. Jan 14 00:06:34.371176 disk-uuid[812]: The new table will be used at the next reboot or after you Jan 14 00:06:34.371176 disk-uuid[812]: run partprobe(8) or kpartx(8) Jan 14 00:06:34.371176 disk-uuid[812]: The operation has completed successfully. Jan 14 00:06:34.380757 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 14 00:06:34.381000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:34.381000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:34.380889 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 14 00:06:34.382627 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 14 00:06:34.429596 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (840) Jan 14 00:06:34.432383 kernel: BTRFS info (device sda6): first mount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 14 00:06:34.432591 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jan 14 00:06:34.436567 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 14 00:06:34.436636 kernel: BTRFS info (device sda6): turning on async discard Jan 14 00:06:34.436658 kernel: BTRFS info (device sda6): enabling free space tree Jan 14 00:06:34.446066 kernel: BTRFS info (device sda6): last unmount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 14 00:06:34.446225 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 14 00:06:34.447000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:34.449697 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 14 00:06:34.490664 systemd-networkd[761]: eth1: Gained IPv6LL Jan 14 00:06:34.581700 ignition[859]: Ignition 2.24.0 Jan 14 00:06:34.581717 ignition[859]: Stage: fetch-offline Jan 14 00:06:34.581754 ignition[859]: no configs at "/usr/lib/ignition/base.d" Jan 14 00:06:34.581764 ignition[859]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 00:06:34.586715 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 00:06:34.587000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:06:34.581925 ignition[859]: parsed url from cmdline: "" Jan 14 00:06:34.589242 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 14 00:06:34.581928 ignition[859]: no config URL provided Jan 14 00:06:34.581936 ignition[859]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 00:06:34.581944 ignition[859]: no config at "/usr/lib/ignition/user.ign" Jan 14 00:06:34.581950 ignition[859]: failed to fetch config: resource requires networking Jan 14 00:06:34.582617 ignition[859]: Ignition finished successfully Jan 14 00:06:34.618794 ignition[868]: Ignition 2.24.0 Jan 14 00:06:34.618806 ignition[868]: Stage: fetch Jan 14 00:06:34.618950 ignition[868]: no configs at "/usr/lib/ignition/base.d" Jan 14 00:06:34.618958 ignition[868]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 00:06:34.619046 ignition[868]: parsed url from cmdline: "" Jan 14 00:06:34.619053 ignition[868]: no config URL provided Jan 14 00:06:34.619060 ignition[868]: reading system config file "/usr/lib/ignition/user.ign" Jan 14 00:06:34.619065 ignition[868]: no config at "/usr/lib/ignition/user.ign" Jan 14 00:06:34.619097 ignition[868]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jan 14 00:06:34.627994 ignition[868]: GET result: OK Jan 14 00:06:34.628112 ignition[868]: parsing config with SHA512: 95668b6f19ea8ff86ba482f07ad2394cc09d2ac7270575986e4110638b6a6cb429fc3a65b62ca70a4dd01b1e0e2409764c22e87659b040484007e9cffb54ca25 Jan 14 00:06:34.637334 unknown[868]: fetched base config from "system" Jan 14 00:06:34.637346 unknown[868]: fetched base config from "system" Jan 14 00:06:34.637727 ignition[868]: fetch: fetch complete Jan 14 00:06:34.637351 unknown[868]: fetched user config from "hetzner" Jan 14 00:06:34.637732 ignition[868]: fetch: fetch passed Jan 14 00:06:34.637775 ignition[868]: Ignition finished successfully Jan 14 00:06:34.645622 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 14 00:06:34.646000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:34.648321 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 14 00:06:34.675392 ignition[875]: Ignition 2.24.0 Jan 14 00:06:34.675407 ignition[875]: Stage: kargs Jan 14 00:06:34.675650 ignition[875]: no configs at "/usr/lib/ignition/base.d" Jan 14 00:06:34.675659 ignition[875]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 00:06:34.676468 ignition[875]: kargs: kargs passed Jan 14 00:06:34.676515 ignition[875]: Ignition finished successfully Jan 14 00:06:34.679013 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 14 00:06:34.679000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:34.681355 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
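On the Hetzner platform the fetch stage above pulls the user-provided config from the instance metadata service and logs a SHA512 digest of it before parsing. Below is a Python sketch of that observable behaviour only; Ignition itself is a Go binary, the URL is copied from the log, and the request succeeds only from inside a Hetzner instance.

    # Sketch of the fetch step logged above (not Ignition's implementation).
    import hashlib
    import urllib.request

    USERDATA_URL = "http://169.254.169.254/hetzner/v1/userdata"

    def fetch_userdata(url: str = USERDATA_URL) -> bytes:
        # Link-local metadata address; reachable only from the instance itself.
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read()

    data = fetch_userdata()
    print("parsing config with SHA512:", hashlib.sha512(data).hexdigest())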
Jan 14 00:06:34.682795 systemd-networkd[761]: eth0: Gained IPv6LL Jan 14 00:06:34.714285 ignition[881]: Ignition 2.24.0 Jan 14 00:06:34.714303 ignition[881]: Stage: disks Jan 14 00:06:34.714466 ignition[881]: no configs at "/usr/lib/ignition/base.d" Jan 14 00:06:34.714474 ignition[881]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 00:06:34.715299 ignition[881]: disks: disks passed Jan 14 00:06:34.715346 ignition[881]: Ignition finished successfully Jan 14 00:06:34.717000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:34.717409 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 14 00:06:34.718653 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 14 00:06:34.720658 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 14 00:06:34.721269 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 00:06:34.721844 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 00:06:34.723077 systemd[1]: Reached target basic.target - Basic System. Jan 14 00:06:34.725090 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 14 00:06:34.766153 systemd-fsck[889]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 14 00:06:34.769717 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 14 00:06:34.770000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:34.772981 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 14 00:06:34.854578 kernel: EXT4-fs (sda9): mounted filesystem db887ae3-d64c-46de-9f1e-de51a801ae44 r/w with ordered data mode. Quota mode: none. Jan 14 00:06:34.856483 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 14 00:06:34.858597 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 14 00:06:34.860823 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 14 00:06:34.862847 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 14 00:06:34.871779 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jan 14 00:06:34.872668 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 14 00:06:34.872703 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 14 00:06:34.878814 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 14 00:06:34.882702 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (897) Jan 14 00:06:34.882726 kernel: BTRFS info (device sda6): first mount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 14 00:06:34.883907 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jan 14 00:06:34.886705 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Jan 14 00:06:34.891816 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 14 00:06:34.891861 kernel: BTRFS info (device sda6): turning on async discard Jan 14 00:06:34.891873 kernel: BTRFS info (device sda6): enabling free space tree Jan 14 00:06:34.894337 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 14 00:06:34.941775 coreos-metadata[899]: Jan 14 00:06:34.941 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jan 14 00:06:34.944010 coreos-metadata[899]: Jan 14 00:06:34.943 INFO Fetch successful Jan 14 00:06:34.944010 coreos-metadata[899]: Jan 14 00:06:34.943 INFO wrote hostname ci-4547-0-0-n-fb1a601aa4 to /sysroot/etc/hostname Jan 14 00:06:34.947755 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 14 00:06:34.948000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:35.061866 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 14 00:06:35.061000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:35.063402 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 14 00:06:35.066741 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 14 00:06:35.092551 kernel: BTRFS info (device sda6): last unmount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 14 00:06:35.111578 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 14 00:06:35.112000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:35.115629 ignition[1002]: INFO : Ignition 2.24.0 Jan 14 00:06:35.115629 ignition[1002]: INFO : Stage: mount Jan 14 00:06:35.116723 ignition[1002]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 00:06:35.116723 ignition[1002]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 00:06:35.116723 ignition[1002]: INFO : mount: mount passed Jan 14 00:06:35.116723 ignition[1002]: INFO : Ignition finished successfully Jan 14 00:06:35.118000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:35.118619 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 14 00:06:35.121145 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 14 00:06:35.416441 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 14 00:06:35.418003 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... 
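flatcar-metadata-hostname.service, finished above, asks the same metadata service for the instance hostname and writes it into the not-yet-switched-to root. A Python sketch of that step follows; the real fetcher is the coreos-metadata (Afterburn) binary shown in the log, and the URL and target path are taken from the log lines above.

    # Sketch only: mirrors what coreos-metadata logs above, not its real code.
    import pathlib
    import urllib.request

    HOSTNAME_URL = "http://169.254.169.254/hetzner/v1/metadata/hostname"

    def write_hostname(sysroot: str = "/sysroot") -> str:
        with urllib.request.urlopen(HOSTNAME_URL, timeout=10) as resp:
            hostname = resp.read().decode().strip()
        pathlib.Path(sysroot, "etc/hostname").write_text(hostname + "\n")
        return hostname

    print("wrote hostname", write_hostname(), "to /sysroot/etc/hostname")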
Jan 14 00:06:35.460578 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1012) Jan 14 00:06:35.462725 kernel: BTRFS info (device sda6): first mount of filesystem 2541ae65-ff33-437e-8e42-0fd50870835d Jan 14 00:06:35.463113 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jan 14 00:06:35.466745 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 14 00:06:35.466819 kernel: BTRFS info (device sda6): turning on async discard Jan 14 00:06:35.466841 kernel: BTRFS info (device sda6): enabling free space tree Jan 14 00:06:35.469277 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Jan 14 00:06:35.500554 ignition[1029]: INFO : Ignition 2.24.0 Jan 14 00:06:35.500554 ignition[1029]: INFO : Stage: files Jan 14 00:06:35.500554 ignition[1029]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 00:06:35.500554 ignition[1029]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 00:06:35.504032 ignition[1029]: DEBUG : files: compiled without relabeling support, skipping Jan 14 00:06:35.504890 ignition[1029]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 14 00:06:35.504890 ignition[1029]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 14 00:06:35.510036 ignition[1029]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 14 00:06:35.512050 ignition[1029]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 14 00:06:35.514237 ignition[1029]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 14 00:06:35.514063 unknown[1029]: wrote ssh authorized keys file for user: core Jan 14 00:06:35.521334 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 14 00:06:35.523681 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Jan 14 00:06:35.618142 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 14 00:06:35.693933 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 14 00:06:35.693933 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 14 00:06:35.697186 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 14 00:06:35.697186 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 14 00:06:35.697186 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 14 00:06:35.697186 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 00:06:35.697186 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 14 00:06:35.697186 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 00:06:35.697186 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing 
file "/sysroot/home/core/nfs-pvc.yaml" Jan 14 00:06:35.705658 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 14 00:06:35.705658 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 14 00:06:35.705658 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 14 00:06:35.709386 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 14 00:06:35.709386 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 14 00:06:35.709386 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Jan 14 00:06:36.061963 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 14 00:06:36.538585 ignition[1029]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Jan 14 00:06:36.538585 ignition[1029]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 14 00:06:36.543260 ignition[1029]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 00:06:36.545079 ignition[1029]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 14 00:06:36.545079 ignition[1029]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 14 00:06:36.545079 ignition[1029]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jan 14 00:06:36.545079 ignition[1029]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 14 00:06:36.545079 ignition[1029]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 14 00:06:36.545079 ignition[1029]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jan 14 00:06:36.545079 ignition[1029]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jan 14 00:06:36.545079 ignition[1029]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Jan 14 00:06:36.556398 kernel: kauditd_printk_skb: 29 callbacks suppressed Jan 14 00:06:36.556445 kernel: audit: type=1130 audit(1768349196.551:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.551000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:06:36.556547 ignition[1029]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 14 00:06:36.556547 ignition[1029]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 14 00:06:36.556547 ignition[1029]: INFO : files: files passed Jan 14 00:06:36.556547 ignition[1029]: INFO : Ignition finished successfully Jan 14 00:06:36.550569 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 14 00:06:36.555682 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 14 00:06:36.560824 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Jan 14 00:06:36.575574 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 14 00:06:36.576393 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 14 00:06:36.578000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.583113 kernel: audit: type=1130 audit(1768349196.578:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.583149 kernel: audit: type=1131 audit(1768349196.578:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.578000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.587226 initrd-setup-root-after-ignition[1060]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 00:06:36.587226 initrd-setup-root-after-ignition[1060]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 14 00:06:36.591121 initrd-setup-root-after-ignition[1064]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 14 00:06:36.594503 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 00:06:36.595000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.596446 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 14 00:06:36.599743 kernel: audit: type=1130 audit(1768349196.595:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.600956 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 14 00:06:36.656670 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 14 00:06:36.656833 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 14 00:06:36.665491 kernel: audit: type=1130 audit(1768349196.659:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:06:36.665575 kernel: audit: type=1131 audit(1768349196.659:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.659000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.659000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.660222 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Jan 14 00:06:36.668721 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 14 00:06:36.670644 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 14 00:06:36.672945 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 14 00:06:36.707000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.706978 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 14 00:06:36.710748 kernel: audit: type=1130 audit(1768349196.707:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.712371 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 14 00:06:36.732599 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 14 00:06:36.732859 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 14 00:06:36.734935 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 00:06:36.736350 systemd[1]: Stopped target timers.target - Timer Units. Jan 14 00:06:36.737005 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 14 00:06:36.737131 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 14 00:06:36.738000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.739582 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 14 00:06:36.742375 kernel: audit: type=1131 audit(1768349196.738:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.741952 systemd[1]: Stopped target basic.target - Basic System. Jan 14 00:06:36.743169 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 14 00:06:36.744915 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 14 00:06:36.745819 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 14 00:06:36.747098 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 14 00:06:36.748229 systemd[1]: Stopped target remote-fs.target - Remote File Systems. 
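The Ignition files stage a few lines earlier wrote the core user's ssh keys, /etc/flatcar/update.conf, and a prepare-helm.service unit into /sysroot. The user config that drove it is never printed in the log; the sketch below is a hypothetical, minimal Ignition config (spec 3.4.0, rendered as JSON from Python) of the general shape that would produce operations like those, with placeholder key material and placeholder unit contents.

    # Hypothetical config for illustration; the real user config is not in the log.
    import json

    config = {
        "ignition": {"version": "3.4.0"},
        "passwd": {
            "users": [{"name": "core",
                       "sshAuthorizedKeys": ["ssh-ed25519 AAAA... placeholder"]}],
        },
        "storage": {
            "files": [{"path": "/etc/flatcar/update.conf",
                       "mode": 420,  # 0644 expressed in decimal
                       "contents": {"source": "data:,REBOOT_STRATEGY%3Doff%0A"}}],
        },
        "systemd": {
            "units": [{"name": "prepare-helm.service",
                       "enabled": True,
                       "contents": "[Unit]\nDescription=Unpack helm (placeholder)\n\n"
                                   "[Service]\nType=oneshot\n"
                                   "ExecStart=/usr/bin/tar -C /opt/bin -xzf "
                                   "/opt/helm-v3.17.3-linux-arm64.tar.gz\n\n"
                                   "[Install]\nWantedBy=multi-user.target\n"}],
        },
    }

    print(json.dumps(config, indent=2))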
Jan 14 00:06:36.749381 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 14 00:06:36.751594 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 14 00:06:36.753626 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 14 00:06:36.755229 systemd[1]: Stopped target swap.target - Swaps. Jan 14 00:06:36.755749 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Jan 14 00:06:36.755876 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 14 00:06:36.759222 kernel: audit: type=1131 audit(1768349196.756:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.756000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.758526 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 14 00:06:36.759820 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 00:06:36.761181 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 14 00:06:36.761268 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 00:06:36.762749 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 14 00:06:36.764000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.762872 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 14 00:06:36.767986 kernel: audit: type=1131 audit(1768349196.764:49): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.766000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.764834 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 14 00:06:36.768000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.764968 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 14 00:06:36.769000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.767556 systemd[1]: ignition-files.service: Deactivated successfully. Jan 14 00:06:36.767662 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 14 00:06:36.768617 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jan 14 00:06:36.768720 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 14 00:06:36.770709 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 14 00:06:36.773113 systemd[1]: kmod-static-nodes.service: Deactivated successfully. 
Jan 14 00:06:36.773000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.773296 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 00:06:36.777426 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 14 00:06:36.779000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.778057 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Jan 14 00:06:36.780000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.778183 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 00:06:36.781000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.779923 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 14 00:06:36.780021 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 00:06:36.780963 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 14 00:06:36.781065 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 14 00:06:36.791076 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 14 00:06:36.795572 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 14 00:06:36.795000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.795000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.803470 ignition[1084]: INFO : Ignition 2.24.0 Jan 14 00:06:36.803470 ignition[1084]: INFO : Stage: umount Jan 14 00:06:36.805619 ignition[1084]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 14 00:06:36.805619 ignition[1084]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 14 00:06:36.805619 ignition[1084]: INFO : umount: umount passed Jan 14 00:06:36.805619 ignition[1084]: INFO : Ignition finished successfully Jan 14 00:06:36.810000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.812000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.806363 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 14 00:06:36.813000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:06:36.814000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.819000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.806491 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 14 00:06:36.810869 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 14 00:06:36.810961 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 14 00:06:36.812439 systemd[1]: ignition-kargs.service: Deactivated successfully. Jan 14 00:06:36.812536 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 14 00:06:36.814043 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 14 00:06:36.814095 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 14 00:06:36.814777 systemd[1]: Stopped target network.target - Network. Jan 14 00:06:36.815299 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 14 00:06:36.815355 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 14 00:06:36.819701 systemd[1]: Stopped target paths.target - Path Units. Jan 14 00:06:36.824113 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 14 00:06:36.828034 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 00:06:36.829417 systemd[1]: Stopped target slices.target - Slice Units. Jan 14 00:06:36.839020 systemd[1]: Stopped target sockets.target - Socket Units. Jan 14 00:06:36.839587 systemd[1]: iscsid.socket: Deactivated successfully. Jan 14 00:06:36.844000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.849000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.839652 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 14 00:06:36.840165 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 14 00:06:36.840195 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 14 00:06:36.843495 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 14 00:06:36.843599 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 14 00:06:36.860000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.844837 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 14 00:06:36.862000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.844906 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 14 00:06:36.847632 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 14 00:06:36.847697 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. 
Jan 14 00:06:36.849741 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 14 00:06:36.851770 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 14 00:06:36.866000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.853788 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 14 00:06:36.857154 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 14 00:06:36.857304 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 14 00:06:36.861129 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 14 00:06:36.861231 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 14 00:06:36.866349 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 14 00:06:36.866471 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 14 00:06:36.872000 audit: BPF prog-id=6 op=UNLOAD Jan 14 00:06:36.872000 audit: BPF prog-id=9 op=UNLOAD Jan 14 00:06:36.873295 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 14 00:06:36.874122 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 14 00:06:36.874224 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 14 00:06:36.875000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.875136 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 14 00:06:36.875192 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 14 00:06:36.876999 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 14 00:06:36.879000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.878672 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 14 00:06:36.878739 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 14 00:06:36.881800 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 14 00:06:36.881000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.881858 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 14 00:06:36.882493 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 14 00:06:36.884348 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 14 00:06:36.884000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.885094 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 00:06:36.903918 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 14 00:06:36.904080 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. 
Jan 14 00:06:36.906000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.907388 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 14 00:06:36.907469 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 14 00:06:36.908356 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 14 00:06:36.908393 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 00:06:36.911809 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Jan 14 00:06:36.911000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.911881 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 14 00:06:36.913000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.912984 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 14 00:06:36.914000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.913038 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 14 00:06:36.913727 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 14 00:06:36.913778 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 14 00:06:36.919000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.916450 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 14 00:06:36.920000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.917559 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 14 00:06:36.917628 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 00:06:36.920617 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 14 00:06:36.920676 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 00:06:36.922708 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 00:06:36.925000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.922761 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:06:36.926278 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 14 00:06:36.926383 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 14 00:06:36.926000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 14 00:06:36.935723 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 14 00:06:36.935950 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 14 00:06:36.939000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.939000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:36.940172 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 14 00:06:36.942017 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 14 00:06:36.960036 systemd[1]: Switching root. Jan 14 00:06:36.986893 systemd-journald[349]: Journal stopped Jan 14 00:06:37.933936 systemd-journald[349]: Received SIGTERM from PID 1 (systemd). Jan 14 00:06:37.934014 kernel: SELinux: policy capability network_peer_controls=1 Jan 14 00:06:37.934028 kernel: SELinux: policy capability open_perms=1 Jan 14 00:06:37.934039 kernel: SELinux: policy capability extended_socket_class=1 Jan 14 00:06:37.934049 kernel: SELinux: policy capability always_check_network=0 Jan 14 00:06:37.934063 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 14 00:06:37.934075 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 14 00:06:37.934089 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 14 00:06:37.934099 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 14 00:06:37.934110 kernel: SELinux: policy capability userspace_initial_context=0 Jan 14 00:06:37.934120 systemd[1]: Successfully loaded SELinux policy in 52.507ms. Jan 14 00:06:37.934142 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.624ms. Jan 14 00:06:37.934154 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 14 00:06:37.934166 systemd[1]: Detected virtualization kvm. Jan 14 00:06:37.934180 systemd[1]: Detected architecture arm64. Jan 14 00:06:37.934192 systemd[1]: Detected first boot. Jan 14 00:06:37.934203 systemd[1]: Hostname set to . Jan 14 00:06:37.934214 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 14 00:06:37.934226 zram_generator::config[1127]: No configuration found. Jan 14 00:06:37.934241 kernel: NET: Registered PF_VSOCK protocol family Jan 14 00:06:37.934273 systemd[1]: Populated /etc with preset unit settings. Jan 14 00:06:37.934285 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 14 00:06:37.934296 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 14 00:06:37.934308 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 14 00:06:37.934319 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 14 00:06:37.934330 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 14 00:06:37.934341 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 14 00:06:37.934353 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. 
Jan 14 00:06:37.934364 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 14 00:06:37.934376 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 14 00:06:37.934386 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 14 00:06:37.934397 systemd[1]: Created slice user.slice - User and Session Slice. Jan 14 00:06:37.934407 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 14 00:06:37.934421 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 14 00:06:37.934433 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 14 00:06:37.934444 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 14 00:06:37.934455 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 14 00:06:37.934466 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 14 00:06:37.934477 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 14 00:06:37.934487 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 14 00:06:37.934500 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 14 00:06:37.934511 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 14 00:06:37.934539 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 14 00:06:37.934550 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 14 00:06:37.937602 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 14 00:06:37.937619 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 14 00:06:37.937638 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 14 00:06:37.937650 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 14 00:06:37.937662 systemd[1]: Reached target slices.target - Slice Units. Jan 14 00:06:37.937674 systemd[1]: Reached target swap.target - Swaps. Jan 14 00:06:37.937686 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 14 00:06:37.937698 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 14 00:06:37.937713 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 14 00:06:37.937726 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 14 00:06:37.937737 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 14 00:06:37.937748 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 14 00:06:37.937760 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 14 00:06:37.937771 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 14 00:06:37.937784 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 14 00:06:37.937795 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 14 00:06:37.937808 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 14 00:06:37.937820 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... 
Jan 14 00:06:37.937831 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 14 00:06:37.937845 systemd[1]: Mounting media.mount - External Media Directory... Jan 14 00:06:37.937856 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 14 00:06:37.937867 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 14 00:06:37.937878 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Jan 14 00:06:37.937891 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 14 00:06:37.937902 systemd[1]: Reached target machines.target - Containers. Jan 14 00:06:37.937913 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 14 00:06:37.937924 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 00:06:37.937935 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 14 00:06:37.937948 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 14 00:06:37.937959 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 00:06:37.937971 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 00:06:37.937982 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 00:06:37.937993 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 14 00:06:37.938005 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 00:06:37.938016 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 14 00:06:37.938029 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 14 00:06:37.938040 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 14 00:06:37.938051 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 14 00:06:37.938062 systemd[1]: Stopped systemd-fsck-usr.service. Jan 14 00:06:37.938074 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 00:06:37.938087 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 14 00:06:37.938099 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 14 00:06:37.938113 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 14 00:06:37.938126 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 14 00:06:37.938141 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 14 00:06:37.938155 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 14 00:06:37.938168 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 14 00:06:37.938181 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 14 00:06:37.938193 systemd[1]: Mounted media.mount - External Media Directory. Jan 14 00:06:37.938204 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Jan 14 00:06:37.938216 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 14 00:06:37.938228 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 14 00:06:37.938240 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 14 00:06:37.938268 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 14 00:06:37.938281 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Jan 14 00:06:37.938292 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 14 00:06:37.938303 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 00:06:37.938314 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 00:06:37.938328 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 14 00:06:37.938339 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 00:06:37.938350 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 00:06:37.938361 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 14 00:06:37.938373 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 14 00:06:37.938383 kernel: fuse: init (API version 7.41) Jan 14 00:06:37.938395 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 14 00:06:37.938407 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 14 00:06:37.938419 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 14 00:06:37.938429 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 14 00:06:37.938441 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 14 00:06:37.938452 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 00:06:37.938464 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 00:06:37.938476 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 14 00:06:37.938487 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 00:06:37.938499 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 14 00:06:37.938510 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 00:06:37.944094 systemd-journald[1190]: Collecting audit messages is enabled. Jan 14 00:06:37.944130 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 14 00:06:37.944142 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 14 00:06:37.944160 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 14 00:06:37.944171 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 14 00:06:37.944184 systemd-journald[1190]: Journal started Jan 14 00:06:37.944206 systemd-journald[1190]: Runtime Journal (/run/log/journal/f1028fbb609b41beaa27b1d33249148f) is 8M, max 76.5M, 68.5M free. 
Jan 14 00:06:37.806000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.808000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.809000 audit: BPF prog-id=14 op=UNLOAD Jan 14 00:06:37.809000 audit: BPF prog-id=13 op=UNLOAD Jan 14 00:06:37.814000 audit: BPF prog-id=15 op=LOAD Jan 14 00:06:37.814000 audit: BPF prog-id=16 op=LOAD Jan 14 00:06:37.814000 audit: BPF prog-id=17 op=LOAD Jan 14 00:06:37.862000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.873000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.873000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.877000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.877000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.881000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.881000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.884000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.884000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.887000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.890000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:06:37.925000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 14 00:06:37.925000 audit[1190]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=4 a1=ffffda177cc0 a2=4000 a3=0 items=0 ppid=1 pid=1190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:06:37.925000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 14 00:06:37.629687 systemd[1]: Queued start job for default target multi-user.target. Jan 14 00:06:37.655420 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 14 00:06:37.656057 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 14 00:06:37.951968 systemd[1]: Started systemd-journald.service - Journal Service. Jan 14 00:06:37.945000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.945000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.946000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.950000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.951000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.948656 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 14 00:06:37.950998 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 14 00:06:37.952726 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 14 00:06:37.971609 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 14 00:06:37.975544 kernel: ACPI: bus type drm_connector registered Jan 14 00:06:37.978587 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 14 00:06:37.980887 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 00:06:37.981064 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 14 00:06:37.981000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.981000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.982122 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. 
Jan 14 00:06:37.982000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:37.985414 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Jan 14 00:06:37.992744 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 14 00:06:38.003565 kernel: loop1: detected capacity change from 0 to 211168 Jan 14 00:06:38.012484 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 14 00:06:38.015000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.033618 systemd-journald[1190]: Time spent on flushing to /var/log/journal/f1028fbb609b41beaa27b1d33249148f is 56.999ms for 1292 entries. Jan 14 00:06:38.033618 systemd-journald[1190]: System Journal (/var/log/journal/f1028fbb609b41beaa27b1d33249148f) is 8M, max 588.1M, 580.1M free. Jan 14 00:06:38.103572 systemd-journald[1190]: Received client request to flush runtime journal. Jan 14 00:06:38.103611 kernel: loop2: detected capacity change from 0 to 100192 Jan 14 00:06:38.103625 kernel: loop3: detected capacity change from 0 to 8 Jan 14 00:06:38.038000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.059000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.037575 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 14 00:06:38.057954 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 14 00:06:38.064827 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 14 00:06:38.106587 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 14 00:06:38.107000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.122544 kernel: loop4: detected capacity change from 0 to 45344 Jan 14 00:06:38.127141 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 14 00:06:38.127000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.130000 audit: BPF prog-id=18 op=LOAD Jan 14 00:06:38.132000 audit: BPF prog-id=19 op=LOAD Jan 14 00:06:38.132000 audit: BPF prog-id=20 op=LOAD Jan 14 00:06:38.134862 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 14 00:06:38.136000 audit: BPF prog-id=21 op=LOAD Jan 14 00:06:38.138785 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 14 00:06:38.143053 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Jan 14 00:06:38.148000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.147242 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 14 00:06:38.155000 audit: BPF prog-id=22 op=LOAD Jan 14 00:06:38.155000 audit: BPF prog-id=23 op=LOAD Jan 14 00:06:38.155000 audit: BPF prog-id=24 op=LOAD Jan 14 00:06:38.156836 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 14 00:06:38.159000 audit: BPF prog-id=25 op=LOAD Jan 14 00:06:38.159000 audit: BPF prog-id=26 op=LOAD Jan 14 00:06:38.159000 audit: BPF prog-id=27 op=LOAD Jan 14 00:06:38.166421 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 14 00:06:38.182548 kernel: loop5: detected capacity change from 0 to 211168 Jan 14 00:06:38.215362 systemd-tmpfiles[1267]: ACLs are not supported, ignoring. Jan 14 00:06:38.215379 systemd-tmpfiles[1267]: ACLs are not supported, ignoring. Jan 14 00:06:38.223787 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 14 00:06:38.224000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.227585 kernel: loop6: detected capacity change from 0 to 100192 Jan 14 00:06:38.230601 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 14 00:06:38.230000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.231844 systemd-nsresourced[1269]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 14 00:06:38.236033 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 14 00:06:38.236000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.246552 kernel: loop7: detected capacity change from 0 to 8 Jan 14 00:06:38.250814 kernel: loop1: detected capacity change from 0 to 45344 Jan 14 00:06:38.260919 (sd-merge)[1273]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Jan 14 00:06:38.266309 (sd-merge)[1273]: Merged extensions into '/usr'. Jan 14 00:06:38.275792 systemd[1]: Reload requested from client PID 1218 ('systemd-sysext') (unit systemd-sysext.service)... Jan 14 00:06:38.275811 systemd[1]: Reloading... Jan 14 00:06:38.374198 systemd-oomd[1265]: No swap; memory pressure usage will be degraded Jan 14 00:06:38.395980 systemd-resolved[1266]: Positive Trust Anchors: Jan 14 00:06:38.397576 systemd-resolved[1266]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 14 00:06:38.397584 systemd-resolved[1266]: . 
IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 14 00:06:38.397619 systemd-resolved[1266]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 14 00:06:38.403006 zram_generator::config[1321]: No configuration found. Jan 14 00:06:38.412724 systemd-resolved[1266]: Using system hostname 'ci-4547-0-0-n-fb1a601aa4'. Jan 14 00:06:38.582160 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 14 00:06:38.582750 systemd[1]: Reloading finished in 306 ms. Jan 14 00:06:38.599000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.598423 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 14 00:06:38.599399 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 14 00:06:38.600000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.602169 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 14 00:06:38.602000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.604598 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 14 00:06:38.612782 systemd[1]: Starting ensure-sysext.service... Jan 14 00:06:38.619440 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Jan 14 00:06:38.626000 audit: BPF prog-id=28 op=LOAD Jan 14 00:06:38.626000 audit: BPF prog-id=25 op=UNLOAD Jan 14 00:06:38.627000 audit: BPF prog-id=29 op=LOAD Jan 14 00:06:38.627000 audit: BPF prog-id=30 op=LOAD Jan 14 00:06:38.627000 audit: BPF prog-id=26 op=UNLOAD Jan 14 00:06:38.627000 audit: BPF prog-id=27 op=UNLOAD Jan 14 00:06:38.629000 audit: BPF prog-id=31 op=LOAD Jan 14 00:06:38.633000 audit: BPF prog-id=22 op=UNLOAD Jan 14 00:06:38.633000 audit: BPF prog-id=32 op=LOAD Jan 14 00:06:38.633000 audit: BPF prog-id=33 op=LOAD Jan 14 00:06:38.633000 audit: BPF prog-id=23 op=UNLOAD Jan 14 00:06:38.633000 audit: BPF prog-id=24 op=UNLOAD Jan 14 00:06:38.636000 audit: BPF prog-id=34 op=LOAD Jan 14 00:06:38.636000 audit: BPF prog-id=18 op=UNLOAD Jan 14 00:06:38.636000 audit: BPF prog-id=35 op=LOAD Jan 14 00:06:38.636000 audit: BPF prog-id=36 op=LOAD Jan 14 00:06:38.636000 audit: BPF prog-id=19 op=UNLOAD Jan 14 00:06:38.636000 audit: BPF prog-id=20 op=UNLOAD Jan 14 00:06:38.637000 audit: BPF prog-id=37 op=LOAD Jan 14 00:06:38.638000 audit: BPF prog-id=15 op=UNLOAD Jan 14 00:06:38.638000 audit: BPF prog-id=38 op=LOAD Jan 14 00:06:38.638000 audit: BPF prog-id=39 op=LOAD Jan 14 00:06:38.638000 audit: BPF prog-id=16 op=UNLOAD Jan 14 00:06:38.638000 audit: BPF prog-id=17 op=UNLOAD Jan 14 00:06:38.639000 audit: BPF prog-id=40 op=LOAD Jan 14 00:06:38.640000 audit: BPF prog-id=21 op=UNLOAD Jan 14 00:06:38.653691 systemd[1]: Reload requested from client PID 1354 ('systemctl') (unit ensure-sysext.service)... Jan 14 00:06:38.653708 systemd[1]: Reloading... Jan 14 00:06:38.660370 systemd-tmpfiles[1355]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 14 00:06:38.660403 systemd-tmpfiles[1355]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 14 00:06:38.660667 systemd-tmpfiles[1355]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 14 00:06:38.661663 systemd-tmpfiles[1355]: ACLs are not supported, ignoring. Jan 14 00:06:38.661724 systemd-tmpfiles[1355]: ACLs are not supported, ignoring. Jan 14 00:06:38.666983 systemd-tmpfiles[1355]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 00:06:38.666996 systemd-tmpfiles[1355]: Skipping /boot Jan 14 00:06:38.677236 systemd-tmpfiles[1355]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 00:06:38.677282 systemd-tmpfiles[1355]: Skipping /boot Jan 14 00:06:38.722568 zram_generator::config[1383]: No configuration found. Jan 14 00:06:38.887137 systemd[1]: Reloading finished in 233 ms. Jan 14 00:06:38.913578 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 14 00:06:38.913000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:06:38.916000 audit: BPF prog-id=41 op=LOAD Jan 14 00:06:38.917000 audit: BPF prog-id=34 op=UNLOAD Jan 14 00:06:38.917000 audit: BPF prog-id=42 op=LOAD Jan 14 00:06:38.917000 audit: BPF prog-id=43 op=LOAD Jan 14 00:06:38.917000 audit: BPF prog-id=35 op=UNLOAD Jan 14 00:06:38.917000 audit: BPF prog-id=36 op=UNLOAD Jan 14 00:06:38.918000 audit: BPF prog-id=44 op=LOAD Jan 14 00:06:38.918000 audit: BPF prog-id=37 op=UNLOAD Jan 14 00:06:38.918000 audit: BPF prog-id=45 op=LOAD Jan 14 00:06:38.918000 audit: BPF prog-id=46 op=LOAD Jan 14 00:06:38.918000 audit: BPF prog-id=38 op=UNLOAD Jan 14 00:06:38.918000 audit: BPF prog-id=39 op=UNLOAD Jan 14 00:06:38.919000 audit: BPF prog-id=47 op=LOAD Jan 14 00:06:38.919000 audit: BPF prog-id=40 op=UNLOAD Jan 14 00:06:38.920000 audit: BPF prog-id=48 op=LOAD Jan 14 00:06:38.920000 audit: BPF prog-id=31 op=UNLOAD Jan 14 00:06:38.921000 audit: BPF prog-id=49 op=LOAD Jan 14 00:06:38.921000 audit: BPF prog-id=50 op=LOAD Jan 14 00:06:38.921000 audit: BPF prog-id=32 op=UNLOAD Jan 14 00:06:38.921000 audit: BPF prog-id=33 op=UNLOAD Jan 14 00:06:38.932000 audit: BPF prog-id=51 op=LOAD Jan 14 00:06:38.933000 audit: BPF prog-id=28 op=UNLOAD Jan 14 00:06:38.933000 audit: BPF prog-id=52 op=LOAD Jan 14 00:06:38.933000 audit: BPF prog-id=53 op=LOAD Jan 14 00:06:38.933000 audit: BPF prog-id=29 op=UNLOAD Jan 14 00:06:38.933000 audit: BPF prog-id=30 op=UNLOAD Jan 14 00:06:38.938834 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 14 00:06:38.939000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:38.948554 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 14 00:06:38.950669 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 14 00:06:38.955070 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 14 00:06:38.963382 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 14 00:06:38.967092 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Jan 14 00:06:38.968000 audit: BPF prog-id=7 op=UNLOAD Jan 14 00:06:38.968000 audit: BPF prog-id=8 op=UNLOAD Jan 14 00:06:38.970000 audit: BPF prog-id=54 op=LOAD Jan 14 00:06:38.970000 audit: BPF prog-id=55 op=LOAD Jan 14 00:06:38.971870 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 14 00:06:38.975514 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 14 00:06:38.981059 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 14 00:06:38.985907 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 00:06:38.992024 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 14 00:06:38.996769 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 14 00:06:39.005240 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 14 00:06:39.006632 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. 
Jan 14 00:06:39.006870 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 00:06:39.006967 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 00:06:39.013398 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 00:06:39.014094 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 00:06:39.014296 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 00:06:39.014390 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 00:06:39.019474 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 14 00:06:39.021000 audit[1432]: SYSTEM_BOOT pid=1432 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.037564 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 14 00:06:39.038767 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 14 00:06:39.038967 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 14 00:06:39.039055 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 14 00:06:39.051000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.051338 systemd[1]: Finished ensure-sysext.service. Jan 14 00:06:39.057449 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 14 00:06:39.058000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.058937 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 14 00:06:39.060610 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 14 00:06:39.062000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.062000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:06:39.066000 audit: BPF prog-id=56 op=LOAD Jan 14 00:06:39.069678 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 14 00:06:39.071389 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 14 00:06:39.078000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.078000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.078578 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 14 00:06:39.079645 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 14 00:06:39.095315 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 14 00:06:39.096000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.097938 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 14 00:06:39.100573 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 14 00:06:39.103000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.103000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.104453 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 14 00:06:39.105384 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 14 00:06:39.106000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.106000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.107403 systemd-udevd[1431]: Using default interface naming scheme 'v257'. Jan 14 00:06:39.111129 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 14 00:06:39.153605 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 14 00:06:39.153000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.154639 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
Jan 14 00:06:39.165231 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 14 00:06:39.169000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:06:39.170000 audit: BPF prog-id=57 op=LOAD Jan 14 00:06:39.171000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 14 00:06:39.171000 audit[1468]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffc198aca0 a2=420 a3=0 items=0 ppid=1427 pid=1468 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:06:39.171000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 00:06:39.172611 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 14 00:06:39.172862 augenrules[1468]: No rules Jan 14 00:06:39.180002 systemd[1]: audit-rules.service: Deactivated successfully. Jan 14 00:06:39.180674 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 14 00:06:39.210984 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 14 00:06:39.212152 systemd[1]: Reached target time-set.target - System Time Set. Jan 14 00:06:39.314207 systemd-networkd[1476]: lo: Link UP Jan 14 00:06:39.314222 systemd-networkd[1476]: lo: Gained carrier Jan 14 00:06:39.318436 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 14 00:06:39.319746 systemd[1]: Reached target network.target - Network. Jan 14 00:06:39.322728 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 14 00:06:39.326688 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 14 00:06:39.372075 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 14 00:06:39.373944 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Jan 14 00:06:39.453591 systemd-networkd[1476]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:06:39.453603 systemd-networkd[1476]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 14 00:06:39.454279 systemd-networkd[1476]: eth0: Link UP Jan 14 00:06:39.454460 systemd-networkd[1476]: eth0: Gained carrier Jan 14 00:06:39.454480 systemd-networkd[1476]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:06:39.479779 systemd-networkd[1476]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:06:39.479794 systemd-networkd[1476]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. 
Jan 14 00:06:39.481683 systemd-networkd[1476]: eth1: Link UP Jan 14 00:06:39.482884 systemd-networkd[1476]: eth1: Gained carrier Jan 14 00:06:39.482925 systemd-networkd[1476]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 14 00:06:39.515712 systemd-networkd[1476]: eth0: DHCPv4 address 46.224.77.139/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 14 00:06:39.516816 systemd-timesyncd[1449]: Network configuration changed, trying to establish connection. Jan 14 00:06:39.521726 systemd-networkd[1476]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 14 00:06:39.522989 systemd-timesyncd[1449]: Network configuration changed, trying to establish connection. Jan 14 00:06:39.526843 ldconfig[1429]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 14 00:06:39.535057 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 14 00:06:39.539150 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 14 00:06:39.553618 kernel: mousedev: PS/2 mouse device common for all mice Jan 14 00:06:39.580608 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 14 00:06:39.590329 systemd[1]: Reached target sysinit.target - System Initialization. Jan 14 00:06:39.591727 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 14 00:06:39.592855 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 14 00:06:39.594640 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 14 00:06:39.595375 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 14 00:06:39.596423 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 14 00:06:39.597617 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 14 00:06:39.598438 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 14 00:06:39.599601 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Jan 14 00:06:39.599641 systemd[1]: Reached target paths.target - Path Units. Jan 14 00:06:39.600326 systemd[1]: Reached target timers.target - Timer Units. Jan 14 00:06:39.603762 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 14 00:06:39.607019 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 14 00:06:39.612750 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 14 00:06:39.613723 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 14 00:06:39.614465 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 14 00:06:39.622364 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 14 00:06:39.623805 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 14 00:06:39.626384 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 14 00:06:39.629869 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. 
Jan 14 00:06:39.632462 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jan 14 00:06:39.632958 systemd[1]: Reached target sockets.target - Socket Units. Jan 14 00:06:39.633840 systemd[1]: Reached target basic.target - Basic System. Jan 14 00:06:39.634627 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 14 00:06:39.634655 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 14 00:06:39.637961 systemd[1]: Starting containerd.service - containerd container runtime... Jan 14 00:06:39.643589 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 14 00:06:39.647604 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 14 00:06:39.652905 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 14 00:06:39.658789 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 14 00:06:39.665340 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 14 00:06:39.666409 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 14 00:06:39.672453 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 14 00:06:39.674759 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 14 00:06:39.681473 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jan 14 00:06:39.684760 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 14 00:06:39.688999 jq[1531]: false Jan 14 00:06:39.699749 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 14 00:06:39.704779 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Jan 14 00:06:39.718235 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 14 00:06:39.724766 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 14 00:06:39.725402 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 14 00:06:39.728053 systemd[1]: Starting update-engine.service - Update Engine... Jan 14 00:06:39.734433 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 14 00:06:39.741966 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 14 00:06:39.743487 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 14 00:06:39.745816 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 14 00:06:39.757538 coreos-metadata[1528]: Jan 14 00:06:39.756 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jan 14 00:06:39.758938 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. 
Jan 14 00:06:39.759663 extend-filesystems[1532]: Found /dev/sda6 Jan 14 00:06:39.766152 coreos-metadata[1528]: Jan 14 00:06:39.765 INFO Fetch successful Jan 14 00:06:39.766152 coreos-metadata[1528]: Jan 14 00:06:39.766 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jan 14 00:06:39.768547 coreos-metadata[1528]: Jan 14 00:06:39.766 INFO Fetch successful Jan 14 00:06:39.774349 extend-filesystems[1532]: Found /dev/sda9 Jan 14 00:06:39.781418 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 14 00:06:39.785744 extend-filesystems[1532]: Checking size of /dev/sda9 Jan 14 00:06:39.805309 jq[1550]: true Jan 14 00:06:39.806479 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 14 00:06:39.816044 tar[1553]: linux-arm64/LICENSE Jan 14 00:06:39.820725 tar[1553]: linux-arm64/helm Jan 14 00:06:39.835494 systemd[1]: motdgen.service: Deactivated successfully. Jan 14 00:06:39.836007 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 14 00:06:39.841080 update_engine[1548]: I20260114 00:06:39.831459 1548 main.cc:92] Flatcar Update Engine starting Jan 14 00:06:39.846622 extend-filesystems[1532]: Resized partition /dev/sda9 Jan 14 00:06:39.853085 extend-filesystems[1599]: resize2fs 1.47.3 (8-Jul-2025) Jan 14 00:06:39.868757 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 8410107 blocks Jan 14 00:06:39.868790 jq[1584]: true Jan 14 00:06:39.905878 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 14 00:06:39.905502 dbus-daemon[1529]: [system] SELinux support is enabled Jan 14 00:06:39.908931 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 14 00:06:39.908969 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 14 00:06:39.910842 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 14 00:06:39.910870 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jan 14 00:06:39.932883 update_engine[1548]: I20260114 00:06:39.932720 1548 update_check_scheduler.cc:74] Next update check in 10m32s Jan 14 00:06:39.933747 systemd[1]: Started update-engine.service - Update Engine. Jan 14 00:06:39.958225 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 14 00:06:40.015577 kernel: EXT4-fs (sda9): resized filesystem to 8410107 Jan 14 00:06:40.032968 extend-filesystems[1599]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jan 14 00:06:40.032968 extend-filesystems[1599]: old_desc_blocks = 1, new_desc_blocks = 5 Jan 14 00:06:40.032968 extend-filesystems[1599]: The filesystem on /dev/sda9 is now 8410107 (4k) blocks long. Jan 14 00:06:40.045828 extend-filesystems[1532]: Resized filesystem in /dev/sda9 Jan 14 00:06:40.036780 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 14 00:06:40.053924 bash[1624]: Updated "/home/core/.ssh/authorized_keys" Jan 14 00:06:40.037182 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 14 00:06:40.054365 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. 
Jan 14 00:06:40.055716 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 14 00:06:40.057422 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 14 00:06:40.059084 systemd[1]: Starting sshkeys.service... Jan 14 00:06:40.075566 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 14 00:06:40.078925 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 14 00:06:40.097747 systemd-logind[1545]: New seat seat0. Jan 14 00:06:40.109597 systemd[1]: Started systemd-logind.service - User Login Management. Jan 14 00:06:40.119544 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Jan 14 00:06:40.119616 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 14 00:06:40.119654 kernel: [drm] features: -context_init Jan 14 00:06:40.185083 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:06:40.204431 coreos-metadata[1630]: Jan 14 00:06:40.204 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jan 14 00:06:40.209926 coreos-metadata[1630]: Jan 14 00:06:40.209 INFO Fetch successful Jan 14 00:06:40.212626 unknown[1630]: wrote ssh authorized keys file for user: core Jan 14 00:06:40.238664 kernel: [drm] number of scanouts: 1 Jan 14 00:06:40.238769 kernel: [drm] number of cap sets: 0 Jan 14 00:06:40.256139 systemd-logind[1545]: Watching system buttons on /dev/input/event0 (Power Button) Jan 14 00:06:40.256913 update-ssh-keys[1641]: Updated "/home/core/.ssh/authorized_keys" Jan 14 00:06:40.258547 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 14 00:06:40.258949 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 14 00:06:40.269606 systemd[1]: Finished sshkeys.service. Jan 14 00:06:40.275167 systemd-logind[1545]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 14 00:06:40.283593 kernel: Console: switching to colour frame buffer device 160x50 Jan 14 00:06:40.296535 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 14 00:06:40.313108 containerd[1590]: time="2026-01-14T00:06:40Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 14 00:06:40.315430 containerd[1590]: time="2026-01-14T00:06:40.315314360Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 14 00:06:40.319282 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:06:40.335029 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 14 00:06:40.335866 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 14 00:06:40.337750 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 14 00:06:40.348004 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Jan 14 00:06:40.375641 locksmithd[1607]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 14 00:06:40.401940 containerd[1590]: time="2026-01-14T00:06:40.401229760Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="12.28µs" Jan 14 00:06:40.401940 containerd[1590]: time="2026-01-14T00:06:40.401297920Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 14 00:06:40.401940 containerd[1590]: time="2026-01-14T00:06:40.401345920Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 14 00:06:40.401940 containerd[1590]: time="2026-01-14T00:06:40.401357880Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 14 00:06:40.401940 containerd[1590]: time="2026-01-14T00:06:40.401501800Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 14 00:06:40.408657 containerd[1590]: time="2026-01-14T00:06:40.408560880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 00:06:40.408920 containerd[1590]: time="2026-01-14T00:06:40.408740480Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 14 00:06:40.408920 containerd[1590]: time="2026-01-14T00:06:40.408762680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 00:06:40.409232 containerd[1590]: time="2026-01-14T00:06:40.409192400Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 14 00:06:40.409313 containerd[1590]: time="2026-01-14T00:06:40.409240920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 00:06:40.409313 containerd[1590]: time="2026-01-14T00:06:40.409269600Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 14 00:06:40.409313 containerd[1590]: time="2026-01-14T00:06:40.409281040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 00:06:40.417309 containerd[1590]: time="2026-01-14T00:06:40.417200400Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 14 00:06:40.417309 containerd[1590]: time="2026-01-14T00:06:40.417266920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 14 00:06:40.417515 containerd[1590]: time="2026-01-14T00:06:40.417483920Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 14 00:06:40.423617 containerd[1590]: time="2026-01-14T00:06:40.423571040Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 14 00:06:40.423703 containerd[1590]: time="2026-01-14T00:06:40.423640120Z" level=info msg="skip loading plugin" error="lstat 
/var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 14 00:06:40.423703 containerd[1590]: time="2026-01-14T00:06:40.423652600Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 14 00:06:40.423703 containerd[1590]: time="2026-01-14T00:06:40.423692120Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 14 00:06:40.423996 containerd[1590]: time="2026-01-14T00:06:40.423971800Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 14 00:06:40.424113 containerd[1590]: time="2026-01-14T00:06:40.424063520Z" level=info msg="metadata content store policy set" policy=shared Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435597240Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435675240Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435774320Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435787160Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435800680Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435818280Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435830240Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435841920Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435853640Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435866400Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435878040Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435888200Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435900360Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 14 00:06:40.436548 containerd[1590]: time="2026-01-14T00:06:40.435913240Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436056440Z" 
level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436075920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436091360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436101760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436111720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436123960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436141160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436152080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436163760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436179440Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436190800Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436216160Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436275360Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436299000Z" level=info msg="Start snapshots syncer" Jan 14 00:06:40.436848 containerd[1590]: time="2026-01-14T00:06:40.436335200Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 14 00:06:40.437759 containerd[1590]: time="2026-01-14T00:06:40.437642280Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 14 00:06:40.438291 containerd[1590]: time="2026-01-14T00:06:40.438133880Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439696960Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439864760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439887920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439898800Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439909160Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439921600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439934040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439945000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439955440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 14 
00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.439967640Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.440007000Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.440021640Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 14 00:06:40.441380 containerd[1590]: time="2026-01-14T00:06:40.440030400Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.440039800Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.440047920Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.440066040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.440076680Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.440154320Z" level=info msg="runtime interface created" Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.440159400Z" level=info msg="created NRI interface" Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.440167600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.440182520Z" level=info msg="Connect containerd service" Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.440203680Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 14 00:06:40.441668 containerd[1590]: time="2026-01-14T00:06:40.441004000Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 00:06:40.469247 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
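The cri plugin logs its effective configuration above as a single escaped JSON blob, and the "failed to load cni during init" error that follows is a direct consequence of it: the plugin looks for network configs under the confDir given there, and /etc/cni/net.d is still empty at this point in boot. One way to inspect such a blob is to run it through json.loads; the excerpt below is the "cni" object copied from the config= line above.

# Inspect the cri plugin's CNI settings from the config= blob logged above.
# The JSON excerpt is the "cni" object copied verbatim from that line.
import json

cni_excerpt = (
    '{"binDir":"","binDirs":["/opt/cni/bin"],"confDir":"/etc/cni/net.d",'
    '"maxConfNum":1,"setupSerially":false,"confTemplate":"","ipPref":"",'
    '"useInternalLoopback":false}'
)

cni = json.loads(cni_excerpt)
print(cni["confDir"])   # /etc/cni/net.d -- empty at this stage, hence the
                        # "no network config found" error during init
print(cni["binDirs"])   # ['/opt/cni/bin']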
Jan 14 00:06:40.595963 containerd[1590]: time="2026-01-14T00:06:40.595585800Z" level=info msg="Start subscribing containerd event" Jan 14 00:06:40.595963 containerd[1590]: time="2026-01-14T00:06:40.595657120Z" level=info msg="Start recovering state" Jan 14 00:06:40.595963 containerd[1590]: time="2026-01-14T00:06:40.595741160Z" level=info msg="Start event monitor" Jan 14 00:06:40.595963 containerd[1590]: time="2026-01-14T00:06:40.595754320Z" level=info msg="Start cni network conf syncer for default" Jan 14 00:06:40.595963 containerd[1590]: time="2026-01-14T00:06:40.595762560Z" level=info msg="Start streaming server" Jan 14 00:06:40.595963 containerd[1590]: time="2026-01-14T00:06:40.595774240Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 14 00:06:40.595963 containerd[1590]: time="2026-01-14T00:06:40.595781840Z" level=info msg="runtime interface starting up..." Jan 14 00:06:40.595963 containerd[1590]: time="2026-01-14T00:06:40.595795160Z" level=info msg="starting plugins..." Jan 14 00:06:40.595963 containerd[1590]: time="2026-01-14T00:06:40.595810320Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 14 00:06:40.596529 containerd[1590]: time="2026-01-14T00:06:40.596387760Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 14 00:06:40.596529 containerd[1590]: time="2026-01-14T00:06:40.596460600Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 14 00:06:40.596675 containerd[1590]: time="2026-01-14T00:06:40.596648560Z" level=info msg="containerd successfully booted in 0.284199s" Jan 14 00:06:40.596826 systemd[1]: Started containerd.service - containerd container runtime. Jan 14 00:06:40.728950 tar[1553]: linux-arm64/README.md Jan 14 00:06:40.749610 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 14 00:06:41.018708 systemd-networkd[1476]: eth1: Gained IPv6LL Jan 14 00:06:41.019663 systemd-timesyncd[1449]: Network configuration changed, trying to establish connection. Jan 14 00:06:41.025626 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 14 00:06:41.026855 systemd[1]: Reached target network-online.target - Network is Online. Jan 14 00:06:41.031036 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:06:41.033957 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 14 00:06:41.085494 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 14 00:06:41.466684 systemd-networkd[1476]: eth0: Gained IPv6LL Jan 14 00:06:41.468207 systemd-timesyncd[1449]: Network configuration changed, trying to establish connection. Jan 14 00:06:41.863749 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:06:41.874151 (kubelet)[1691]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:06:41.896614 sshd_keygen[1562]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 14 00:06:41.922675 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 14 00:06:41.926775 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 14 00:06:41.947833 systemd[1]: issuegen.service: Deactivated successfully. Jan 14 00:06:41.948364 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 14 00:06:41.953038 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
Jan 14 00:06:41.976751 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 14 00:06:41.981153 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 14 00:06:41.985279 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 14 00:06:41.988921 systemd[1]: Reached target getty.target - Login Prompts. Jan 14 00:06:41.989568 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 14 00:06:41.991999 systemd[1]: Startup finished in 1.794s (kernel) + 4.959s (initrd) + 4.938s (userspace) = 11.692s. Jan 14 00:06:42.421081 kubelet[1691]: E0114 00:06:42.421004 1691 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:06:42.425894 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:06:42.426400 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:06:42.427011 systemd[1]: kubelet.service: Consumed 900ms CPU time, 258.2M memory peak. Jan 14 00:06:52.586155 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 14 00:06:52.588958 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:06:52.755389 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:06:52.768008 (kubelet)[1727]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:06:52.810119 kubelet[1727]: E0114 00:06:52.810055 1727 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:06:52.814253 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:06:52.814438 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:06:52.815120 systemd[1]: kubelet.service: Consumed 170ms CPU time, 105.3M memory peak. Jan 14 00:07:02.836444 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 14 00:07:02.841367 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:07:03.005761 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:07:03.025389 (kubelet)[1741]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:07:03.070920 kubelet[1741]: E0114 00:07:03.070853 1741 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:07:03.073946 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:07:03.074144 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:07:03.074770 systemd[1]: kubelet.service: Consumed 170ms CPU time, 106.5M memory peak. 
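A small cross-check of the "Startup finished" entry above: the kernel, initrd and userspace components are printed rounded to milliseconds, so their sum can land about a millisecond away from the total systemd computes from the unrounded timestamps.

# Cross-check of the "Startup finished" figures logged above.
kernel, initrd, userspace = 1.794, 4.959, 4.938   # seconds, as printed
total_printed = 11.692

total_from_parts = kernel + initrd + userspace
print(f"sum of parts:  {total_from_parts:.3f} s")   # 11.691 s
print(f"printed total: {total_printed:.3f} s")
print(f"difference:    {abs(total_printed - total_from_parts) * 1000:.1f} ms")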
Jan 14 00:07:11.649681 systemd-timesyncd[1449]: Contacted time server 152.53.191.142:123 (2.flatcar.pool.ntp.org). Jan 14 00:07:11.649821 systemd-timesyncd[1449]: Initial clock synchronization to Wed 2026-01-14 00:07:11.916237 UTC. Jan 14 00:07:13.091168 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 14 00:07:13.093666 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:07:13.258984 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:07:13.275483 (kubelet)[1756]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:07:13.321832 kubelet[1756]: E0114 00:07:13.321782 1756 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:07:13.326363 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:07:13.326511 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:07:13.327166 systemd[1]: kubelet.service: Consumed 166ms CPU time, 106.5M memory peak. Jan 14 00:07:13.558124 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 14 00:07:13.559950 systemd[1]: Started sshd@0-46.224.77.139:22-4.153.228.146:49012.service - OpenSSH per-connection server daemon (4.153.228.146:49012). Jan 14 00:07:14.177661 sshd[1764]: Accepted publickey for core from 4.153.228.146 port 49012 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:07:14.180595 sshd-session[1764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:07:14.192079 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 14 00:07:14.193059 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 14 00:07:14.201132 systemd-logind[1545]: New session 1 of user core. Jan 14 00:07:14.218591 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 14 00:07:14.222783 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 14 00:07:14.240575 (systemd)[1770]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:07:14.245956 systemd-logind[1545]: New session 2 of user core. Jan 14 00:07:14.377456 systemd[1770]: Queued start job for default target default.target. Jan 14 00:07:14.391204 systemd[1770]: Created slice app.slice - User Application Slice. Jan 14 00:07:14.391266 systemd[1770]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 14 00:07:14.391292 systemd[1770]: Reached target paths.target - Paths. Jan 14 00:07:14.391373 systemd[1770]: Reached target timers.target - Timers. Jan 14 00:07:14.393891 systemd[1770]: Starting dbus.socket - D-Bus User Message Bus Socket... Jan 14 00:07:14.397731 systemd[1770]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 14 00:07:14.426492 systemd[1770]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 14 00:07:14.426944 systemd[1770]: Reached target sockets.target - Sockets. Jan 14 00:07:14.427767 systemd[1770]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. 
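The kubelet entries above show the same failure repeating: the unit starts, exits because /var/lib/kubelet/config.yaml does not exist yet, and systemd schedules another attempt. The spacing of those attempts can be read straight off the journal timestamps of the "Scheduled restart job" messages; it comes out at roughly 10.25 s, consistent with a restart delay of about 10 s in the unit file (an assumption here, since the unit file itself is not part of this log).

# Spacing of the kubelet restart attempts, computed from the journal
# timestamps of the "Scheduled restart job" messages above.
from datetime import datetime

FMT = "%H:%M:%S.%f"
restarts = ["00:06:52.586155", "00:07:02.836444", "00:07:13.091168"]
times = [datetime.strptime(t, FMT) for t in restarts]

for earlier, later in zip(times, times[1:]):
    print(f"{(later - earlier).total_seconds():.3f} s between attempts")
# ~10.25 s each -- consistent with a ~10 s RestartSec (assumed, not shown here)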
Jan 14 00:07:14.427865 systemd[1770]: Reached target basic.target - Basic System. Jan 14 00:07:14.427943 systemd[1770]: Reached target default.target - Main User Target. Jan 14 00:07:14.427990 systemd[1770]: Startup finished in 173ms. Jan 14 00:07:14.428094 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 14 00:07:14.432866 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 14 00:07:14.756890 systemd[1]: Started sshd@1-46.224.77.139:22-4.153.228.146:59730.service - OpenSSH per-connection server daemon (4.153.228.146:59730). Jan 14 00:07:15.314619 sshd[1784]: Accepted publickey for core from 4.153.228.146 port 59730 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:07:15.316134 sshd-session[1784]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:07:15.322660 systemd-logind[1545]: New session 3 of user core. Jan 14 00:07:15.326835 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 14 00:07:15.612347 sshd[1788]: Connection closed by 4.153.228.146 port 59730 Jan 14 00:07:15.613012 sshd-session[1784]: pam_unix(sshd:session): session closed for user core Jan 14 00:07:15.621234 systemd[1]: sshd@1-46.224.77.139:22-4.153.228.146:59730.service: Deactivated successfully. Jan 14 00:07:15.625915 systemd[1]: session-3.scope: Deactivated successfully. Jan 14 00:07:15.627281 systemd-logind[1545]: Session 3 logged out. Waiting for processes to exit. Jan 14 00:07:15.629062 systemd-logind[1545]: Removed session 3. Jan 14 00:07:15.730407 systemd[1]: Started sshd@2-46.224.77.139:22-4.153.228.146:59738.service - OpenSSH per-connection server daemon (4.153.228.146:59738). Jan 14 00:07:16.292101 sshd[1794]: Accepted publickey for core from 4.153.228.146 port 59738 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:07:16.294079 sshd-session[1794]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:07:16.301164 systemd-logind[1545]: New session 4 of user core. Jan 14 00:07:16.304748 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 14 00:07:16.588713 sshd[1798]: Connection closed by 4.153.228.146 port 59738 Jan 14 00:07:16.589681 sshd-session[1794]: pam_unix(sshd:session): session closed for user core Jan 14 00:07:16.598155 systemd[1]: sshd@2-46.224.77.139:22-4.153.228.146:59738.service: Deactivated successfully. Jan 14 00:07:16.601458 systemd[1]: session-4.scope: Deactivated successfully. Jan 14 00:07:16.606690 systemd-logind[1545]: Session 4 logged out. Waiting for processes to exit. Jan 14 00:07:16.608594 systemd-logind[1545]: Removed session 4. Jan 14 00:07:16.704731 systemd[1]: Started sshd@3-46.224.77.139:22-4.153.228.146:59742.service - OpenSSH per-connection server daemon (4.153.228.146:59742). Jan 14 00:07:17.269479 sshd[1804]: Accepted publickey for core from 4.153.228.146 port 59742 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:07:17.271711 sshd-session[1804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:07:17.278008 systemd-logind[1545]: New session 5 of user core. Jan 14 00:07:17.287902 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 14 00:07:17.580325 sshd[1808]: Connection closed by 4.153.228.146 port 59742 Jan 14 00:07:17.579314 sshd-session[1804]: pam_unix(sshd:session): session closed for user core Jan 14 00:07:17.586519 systemd[1]: sshd@3-46.224.77.139:22-4.153.228.146:59742.service: Deactivated successfully. 
Jan 14 00:07:17.589806 systemd[1]: session-5.scope: Deactivated successfully. Jan 14 00:07:17.591188 systemd-logind[1545]: Session 5 logged out. Waiting for processes to exit. Jan 14 00:07:17.593708 systemd-logind[1545]: Removed session 5. Jan 14 00:07:17.686783 systemd[1]: Started sshd@4-46.224.77.139:22-4.153.228.146:59752.service - OpenSSH per-connection server daemon (4.153.228.146:59752). Jan 14 00:07:18.226594 sshd[1814]: Accepted publickey for core from 4.153.228.146 port 59752 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:07:18.228227 sshd-session[1814]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:07:18.234303 systemd-logind[1545]: New session 6 of user core. Jan 14 00:07:18.248958 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 14 00:07:18.436930 sudo[1819]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 14 00:07:18.437931 sudo[1819]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 00:07:18.457420 sudo[1819]: pam_unix(sudo:session): session closed for user root Jan 14 00:07:18.554898 sshd[1818]: Connection closed by 4.153.228.146 port 59752 Jan 14 00:07:18.556073 sshd-session[1814]: pam_unix(sshd:session): session closed for user core Jan 14 00:07:18.563163 systemd[1]: sshd@4-46.224.77.139:22-4.153.228.146:59752.service: Deactivated successfully. Jan 14 00:07:18.566732 systemd[1]: session-6.scope: Deactivated successfully. Jan 14 00:07:18.569390 systemd-logind[1545]: Session 6 logged out. Waiting for processes to exit. Jan 14 00:07:18.570937 systemd-logind[1545]: Removed session 6. Jan 14 00:07:18.669665 systemd[1]: Started sshd@5-46.224.77.139:22-4.153.228.146:59764.service - OpenSSH per-connection server daemon (4.153.228.146:59764). Jan 14 00:07:19.229374 sshd[1826]: Accepted publickey for core from 4.153.228.146 port 59764 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:07:19.231235 sshd-session[1826]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:07:19.237710 systemd-logind[1545]: New session 7 of user core. Jan 14 00:07:19.247932 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 14 00:07:19.433264 sudo[1832]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 14 00:07:19.433985 sudo[1832]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 00:07:19.437237 sudo[1832]: pam_unix(sudo:session): session closed for user root Jan 14 00:07:19.446501 sudo[1831]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 14 00:07:19.447083 sudo[1831]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 00:07:19.457980 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
Jan 14 00:07:19.496000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 14 00:07:19.499375 kernel: kauditd_printk_skb: 180 callbacks suppressed Jan 14 00:07:19.499504 kernel: audit: type=1305 audit(1768349239.496:226): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 14 00:07:19.499564 kernel: audit: type=1300 audit(1768349239.496:226): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffeb22c640 a2=420 a3=0 items=0 ppid=1837 pid=1856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:19.496000 audit[1856]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffeb22c640 a2=420 a3=0 items=0 ppid=1837 pid=1856 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:19.496000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 00:07:19.501886 augenrules[1856]: No rules Jan 14 00:07:19.502552 kernel: audit: type=1327 audit(1768349239.496:226): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 14 00:07:19.503571 systemd[1]: audit-rules.service: Deactivated successfully. Jan 14 00:07:19.503824 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 14 00:07:19.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:19.506158 sudo[1831]: pam_unix(sudo:session): session closed for user root Jan 14 00:07:19.509363 kernel: audit: type=1130 audit(1768349239.502:227): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:19.509434 kernel: audit: type=1131 audit(1768349239.502:228): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:19.502000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:19.504000 audit[1831]: USER_END pid=1831 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 00:07:19.511556 kernel: audit: type=1106 audit(1768349239.504:229): pid=1831 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 00:07:19.504000 audit[1831]: CRED_DISP pid=1831 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 14 00:07:19.513101 kernel: audit: type=1104 audit(1768349239.504:230): pid=1831 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 00:07:19.603505 sshd[1830]: Connection closed by 4.153.228.146 port 59764 Jan 14 00:07:19.603381 sshd-session[1826]: pam_unix(sshd:session): session closed for user core Jan 14 00:07:19.607000 audit[1826]: USER_END pid=1826 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:07:19.607000 audit[1826]: CRED_DISP pid=1826 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:07:19.614087 kernel: audit: type=1106 audit(1768349239.607:231): pid=1826 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:07:19.614157 kernel: audit: type=1104 audit(1768349239.607:232): pid=1826 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:07:19.614605 systemd[1]: sshd@5-46.224.77.139:22-4.153.228.146:59764.service: Deactivated successfully. Jan 14 00:07:19.618061 kernel: audit: type=1131 audit(1768349239.614:233): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-46.224.77.139:22-4.153.228.146:59764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:19.614000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-46.224.77.139:22-4.153.228.146:59764 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:19.618764 systemd[1]: session-7.scope: Deactivated successfully. Jan 14 00:07:19.620872 systemd-logind[1545]: Session 7 logged out. Waiting for processes to exit. Jan 14 00:07:19.622905 systemd-logind[1545]: Removed session 7. Jan 14 00:07:19.720936 systemd[1]: Started sshd@6-46.224.77.139:22-4.153.228.146:59768.service - OpenSSH per-connection server daemon (4.153.228.146:59768). Jan 14 00:07:19.720000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-46.224.77.139:22-4.153.228.146:59768 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:07:20.281000 audit[1865]: USER_ACCT pid=1865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:07:20.282402 sshd[1865]: Accepted publickey for core from 4.153.228.146 port 59768 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:07:20.283000 audit[1865]: CRED_ACQ pid=1865 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:07:20.283000 audit[1865]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffeca3d60 a2=3 a3=0 items=0 ppid=1 pid=1865 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:20.283000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:07:20.284365 sshd-session[1865]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:07:20.291152 systemd-logind[1545]: New session 8 of user core. Jan 14 00:07:20.296940 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 14 00:07:20.300000 audit[1865]: USER_START pid=1865 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:07:20.303000 audit[1869]: CRED_ACQ pid=1869 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:07:20.493000 audit[1870]: USER_ACCT pid=1870 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 00:07:20.495305 sudo[1870]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 14 00:07:20.494000 audit[1870]: CRED_REFR pid=1870 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 00:07:20.496192 sudo[1870]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 14 00:07:20.494000 audit[1870]: USER_START pid=1870 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 00:07:20.812880 systemd[1]: Starting docker.service - Docker Application Container Engine... 
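The audit records above, and the NETFILTER_CFG records that follow as dockerd sets up its chains, log the executed command line as a PROCTITLE field: the process command line hex-encoded, with NUL bytes separating the arguments. A short decoder, fed with the auditctl proctitle from the audit-rules entries earlier in the log:

# Decode an audit PROCTITLE field: the command line is hex-encoded,
# with NUL bytes separating the individual arguments.
def decode_proctitle(hex_string):
    return [arg.decode() for arg in bytes.fromhex(hex_string).split(b"\x00")]

# proctitle value copied from the audit-rules entries earlier in the log
sample = "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
print(decode_proctitle(sample))
# ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']

The iptables and ip6tables proctitles below decode the same way; the first one, for example, comes out as /usr/bin/iptables --wait -t nat -N DOCKER.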
Jan 14 00:07:20.835607 (dockerd)[1888]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 14 00:07:21.079285 dockerd[1888]: time="2026-01-14T00:07:21.079166295Z" level=info msg="Starting up" Jan 14 00:07:21.082200 dockerd[1888]: time="2026-01-14T00:07:21.081653134Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 14 00:07:21.095479 dockerd[1888]: time="2026-01-14T00:07:21.095363433Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 14 00:07:21.135126 dockerd[1888]: time="2026-01-14T00:07:21.135085390Z" level=info msg="Loading containers: start." Jan 14 00:07:21.144584 kernel: Initializing XFRM netlink socket Jan 14 00:07:21.193000 audit[1938]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1938 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.193000 audit[1938]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffffac94250 a2=0 a3=0 items=0 ppid=1888 pid=1938 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.193000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 14 00:07:21.195000 audit[1940]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1940 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.195000 audit[1940]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffe9c782a0 a2=0 a3=0 items=0 ppid=1888 pid=1940 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.195000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 14 00:07:21.197000 audit[1942]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1942 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.197000 audit[1942]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff61ebf70 a2=0 a3=0 items=0 ppid=1888 pid=1942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.197000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 14 00:07:21.199000 audit[1944]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1944 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.199000 audit[1944]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc235ba60 a2=0 a3=0 items=0 ppid=1888 pid=1944 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.199000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 14 00:07:21.200000 audit[1946]: NETFILTER_CFG table=filter:6 family=2 entries=1 
op=nft_register_chain pid=1946 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.200000 audit[1946]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffcc3db8a0 a2=0 a3=0 items=0 ppid=1888 pid=1946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.200000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 14 00:07:21.202000 audit[1948]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=1948 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.202000 audit[1948]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffeabc9c00 a2=0 a3=0 items=0 ppid=1888 pid=1948 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.202000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 00:07:21.205000 audit[1950]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1950 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.205000 audit[1950]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffd6ba8180 a2=0 a3=0 items=0 ppid=1888 pid=1950 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.205000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 00:07:21.207000 audit[1952]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=1952 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.207000 audit[1952]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffef037e90 a2=0 a3=0 items=0 ppid=1888 pid=1952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.207000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 14 00:07:21.233000 audit[1955]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=1955 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.233000 audit[1955]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffdb147d70 a2=0 a3=0 items=0 ppid=1888 pid=1955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.233000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 14 00:07:21.235000 audit[1957]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=1957 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.235000 audit[1957]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffffe56b8e0 a2=0 a3=0 items=0 ppid=1888 pid=1957 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.235000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 14 00:07:21.238000 audit[1959]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1959 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.238000 audit[1959]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffcccfe520 a2=0 a3=0 items=0 ppid=1888 pid=1959 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.238000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 14 00:07:21.241000 audit[1961]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=1961 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.241000 audit[1961]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffd2b1fb30 a2=0 a3=0 items=0 ppid=1888 pid=1961 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.241000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 00:07:21.246000 audit[1963]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=1963 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.246000 audit[1963]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffdd781ec0 a2=0 a3=0 items=0 ppid=1888 pid=1963 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.246000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 14 00:07:21.292000 audit[1993]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=1993 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.292000 audit[1993]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffd75b9f10 a2=0 a3=0 items=0 ppid=1888 pid=1993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.292000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 14 00:07:21.295000 audit[1995]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=1995 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.295000 audit[1995]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffe0223ac0 a2=0 a3=0 items=0 ppid=1888 
pid=1995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.295000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 14 00:07:21.297000 audit[1997]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=1997 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.297000 audit[1997]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe04a6510 a2=0 a3=0 items=0 ppid=1888 pid=1997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.297000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 14 00:07:21.299000 audit[1999]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=1999 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.299000 audit[1999]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc1262ce0 a2=0 a3=0 items=0 ppid=1888 pid=1999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.299000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 14 00:07:21.301000 audit[2001]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2001 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.301000 audit[2001]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd62eb570 a2=0 a3=0 items=0 ppid=1888 pid=2001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.301000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 14 00:07:21.303000 audit[2003]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2003 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.303000 audit[2003]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc9664d00 a2=0 a3=0 items=0 ppid=1888 pid=2003 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.303000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 00:07:21.305000 audit[2005]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2005 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.305000 audit[2005]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffdb4d2060 a2=0 a3=0 items=0 ppid=1888 pid=2005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 
00:07:21.305000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 00:07:21.307000 audit[2007]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2007 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.307000 audit[2007]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffd1e46810 a2=0 a3=0 items=0 ppid=1888 pid=2007 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.307000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 14 00:07:21.310000 audit[2009]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2009 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.310000 audit[2009]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffff43ba30 a2=0 a3=0 items=0 ppid=1888 pid=2009 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.310000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 14 00:07:21.312000 audit[2011]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2011 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.312000 audit[2011]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffca837320 a2=0 a3=0 items=0 ppid=1888 pid=2011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.312000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 14 00:07:21.315000 audit[2013]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2013 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.315000 audit[2013]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffd3905ac0 a2=0 a3=0 items=0 ppid=1888 pid=2013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.315000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 14 00:07:21.316000 audit[2015]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2015 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.316000 audit[2015]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffe996b780 a2=0 a3=0 items=0 ppid=1888 pid=2015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.316000 audit: 
PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 14 00:07:21.318000 audit[2017]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2017 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.318000 audit[2017]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=fffff95f3b80 a2=0 a3=0 items=0 ppid=1888 pid=2017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.318000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 14 00:07:21.325000 audit[2022]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2022 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.325000 audit[2022]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc47d31e0 a2=0 a3=0 items=0 ppid=1888 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.325000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 14 00:07:21.328000 audit[2024]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2024 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.328000 audit[2024]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffc0422220 a2=0 a3=0 items=0 ppid=1888 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.328000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 14 00:07:21.331000 audit[2026]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2026 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.331000 audit[2026]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffe11943d0 a2=0 a3=0 items=0 ppid=1888 pid=2026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.331000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 14 00:07:21.333000 audit[2028]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2028 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.333000 audit[2028]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd67d9610 a2=0 a3=0 items=0 ppid=1888 pid=2028 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.333000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 14 00:07:21.336000 audit[2030]: NETFILTER_CFG table=filter:32 family=10 
entries=1 op=nft_register_rule pid=2030 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.336000 audit[2030]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffd862c040 a2=0 a3=0 items=0 ppid=1888 pid=2030 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.336000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 14 00:07:21.338000 audit[2032]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2032 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:21.338000 audit[2032]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffc914a7a0 a2=0 a3=0 items=0 ppid=1888 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.338000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 14 00:07:21.363000 audit[2036]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2036 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.363000 audit[2036]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=fffff1738fd0 a2=0 a3=0 items=0 ppid=1888 pid=2036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.363000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 14 00:07:21.367000 audit[2038]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2038 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.367000 audit[2038]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffc277bef0 a2=0 a3=0 items=0 ppid=1888 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.367000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 14 00:07:21.377000 audit[2046]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2046 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.377000 audit[2046]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffe9d391e0 a2=0 a3=0 items=0 ppid=1888 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.377000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 14 00:07:21.388000 audit[2052]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2052 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 
00:07:21.388000 audit[2052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffecea2bd0 a2=0 a3=0 items=0 ppid=1888 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.388000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 14 00:07:21.392000 audit[2054]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.392000 audit[2054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=fffffd8ca980 a2=0 a3=0 items=0 ppid=1888 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.392000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 14 00:07:21.394000 audit[2056]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.394000 audit[2056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=fffff9bd39e0 a2=0 a3=0 items=0 ppid=1888 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.394000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 14 00:07:21.396000 audit[2058]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2058 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.396000 audit[2058]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffc2d2ac10 a2=0 a3=0 items=0 ppid=1888 pid=2058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.396000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 14 00:07:21.398000 audit[2060]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2060 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:21.398000 audit[2060]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe24490c0 a2=0 a3=0 items=0 ppid=1888 pid=2060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:21.398000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 14 00:07:21.400622 
systemd-networkd[1476]: docker0: Link UP Jan 14 00:07:21.404986 dockerd[1888]: time="2026-01-14T00:07:21.404930543Z" level=info msg="Loading containers: done." Jan 14 00:07:21.421624 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck86341647-merged.mount: Deactivated successfully. Jan 14 00:07:21.432427 dockerd[1888]: time="2026-01-14T00:07:21.432343021Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 14 00:07:21.432696 dockerd[1888]: time="2026-01-14T00:07:21.432514443Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 14 00:07:21.432842 dockerd[1888]: time="2026-01-14T00:07:21.432787189Z" level=info msg="Initializing buildkit" Jan 14 00:07:21.458727 dockerd[1888]: time="2026-01-14T00:07:21.458678734Z" level=info msg="Completed buildkit initialization" Jan 14 00:07:21.466937 dockerd[1888]: time="2026-01-14T00:07:21.466874629Z" level=info msg="Daemon has completed initialization" Jan 14 00:07:21.467392 dockerd[1888]: time="2026-01-14T00:07:21.467174485Z" level=info msg="API listen on /run/docker.sock" Jan 14 00:07:21.472078 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 14 00:07:21.471000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:22.568060 containerd[1590]: time="2026-01-14T00:07:22.568001924Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Jan 14 00:07:23.141178 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount120675217.mount: Deactivated successfully. Jan 14 00:07:23.336124 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 14 00:07:23.339201 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:07:23.498033 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:07:23.497000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:23.516378 (kubelet)[2162]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:07:23.558011 kubelet[2162]: E0114 00:07:23.557926 2162 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:07:23.561359 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:07:23.561582 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:07:23.563000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 00:07:23.563667 systemd[1]: kubelet.service: Consumed 163ms CPU time, 107M memory peak. 
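The PROCTITLE fields in the audit records above carry the full command lines of the iptables/ip6tables invocations Docker issued, hex-encoded with NUL-separated arguments. A minimal Python sketch (an ad-hoc helper, not part of any tool shown in this log) decodes one of them; the sample value is copied from the first FORWARD rule insertion above.

    def decode_proctitle(hex_value: str) -> str:
        """Decode an audit PROCTITLE field (hex of NUL-separated argv) into a command line."""
        raw = bytes.fromhex(hex_value)
        return " ".join(p.decode("utf-8", "replace") for p in raw.split(b"\x00") if p)

    sample = ("2F7573722F62696E2F69707461626C6573002D2D77616974"
              "002D4900464F5257415244002D6A00444F434B45522D464F5257415244")
    print(decode_proctitle(sample))
    # -> /usr/bin/iptables --wait -I FORWARD -j DOCKER-FORWARD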
Jan 14 00:07:24.101720 containerd[1590]: time="2026-01-14T00:07:24.101609404Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:24.103498 containerd[1590]: time="2026-01-14T00:07:24.103435907Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=25791415" Jan 14 00:07:24.104473 containerd[1590]: time="2026-01-14T00:07:24.104046137Z" level=info msg="ImageCreate event name:\"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:24.107010 containerd[1590]: time="2026-01-14T00:07:24.106959932Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:24.108828 containerd[1590]: time="2026-01-14T00:07:24.108714395Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"27383880\" in 1.540651117s" Jan 14 00:07:24.108828 containerd[1590]: time="2026-01-14T00:07:24.108762032Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\"" Jan 14 00:07:24.110581 containerd[1590]: time="2026-01-14T00:07:24.110466280Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Jan 14 00:07:25.304290 update_engine[1548]: I20260114 00:07:25.303629 1548 update_attempter.cc:509] Updating boot flags... 
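containerd reports each completed pull with the image's content size and the elapsed wall time, as in the kube-apiserver line above. A rough throughput estimate can be read off those two numbers; the snippet below parses an abridged copy of that message with an ad-hoc regex. The reported size is containerd's content size rather than the bytes actually transferred, so treat the result as a rough figure only.

    import re

    msg = 'Pulled image "registry.k8s.io/kube-apiserver:v1.33.7" ... size "27383880" in 1.540651117s'
    m = re.search(r'size "(\d+)" in ([\d.]+)(ms|s)', msg)
    size_bytes = int(m.group(1))
    elapsed_s = float(m.group(2)) / (1000 if m.group(3) == "ms" else 1)
    print(f"{size_bytes / elapsed_s / (1024 * 1024):.1f} MiB/s")  # roughly 17 MiB/s for the pull above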
Jan 14 00:07:25.601902 containerd[1590]: time="2026-01-14T00:07:25.601851058Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:25.606322 containerd[1590]: time="2026-01-14T00:07:25.606281014Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=23544927" Jan 14 00:07:25.607383 containerd[1590]: time="2026-01-14T00:07:25.607314651Z" level=info msg="ImageCreate event name:\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:25.613338 containerd[1590]: time="2026-01-14T00:07:25.613263439Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:25.614291 containerd[1590]: time="2026-01-14T00:07:25.614055426Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"25137562\" in 1.503506038s" Jan 14 00:07:25.614291 containerd[1590]: time="2026-01-14T00:07:25.614088338Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\"" Jan 14 00:07:25.615038 containerd[1590]: time="2026-01-14T00:07:25.614632195Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Jan 14 00:07:27.043554 containerd[1590]: time="2026-01-14T00:07:27.043256605Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:27.045180 containerd[1590]: time="2026-01-14T00:07:27.045129312Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=18289931" Jan 14 00:07:27.046561 containerd[1590]: time="2026-01-14T00:07:27.046163664Z" level=info msg="ImageCreate event name:\"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:27.049234 containerd[1590]: time="2026-01-14T00:07:27.049207937Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:27.050429 containerd[1590]: time="2026-01-14T00:07:27.050388182Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"19882566\" in 1.435722968s" Jan 14 00:07:27.050498 containerd[1590]: time="2026-01-14T00:07:27.050429769Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\"" Jan 14 00:07:27.051189 
containerd[1590]: time="2026-01-14T00:07:27.051138502Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\"" Jan 14 00:07:28.181620 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3989703229.mount: Deactivated successfully. Jan 14 00:07:28.528405 containerd[1590]: time="2026-01-14T00:07:28.528224438Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:28.529996 containerd[1590]: time="2026-01-14T00:07:28.529929644Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=28254952" Jan 14 00:07:28.531628 containerd[1590]: time="2026-01-14T00:07:28.531245644Z" level=info msg="ImageCreate event name:\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:28.534259 containerd[1590]: time="2026-01-14T00:07:28.534219865Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:28.535069 containerd[1590]: time="2026-01-14T00:07:28.535020082Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"28257692\" in 1.483717652s" Jan 14 00:07:28.535069 containerd[1590]: time="2026-01-14T00:07:28.535066785Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\"" Jan 14 00:07:28.535727 containerd[1590]: time="2026-01-14T00:07:28.535572086Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Jan 14 00:07:29.115012 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4079768873.mount: Deactivated successfully. 
Jan 14 00:07:30.214921 containerd[1590]: time="2026-01-14T00:07:30.213264273Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:30.214921 containerd[1590]: time="2026-01-14T00:07:30.214855775Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=18338344" Jan 14 00:07:30.215799 containerd[1590]: time="2026-01-14T00:07:30.215727233Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:30.219163 containerd[1590]: time="2026-01-14T00:07:30.219111882Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:30.221380 containerd[1590]: time="2026-01-14T00:07:30.221330027Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.685714983s" Jan 14 00:07:30.221380 containerd[1590]: time="2026-01-14T00:07:30.221367058Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Jan 14 00:07:30.222371 containerd[1590]: time="2026-01-14T00:07:30.222297133Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jan 14 00:07:30.781383 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3258659774.mount: Deactivated successfully. 
Jan 14 00:07:30.795702 containerd[1590]: time="2026-01-14T00:07:30.795644662Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 00:07:30.796889 containerd[1590]: time="2026-01-14T00:07:30.796816704Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 14 00:07:30.797410 containerd[1590]: time="2026-01-14T00:07:30.797376706Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 00:07:30.799240 containerd[1590]: time="2026-01-14T00:07:30.799215671Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 14 00:07:30.799827 containerd[1590]: time="2026-01-14T00:07:30.799804079Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 577.294429ms" Jan 14 00:07:30.799935 containerd[1590]: time="2026-01-14T00:07:30.799919626Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Jan 14 00:07:30.800695 containerd[1590]: time="2026-01-14T00:07:30.800671325Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Jan 14 00:07:31.391406 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2344532573.mount: Deactivated successfully. 
Jan 14 00:07:33.518563 containerd[1590]: time="2026-01-14T00:07:33.518222704Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:33.522239 containerd[1590]: time="2026-01-14T00:07:33.521546962Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=68134789" Jan 14 00:07:33.524790 containerd[1590]: time="2026-01-14T00:07:33.524733222Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:33.529288 containerd[1590]: time="2026-01-14T00:07:33.529248359Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:33.530216 containerd[1590]: time="2026-01-14T00:07:33.530034262Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.729333175s" Jan 14 00:07:33.530338 containerd[1590]: time="2026-01-14T00:07:33.530323245Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Jan 14 00:07:33.585970 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jan 14 00:07:33.589756 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:07:33.743875 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:07:33.746370 kernel: kauditd_printk_skb: 134 callbacks suppressed Jan 14 00:07:33.746462 kernel: audit: type=1130 audit(1768349253.742:286): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:33.742000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:33.757348 (kubelet)[2339]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 14 00:07:33.801599 kubelet[2339]: E0114 00:07:33.800923 2339 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 14 00:07:33.804354 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 14 00:07:33.804547 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 14 00:07:33.805300 systemd[1]: kubelet.service: Consumed 159ms CPU time, 105.2M memory peak. Jan 14 00:07:33.803000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Jan 14 00:07:33.808571 kernel: audit: type=1131 audit(1768349253.803:287): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 00:07:38.107420 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:07:38.107000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:38.108721 systemd[1]: kubelet.service: Consumed 159ms CPU time, 105.2M memory peak. Jan 14 00:07:38.108000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:38.112876 kernel: audit: type=1130 audit(1768349258.107:288): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:38.112945 kernel: audit: type=1131 audit(1768349258.108:289): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:38.113193 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:07:38.150149 systemd[1]: Reload requested from client PID 2368 ('systemctl') (unit session-8.scope)... Jan 14 00:07:38.150169 systemd[1]: Reloading... Jan 14 00:07:38.287746 zram_generator::config[2418]: No configuration found. Jan 14 00:07:38.468308 systemd[1]: Reloading finished in 317 ms. 
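The interleaved kernel lines above timestamp audit events as audit(EPOCH.millis:serial) rather than with the journal's wall-clock prefix. Converting the epoch shows the two clocks agree; a small sketch using one of the values from the type=1130 record above:

    from datetime import datetime, timezone

    stamp = "audit(1768349258.107:288)"
    epoch, serial = stamp[len("audit("):-1].split(":")
    when = datetime.fromtimestamp(float(epoch), tz=timezone.utc)
    print(when.isoformat(timespec="milliseconds"), "serial", serial)
    # -> 2026-01-14T00:07:38.107+00:00 serial 288, matching the surrounding Jan 14 00:07:38 entries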
Jan 14 00:07:38.489000 audit: BPF prog-id=61 op=LOAD Jan 14 00:07:38.492875 kernel: audit: type=1334 audit(1768349258.489:290): prog-id=61 op=LOAD Jan 14 00:07:38.492936 kernel: audit: type=1334 audit(1768349258.489:291): prog-id=57 op=UNLOAD Jan 14 00:07:38.492965 kernel: audit: type=1334 audit(1768349258.490:292): prog-id=62 op=LOAD Jan 14 00:07:38.489000 audit: BPF prog-id=57 op=UNLOAD Jan 14 00:07:38.490000 audit: BPF prog-id=62 op=LOAD Jan 14 00:07:38.493562 kernel: audit: type=1334 audit(1768349258.490:293): prog-id=44 op=UNLOAD Jan 14 00:07:38.490000 audit: BPF prog-id=44 op=UNLOAD Jan 14 00:07:38.494962 kernel: audit: type=1334 audit(1768349258.491:294): prog-id=63 op=LOAD Jan 14 00:07:38.495020 kernel: audit: type=1334 audit(1768349258.491:295): prog-id=64 op=LOAD Jan 14 00:07:38.491000 audit: BPF prog-id=63 op=LOAD Jan 14 00:07:38.491000 audit: BPF prog-id=64 op=LOAD Jan 14 00:07:38.491000 audit: BPF prog-id=45 op=UNLOAD Jan 14 00:07:38.491000 audit: BPF prog-id=46 op=UNLOAD Jan 14 00:07:38.491000 audit: BPF prog-id=65 op=LOAD Jan 14 00:07:38.491000 audit: BPF prog-id=56 op=UNLOAD Jan 14 00:07:38.493000 audit: BPF prog-id=66 op=LOAD Jan 14 00:07:38.493000 audit: BPF prog-id=51 op=UNLOAD Jan 14 00:07:38.493000 audit: BPF prog-id=67 op=LOAD Jan 14 00:07:38.493000 audit: BPF prog-id=68 op=LOAD Jan 14 00:07:38.493000 audit: BPF prog-id=52 op=UNLOAD Jan 14 00:07:38.493000 audit: BPF prog-id=53 op=UNLOAD Jan 14 00:07:38.493000 audit: BPF prog-id=69 op=LOAD Jan 14 00:07:38.494000 audit: BPF prog-id=70 op=LOAD Jan 14 00:07:38.494000 audit: BPF prog-id=54 op=UNLOAD Jan 14 00:07:38.494000 audit: BPF prog-id=55 op=UNLOAD Jan 14 00:07:38.495000 audit: BPF prog-id=71 op=LOAD Jan 14 00:07:38.495000 audit: BPF prog-id=58 op=UNLOAD Jan 14 00:07:38.495000 audit: BPF prog-id=72 op=LOAD Jan 14 00:07:38.495000 audit: BPF prog-id=73 op=LOAD Jan 14 00:07:38.495000 audit: BPF prog-id=59 op=UNLOAD Jan 14 00:07:38.495000 audit: BPF prog-id=60 op=UNLOAD Jan 14 00:07:38.496000 audit: BPF prog-id=74 op=LOAD Jan 14 00:07:38.496000 audit: BPF prog-id=41 op=UNLOAD Jan 14 00:07:38.496000 audit: BPF prog-id=75 op=LOAD Jan 14 00:07:38.496000 audit: BPF prog-id=76 op=LOAD Jan 14 00:07:38.496000 audit: BPF prog-id=42 op=UNLOAD Jan 14 00:07:38.496000 audit: BPF prog-id=43 op=UNLOAD Jan 14 00:07:38.497000 audit: BPF prog-id=77 op=LOAD Jan 14 00:07:38.497000 audit: BPF prog-id=47 op=UNLOAD Jan 14 00:07:38.497000 audit: BPF prog-id=78 op=LOAD Jan 14 00:07:38.497000 audit: BPF prog-id=48 op=UNLOAD Jan 14 00:07:38.497000 audit: BPF prog-id=79 op=LOAD Jan 14 00:07:38.498000 audit: BPF prog-id=80 op=LOAD Jan 14 00:07:38.498000 audit: BPF prog-id=49 op=UNLOAD Jan 14 00:07:38.498000 audit: BPF prog-id=50 op=UNLOAD Jan 14 00:07:38.509258 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 14 00:07:38.509356 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 14 00:07:38.510000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 14 00:07:38.510632 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:07:38.510690 systemd[1]: kubelet.service: Consumed 102ms CPU time, 95.1M memory peak. Jan 14 00:07:38.512847 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:07:38.664655 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
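The systemd reload above also shows up in the audit stream as pairs of BPF prog-id LOAD/UNLOAD events, most likely the per-unit BPF programs systemd reattaches when it reloads. A quick tally over a saved excerpt (the four lines below are copied from the log; the parsing is ad hoc) confirms that loads and unloads pair up:

    import re
    from collections import Counter

    log_excerpt = """
    Jan 14 00:07:38.489000 audit: BPF prog-id=61 op=LOAD
    Jan 14 00:07:38.489000 audit: BPF prog-id=57 op=UNLOAD
    Jan 14 00:07:38.490000 audit: BPF prog-id=62 op=LOAD
    Jan 14 00:07:38.490000 audit: BPF prog-id=44 op=UNLOAD
    """

    ops = Counter(m.group(1) for m in re.finditer(r"BPF prog-id=\d+ op=(LOAD|UNLOAD)", log_excerpt))
    print(dict(ops))  # e.g. {'LOAD': 2, 'UNLOAD': 2} for this excerpt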
Jan 14 00:07:38.663000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:38.676045 (kubelet)[2461]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 00:07:38.719762 kubelet[2461]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 00:07:38.719762 kubelet[2461]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 14 00:07:38.719762 kubelet[2461]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 00:07:38.722210 kubelet[2461]: I0114 00:07:38.720464 2461 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 00:07:40.139558 kubelet[2461]: I0114 00:07:40.138666 2461 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 14 00:07:40.139558 kubelet[2461]: I0114 00:07:40.138705 2461 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 00:07:40.139558 kubelet[2461]: I0114 00:07:40.139063 2461 server.go:956] "Client rotation is on, will bootstrap in background" Jan 14 00:07:40.169058 kubelet[2461]: E0114 00:07:40.169021 2461 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://46.224.77.139:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 46.224.77.139:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 14 00:07:40.170173 kubelet[2461]: I0114 00:07:40.170145 2461 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 00:07:40.181674 kubelet[2461]: I0114 00:07:40.181637 2461 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 00:07:40.185989 kubelet[2461]: I0114 00:07:40.185941 2461 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 14 00:07:40.188580 kubelet[2461]: I0114 00:07:40.188499 2461 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 00:07:40.189013 kubelet[2461]: I0114 00:07:40.188741 2461 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-n-fb1a601aa4","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 00:07:40.189284 kubelet[2461]: I0114 00:07:40.189264 2461 topology_manager.go:138] "Creating topology manager with none policy" Jan 14 00:07:40.189768 kubelet[2461]: I0114 00:07:40.189393 2461 container_manager_linux.go:303] "Creating device plugin manager" Jan 14 00:07:40.191084 kubelet[2461]: I0114 00:07:40.190744 2461 state_mem.go:36] "Initialized new in-memory state store" Jan 14 00:07:40.194608 kubelet[2461]: I0114 00:07:40.194508 2461 kubelet.go:480] "Attempting to sync node with API server" Jan 14 00:07:40.194692 kubelet[2461]: I0114 00:07:40.194681 2461 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 00:07:40.194775 kubelet[2461]: I0114 00:07:40.194767 2461 kubelet.go:386] "Adding apiserver pod source" Jan 14 00:07:40.194829 kubelet[2461]: I0114 00:07:40.194821 2461 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 00:07:40.198426 kubelet[2461]: E0114 00:07:40.198395 2461 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://46.224.77.139:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-n-fb1a601aa4&limit=500&resourceVersion=0\": dial tcp 46.224.77.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 14 00:07:40.198998 kubelet[2461]: E0114 00:07:40.198933 2461 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://46.224.77.139:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 46.224.77.139:6443: connect: connection refused" logger="UnhandledError" 
reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 14 00:07:40.200561 kubelet[2461]: I0114 00:07:40.199918 2461 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 00:07:40.200753 kubelet[2461]: I0114 00:07:40.200734 2461 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 14 00:07:40.200913 kubelet[2461]: W0114 00:07:40.200902 2461 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 14 00:07:40.205555 kubelet[2461]: I0114 00:07:40.205516 2461 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 14 00:07:40.205666 kubelet[2461]: I0114 00:07:40.205655 2461 server.go:1289] "Started kubelet" Jan 14 00:07:40.209513 kubelet[2461]: I0114 00:07:40.209465 2461 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 00:07:40.213542 kubelet[2461]: I0114 00:07:40.212786 2461 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 00:07:40.213542 kubelet[2461]: I0114 00:07:40.213140 2461 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 00:07:40.213642 kubelet[2461]: I0114 00:07:40.213608 2461 server.go:317] "Adding debug handlers to kubelet server" Jan 14 00:07:40.216738 kubelet[2461]: I0114 00:07:40.216697 2461 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 00:07:40.218828 kubelet[2461]: E0114 00:07:40.217291 2461 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://46.224.77.139:6443/api/v1/namespaces/default/events\": dial tcp 46.224.77.139:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-n-fb1a601aa4.188a7048756171f0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-n-fb1a601aa4,UID:ci-4547-0-0-n-fb1a601aa4,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-n-fb1a601aa4,},FirstTimestamp:2026-01-14 00:07:40.205617648 +0000 UTC m=+1.524397332,LastTimestamp:2026-01-14 00:07:40.205617648 +0000 UTC m=+1.524397332,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-n-fb1a601aa4,}" Jan 14 00:07:40.219787 kubelet[2461]: I0114 00:07:40.219758 2461 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 00:07:40.220779 kubelet[2461]: I0114 00:07:40.220760 2461 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 14 00:07:40.221094 kubelet[2461]: E0114 00:07:40.221068 2461 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" Jan 14 00:07:40.221892 kubelet[2461]: I0114 00:07:40.221865 2461 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 14 00:07:40.222137 kubelet[2461]: I0114 00:07:40.222123 2461 reconciler.go:26] "Reconciler: start to sync state" Jan 14 00:07:40.224168 kubelet[2461]: E0114 00:07:40.224142 2461 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get 
\"https://46.224.77.139:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 46.224.77.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 14 00:07:40.224350 kubelet[2461]: E0114 00:07:40.224319 2461 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.77.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-fb1a601aa4?timeout=10s\": dial tcp 46.224.77.139:6443: connect: connection refused" interval="200ms" Jan 14 00:07:40.224762 kubelet[2461]: I0114 00:07:40.224741 2461 factory.go:223] Registration of the systemd container factory successfully Jan 14 00:07:40.224923 kubelet[2461]: I0114 00:07:40.224905 2461 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 14 00:07:40.228883 kubelet[2461]: I0114 00:07:40.228850 2461 factory.go:223] Registration of the containerd container factory successfully Jan 14 00:07:40.231000 audit[2476]: NETFILTER_CFG table=mangle:42 family=10 entries=2 op=nft_register_chain pid=2476 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:40.232807 kernel: kauditd_printk_skb: 36 callbacks suppressed Jan 14 00:07:40.232860 kernel: audit: type=1325 audit(1768349260.231:332): table=mangle:42 family=10 entries=2 op=nft_register_chain pid=2476 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:40.233771 kubelet[2461]: I0114 00:07:40.233630 2461 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 14 00:07:40.231000 audit[2476]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffee242740 a2=0 a3=0 items=0 ppid=2461 pid=2476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.236541 kernel: audit: type=1300 audit(1768349260.231:332): arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffee242740 a2=0 a3=0 items=0 ppid=2461 pid=2476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.236614 kernel: audit: type=1327 audit(1768349260.231:332): proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 00:07:40.231000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 00:07:40.237000 audit[2477]: NETFILTER_CFG table=mangle:43 family=2 entries=2 op=nft_register_chain pid=2477 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.238556 kernel: audit: type=1325 audit(1768349260.237:333): table=mangle:43 family=2 entries=2 op=nft_register_chain pid=2477 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.237000 audit[2477]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffddf85be0 a2=0 a3=0 items=0 ppid=2461 pid=2477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.241697 kernel: audit: type=1300 
audit(1768349260.237:333): arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffddf85be0 a2=0 a3=0 items=0 ppid=2461 pid=2477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.241755 kernel: audit: type=1327 audit(1768349260.237:333): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 00:07:40.237000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 14 00:07:40.242626 kernel: audit: type=1325 audit(1768349260.238:334): table=filter:44 family=2 entries=1 op=nft_register_chain pid=2478 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.238000 audit[2478]: NETFILTER_CFG table=filter:44 family=2 entries=1 op=nft_register_chain pid=2478 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.238000 audit[2478]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcff603a0 a2=0 a3=0 items=0 ppid=2461 pid=2478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.245933 kernel: audit: type=1300 audit(1768349260.238:334): arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcff603a0 a2=0 a3=0 items=0 ppid=2461 pid=2478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.245992 kernel: audit: type=1327 audit(1768349260.238:334): proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 00:07:40.238000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 00:07:40.238000 audit[2479]: NETFILTER_CFG table=mangle:45 family=10 entries=1 op=nft_register_chain pid=2479 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:40.247594 kernel: audit: type=1325 audit(1768349260.238:335): table=mangle:45 family=10 entries=1 op=nft_register_chain pid=2479 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:40.238000 audit[2479]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcdd7aff0 a2=0 a3=0 items=0 ppid=2461 pid=2479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.238000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 00:07:40.238000 audit[2480]: NETFILTER_CFG table=nat:46 family=10 entries=1 op=nft_register_chain pid=2480 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:40.238000 audit[2480]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffeb475900 a2=0 a3=0 items=0 ppid=2461 pid=2480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.238000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 00:07:40.241000 audit[2481]: NETFILTER_CFG table=filter:47 family=10 entries=1 op=nft_register_chain pid=2481 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:40.241000 audit[2481]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffde7a67b0 a2=0 a3=0 items=0 ppid=2461 pid=2481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.241000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 00:07:40.242000 audit[2483]: NETFILTER_CFG table=filter:48 family=2 entries=2 op=nft_register_chain pid=2483 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.242000 audit[2483]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffdc189df0 a2=0 a3=0 items=0 ppid=2461 pid=2483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.242000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 00:07:40.245000 audit[2485]: NETFILTER_CFG table=filter:49 family=2 entries=2 op=nft_register_chain pid=2485 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.245000 audit[2485]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffff66fc870 a2=0 a3=0 items=0 ppid=2461 pid=2485 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.245000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 00:07:40.253000 audit[2488]: NETFILTER_CFG table=filter:50 family=2 entries=1 op=nft_register_rule pid=2488 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.253000 audit[2488]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffd6c22330 a2=0 a3=0 items=0 ppid=2461 pid=2488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.253000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Jan 14 00:07:40.255915 kubelet[2461]: I0114 00:07:40.255599 2461 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 14 00:07:40.255915 kubelet[2461]: I0114 00:07:40.255628 2461 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 14 00:07:40.255915 kubelet[2461]: I0114 00:07:40.255648 2461 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
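The container-manager configuration logged further up (the nodeConfig entry with HardEvictionThresholds) mixes an absolute memory threshold (memory.available < 100Mi) with percentage-based filesystem thresholds. The sketch below turns the space-based signals into absolute trigger points for a hypothetical node; the 40 GiB filesystem size is a made-up capacity for illustration, not a value taken from this host.

    GIB = 1024 ** 3
    MIB = 1024 ** 2

    nodefs_size = 40 * GIB   # hypothetical root filesystem size

    rules = [
        ("memory.available", 100 * MIB),           # 100Mi, absolute quantity from the logged config
        ("nodefs.available", 0.10 * nodefs_size),  # 10% of nodefs
        ("imagefs.available", 0.15 * nodefs_size), # 15% of imagefs (same disk assumed here)
    ]
    for signal, threshold in rules:
        print(f"evict when {signal} < {threshold / MIB:.0f} MiB")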
Jan 14 00:07:40.255915 kubelet[2461]: I0114 00:07:40.255656 2461 kubelet.go:2436] "Starting kubelet main sync loop" Jan 14 00:07:40.255915 kubelet[2461]: E0114 00:07:40.255695 2461 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 00:07:40.256000 audit[2489]: NETFILTER_CFG table=mangle:51 family=2 entries=1 op=nft_register_chain pid=2489 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.256000 audit[2489]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc441f960 a2=0 a3=0 items=0 ppid=2461 pid=2489 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.256000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 14 00:07:40.257000 audit[2490]: NETFILTER_CFG table=nat:52 family=2 entries=1 op=nft_register_chain pid=2490 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.257000 audit[2490]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffde3ecbe0 a2=0 a3=0 items=0 ppid=2461 pid=2490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.257000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 14 00:07:40.258000 audit[2492]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2492 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:40.258000 audit[2492]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe10b6740 a2=0 a3=0 items=0 ppid=2461 pid=2492 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.258000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 14 00:07:40.261870 kubelet[2461]: E0114 00:07:40.261796 2461 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://46.224.77.139:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.224.77.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 14 00:07:40.265074 kubelet[2461]: E0114 00:07:40.265008 2461 kubelet.go:1600] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 14 00:07:40.268155 kubelet[2461]: I0114 00:07:40.268124 2461 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 00:07:40.268639 kubelet[2461]: I0114 00:07:40.268328 2461 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 00:07:40.268639 kubelet[2461]: I0114 00:07:40.268369 2461 state_mem.go:36] "Initialized new in-memory state store" Jan 14 00:07:40.270951 kubelet[2461]: I0114 00:07:40.270924 2461 policy_none.go:49] "None policy: Start" Jan 14 00:07:40.271042 kubelet[2461]: I0114 00:07:40.271032 2461 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 14 00:07:40.271116 kubelet[2461]: I0114 00:07:40.271106 2461 state_mem.go:35] "Initializing new in-memory state store" Jan 14 00:07:40.277741 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 14 00:07:40.298836 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 14 00:07:40.303081 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jan 14 00:07:40.311165 kubelet[2461]: E0114 00:07:40.311120 2461 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 14 00:07:40.312391 kubelet[2461]: I0114 00:07:40.311474 2461 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 00:07:40.312391 kubelet[2461]: I0114 00:07:40.311544 2461 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 00:07:40.312391 kubelet[2461]: I0114 00:07:40.311941 2461 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 00:07:40.315362 kubelet[2461]: E0114 00:07:40.315320 2461 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 14 00:07:40.315418 kubelet[2461]: E0114 00:07:40.315394 2461 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-n-fb1a601aa4\" not found" Jan 14 00:07:40.374061 systemd[1]: Created slice kubepods-burstable-pod08b119e21354abecec3569c2fc59abfa.slice - libcontainer container kubepods-burstable-pod08b119e21354abecec3569c2fc59abfa.slice. Jan 14 00:07:40.385472 kubelet[2461]: E0114 00:07:40.385404 2461 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.391226 systemd[1]: Created slice kubepods-burstable-pod0eec82d12248a2d372141dae549c6f8e.slice - libcontainer container kubepods-burstable-pod0eec82d12248a2d372141dae549c6f8e.slice. Jan 14 00:07:40.395926 kubelet[2461]: E0114 00:07:40.395355 2461 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.400714 systemd[1]: Created slice kubepods-burstable-podddea395bf63b2d8c86b065d7e97bf509.slice - libcontainer container kubepods-burstable-podddea395bf63b2d8c86b065d7e97bf509.slice. 
Jan 14 00:07:40.403285 kubelet[2461]: E0114 00:07:40.403004 2461 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.414049 kubelet[2461]: I0114 00:07:40.414021 2461 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.414825 kubelet[2461]: E0114 00:07:40.414784 2461 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.77.139:6443/api/v1/nodes\": dial tcp 46.224.77.139:6443: connect: connection refused" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.425878 kubelet[2461]: E0114 00:07:40.425831 2461 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.77.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-fb1a601aa4?timeout=10s\": dial tcp 46.224.77.139:6443: connect: connection refused" interval="400ms" Jan 14 00:07:40.524265 kubelet[2461]: I0114 00:07:40.524159 2461 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: \"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.524265 kubelet[2461]: I0114 00:07:40.524230 2461 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: \"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.524672 kubelet[2461]: I0114 00:07:40.524584 2461 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: \"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.524836 kubelet[2461]: I0114 00:07:40.524801 2461 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/08b119e21354abecec3569c2fc59abfa-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-n-fb1a601aa4\" (UID: \"08b119e21354abecec3569c2fc59abfa\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.524925 kubelet[2461]: I0114 00:07:40.524911 2461 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/08b119e21354abecec3569c2fc59abfa-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-n-fb1a601aa4\" (UID: \"08b119e21354abecec3569c2fc59abfa\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.525049 kubelet[2461]: I0114 00:07:40.524988 2461 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: 
\"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.525049 kubelet[2461]: I0114 00:07:40.525008 2461 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ddea395bf63b2d8c86b065d7e97bf509-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-n-fb1a601aa4\" (UID: \"ddea395bf63b2d8c86b065d7e97bf509\") " pod="kube-system/kube-scheduler-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.525049 kubelet[2461]: I0114 00:07:40.525023 2461 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/08b119e21354abecec3569c2fc59abfa-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-n-fb1a601aa4\" (UID: \"08b119e21354abecec3569c2fc59abfa\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.525168 kubelet[2461]: I0114 00:07:40.525146 2461 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: \"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.617893 kubelet[2461]: I0114 00:07:40.617766 2461 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.618467 kubelet[2461]: E0114 00:07:40.618433 2461 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.77.139:6443/api/v1/nodes\": dial tcp 46.224.77.139:6443: connect: connection refused" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:40.688698 containerd[1590]: time="2026-01-14T00:07:40.687796459Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-n-fb1a601aa4,Uid:08b119e21354abecec3569c2fc59abfa,Namespace:kube-system,Attempt:0,}" Jan 14 00:07:40.698305 containerd[1590]: time="2026-01-14T00:07:40.698244296Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-n-fb1a601aa4,Uid:0eec82d12248a2d372141dae549c6f8e,Namespace:kube-system,Attempt:0,}" Jan 14 00:07:40.704432 containerd[1590]: time="2026-01-14T00:07:40.704386041Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-n-fb1a601aa4,Uid:ddea395bf63b2d8c86b065d7e97bf509,Namespace:kube-system,Attempt:0,}" Jan 14 00:07:40.726962 containerd[1590]: time="2026-01-14T00:07:40.726760205Z" level=info msg="connecting to shim 86c7de5fa85603f06dfa46a727e6625a5335e0d1b8360bdeeefacdb5e35823cc" address="unix:///run/containerd/s/199cb30911d1ec970fdf542f1848d5962aaa9c030a39e6d27a9248fbf1cbf4e8" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:07:40.752484 containerd[1590]: time="2026-01-14T00:07:40.752416330Z" level=info msg="connecting to shim 9af0c7d2661630c3775ff45822448f50355062151f77f37b2a32a278aaa3241c" address="unix:///run/containerd/s/8e1b18e1028523ed8504520f8b9c01baa280c084711098e66301b22c64d48f1c" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:07:40.759441 containerd[1590]: time="2026-01-14T00:07:40.759201785Z" level=info msg="connecting to shim 7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41" address="unix:///run/containerd/s/e2e8f688f643db3c26029dc7cb440cdf2959ea46d3db5dc1f392128db2a0374e" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:07:40.761874 systemd[1]: 
Started cri-containerd-86c7de5fa85603f06dfa46a727e6625a5335e0d1b8360bdeeefacdb5e35823cc.scope - libcontainer container 86c7de5fa85603f06dfa46a727e6625a5335e0d1b8360bdeeefacdb5e35823cc. Jan 14 00:07:40.782000 audit: BPF prog-id=81 op=LOAD Jan 14 00:07:40.783000 audit: BPF prog-id=82 op=LOAD Jan 14 00:07:40.783000 audit[2517]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400019c180 a2=98 a3=0 items=0 ppid=2505 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.783000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836633764653566613835363033663036646661343661373237653636 Jan 14 00:07:40.784000 audit: BPF prog-id=82 op=UNLOAD Jan 14 00:07:40.784000 audit[2517]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2505 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.784000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836633764653566613835363033663036646661343661373237653636 Jan 14 00:07:40.784000 audit: BPF prog-id=83 op=LOAD Jan 14 00:07:40.784000 audit[2517]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400019c3e8 a2=98 a3=0 items=0 ppid=2505 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.784000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836633764653566613835363033663036646661343661373237653636 Jan 14 00:07:40.784000 audit: BPF prog-id=84 op=LOAD Jan 14 00:07:40.784000 audit[2517]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=400019c168 a2=98 a3=0 items=0 ppid=2505 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.784000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836633764653566613835363033663036646661343661373237653636 Jan 14 00:07:40.784000 audit: BPF prog-id=84 op=UNLOAD Jan 14 00:07:40.784000 audit[2517]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2505 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.784000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836633764653566613835363033663036646661343661373237653636 Jan 14 00:07:40.784000 audit: BPF prog-id=83 op=UNLOAD Jan 14 00:07:40.784000 audit[2517]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2505 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.784000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836633764653566613835363033663036646661343661373237653636 Jan 14 00:07:40.785000 audit: BPF prog-id=85 op=LOAD Jan 14 00:07:40.785000 audit[2517]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400019c648 a2=98 a3=0 items=0 ppid=2505 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.785000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3836633764653566613835363033663036646661343661373237653636 Jan 14 00:07:40.793852 systemd[1]: Started cri-containerd-9af0c7d2661630c3775ff45822448f50355062151f77f37b2a32a278aaa3241c.scope - libcontainer container 9af0c7d2661630c3775ff45822448f50355062151f77f37b2a32a278aaa3241c. Jan 14 00:07:40.810734 systemd[1]: Started cri-containerd-7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41.scope - libcontainer container 7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41. 
Jan 14 00:07:40.834736 containerd[1590]: time="2026-01-14T00:07:40.834699673Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-n-fb1a601aa4,Uid:08b119e21354abecec3569c2fc59abfa,Namespace:kube-system,Attempt:0,} returns sandbox id \"86c7de5fa85603f06dfa46a727e6625a5335e0d1b8360bdeeefacdb5e35823cc\"" Jan 14 00:07:40.836000 audit: BPF prog-id=86 op=LOAD Jan 14 00:07:40.839514 kubelet[2461]: E0114 00:07:40.839462 2461 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://46.224.77.139:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-n-fb1a601aa4?timeout=10s\": dial tcp 46.224.77.139:6443: connect: connection refused" interval="800ms" Jan 14 00:07:40.838000 audit: BPF prog-id=87 op=LOAD Jan 14 00:07:40.838000 audit[2564]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=2541 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.838000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961663063376432363631363330633337373566663435383232343438 Jan 14 00:07:40.838000 audit: BPF prog-id=87 op=UNLOAD Jan 14 00:07:40.838000 audit[2564]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2541 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.838000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961663063376432363631363330633337373566663435383232343438 Jan 14 00:07:40.838000 audit: BPF prog-id=88 op=LOAD Jan 14 00:07:40.838000 audit[2564]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=2541 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.838000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961663063376432363631363330633337373566663435383232343438 Jan 14 00:07:40.839000 audit: BPF prog-id=89 op=LOAD Jan 14 00:07:40.839000 audit[2564]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=2541 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961663063376432363631363330633337373566663435383232343438 Jan 14 00:07:40.839000 audit: BPF prog-id=89 op=UNLOAD Jan 14 00:07:40.839000 audit[2564]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2541 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961663063376432363631363330633337373566663435383232343438 Jan 14 00:07:40.839000 audit: BPF prog-id=88 op=UNLOAD Jan 14 00:07:40.839000 audit[2564]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2541 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961663063376432363631363330633337373566663435383232343438 Jan 14 00:07:40.839000 audit: BPF prog-id=90 op=LOAD Jan 14 00:07:40.839000 audit[2564]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=2541 pid=2564 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.839000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3961663063376432363631363330633337373566663435383232343438 Jan 14 00:07:40.842000 audit: BPF prog-id=91 op=LOAD Jan 14 00:07:40.843000 audit: BPF prog-id=92 op=LOAD Jan 14 00:07:40.843000 audit[2582]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2555 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.843000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738363135393461376438346362323632363262393237646232616433 Jan 14 00:07:40.843000 audit: BPF prog-id=92 op=UNLOAD Jan 14 00:07:40.843000 audit[2582]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2555 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.843000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738363135393461376438346362323632363262393237646232616433 Jan 14 00:07:40.845000 audit: BPF prog-id=93 op=LOAD Jan 14 00:07:40.845000 audit[2582]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2555 pid=2582 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738363135393461376438346362323632363262393237646232616433 Jan 14 00:07:40.845000 audit: BPF prog-id=94 op=LOAD Jan 14 00:07:40.845000 audit[2582]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2555 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738363135393461376438346362323632363262393237646232616433 Jan 14 00:07:40.845000 audit: BPF prog-id=94 op=UNLOAD Jan 14 00:07:40.845000 audit[2582]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2555 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738363135393461376438346362323632363262393237646232616433 Jan 14 00:07:40.845000 audit: BPF prog-id=93 op=UNLOAD Jan 14 00:07:40.845000 audit[2582]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2555 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738363135393461376438346362323632363262393237646232616433 Jan 14 00:07:40.845000 audit: BPF prog-id=95 op=LOAD Jan 14 00:07:40.845000 audit[2582]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2555 pid=2582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3738363135393461376438346362323632363262393237646232616433 Jan 14 00:07:40.851850 containerd[1590]: time="2026-01-14T00:07:40.851474281Z" level=info msg="CreateContainer within sandbox \"86c7de5fa85603f06dfa46a727e6625a5335e0d1b8360bdeeefacdb5e35823cc\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 14 00:07:40.874960 containerd[1590]: time="2026-01-14T00:07:40.874897917Z" level=info 
msg="Container a48336692ce359755af31eb72390eaed09cfa4c74e52f92a54aae43335e2310b: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:07:40.886268 containerd[1590]: time="2026-01-14T00:07:40.886224810Z" level=info msg="CreateContainer within sandbox \"86c7de5fa85603f06dfa46a727e6625a5335e0d1b8360bdeeefacdb5e35823cc\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"a48336692ce359755af31eb72390eaed09cfa4c74e52f92a54aae43335e2310b\"" Jan 14 00:07:40.887546 containerd[1590]: time="2026-01-14T00:07:40.887434488Z" level=info msg="StartContainer for \"a48336692ce359755af31eb72390eaed09cfa4c74e52f92a54aae43335e2310b\"" Jan 14 00:07:40.887864 containerd[1590]: time="2026-01-14T00:07:40.887837327Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-n-fb1a601aa4,Uid:ddea395bf63b2d8c86b065d7e97bf509,Namespace:kube-system,Attempt:0,} returns sandbox id \"9af0c7d2661630c3775ff45822448f50355062151f77f37b2a32a278aaa3241c\"" Jan 14 00:07:40.888803 containerd[1590]: time="2026-01-14T00:07:40.888572629Z" level=info msg="connecting to shim a48336692ce359755af31eb72390eaed09cfa4c74e52f92a54aae43335e2310b" address="unix:///run/containerd/s/199cb30911d1ec970fdf542f1848d5962aaa9c030a39e6d27a9248fbf1cbf4e8" protocol=ttrpc version=3 Jan 14 00:07:40.895293 containerd[1590]: time="2026-01-14T00:07:40.895215972Z" level=info msg="CreateContainer within sandbox \"9af0c7d2661630c3775ff45822448f50355062151f77f37b2a32a278aaa3241c\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 14 00:07:40.896603 containerd[1590]: time="2026-01-14T00:07:40.896545105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-n-fb1a601aa4,Uid:0eec82d12248a2d372141dae549c6f8e,Namespace:kube-system,Attempt:0,} returns sandbox id \"7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41\"" Jan 14 00:07:40.901607 containerd[1590]: time="2026-01-14T00:07:40.901380936Z" level=info msg="CreateContainer within sandbox \"7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 14 00:07:40.911822 containerd[1590]: time="2026-01-14T00:07:40.911759197Z" level=info msg="Container f4e1bb910e9ead011550006449a851c3c10e04a29594f69723d313440c15cd0c: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:07:40.916939 systemd[1]: Started cri-containerd-a48336692ce359755af31eb72390eaed09cfa4c74e52f92a54aae43335e2310b.scope - libcontainer container a48336692ce359755af31eb72390eaed09cfa4c74e52f92a54aae43335e2310b. 
Jan 14 00:07:40.918152 containerd[1590]: time="2026-01-14T00:07:40.917704147Z" level=info msg="Container f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:07:40.924196 containerd[1590]: time="2026-01-14T00:07:40.924131919Z" level=info msg="CreateContainer within sandbox \"9af0c7d2661630c3775ff45822448f50355062151f77f37b2a32a278aaa3241c\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"f4e1bb910e9ead011550006449a851c3c10e04a29594f69723d313440c15cd0c\"" Jan 14 00:07:40.924964 containerd[1590]: time="2026-01-14T00:07:40.924892001Z" level=info msg="StartContainer for \"f4e1bb910e9ead011550006449a851c3c10e04a29594f69723d313440c15cd0c\"" Jan 14 00:07:40.927386 containerd[1590]: time="2026-01-14T00:07:40.927311037Z" level=info msg="connecting to shim f4e1bb910e9ead011550006449a851c3c10e04a29594f69723d313440c15cd0c" address="unix:///run/containerd/s/8e1b18e1028523ed8504520f8b9c01baa280c084711098e66301b22c64d48f1c" protocol=ttrpc version=3 Jan 14 00:07:40.929444 containerd[1590]: time="2026-01-14T00:07:40.929361221Z" level=info msg="CreateContainer within sandbox \"7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7\"" Jan 14 00:07:40.930506 containerd[1590]: time="2026-01-14T00:07:40.930479787Z" level=info msg="StartContainer for \"f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7\"" Jan 14 00:07:40.931000 audit: BPF prog-id=96 op=LOAD Jan 14 00:07:40.934089 containerd[1590]: time="2026-01-14T00:07:40.933665191Z" level=info msg="connecting to shim f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7" address="unix:///run/containerd/s/e2e8f688f643db3c26029dc7cb440cdf2959ea46d3db5dc1f392128db2a0374e" protocol=ttrpc version=3 Jan 14 00:07:40.933000 audit: BPF prog-id=97 op=LOAD Jan 14 00:07:40.933000 audit[2633]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=2505 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134383333363639326365333539373535616633316562373233393065 Jan 14 00:07:40.933000 audit: BPF prog-id=97 op=UNLOAD Jan 14 00:07:40.933000 audit[2633]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2505 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134383333363639326365333539373535616633316562373233393065 Jan 14 00:07:40.934000 audit: BPF prog-id=98 op=LOAD Jan 14 00:07:40.934000 audit[2633]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=2505 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134383333363639326365333539373535616633316562373233393065 Jan 14 00:07:40.934000 audit: BPF prog-id=99 op=LOAD Jan 14 00:07:40.934000 audit[2633]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=2505 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134383333363639326365333539373535616633316562373233393065 Jan 14 00:07:40.934000 audit: BPF prog-id=99 op=UNLOAD Jan 14 00:07:40.934000 audit[2633]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2505 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134383333363639326365333539373535616633316562373233393065 Jan 14 00:07:40.934000 audit: BPF prog-id=98 op=UNLOAD Jan 14 00:07:40.934000 audit[2633]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2505 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134383333363639326365333539373535616633316562373233393065 Jan 14 00:07:40.934000 audit: BPF prog-id=100 op=LOAD Jan 14 00:07:40.934000 audit[2633]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=2505 pid=2633 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.934000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134383333363639326365333539373535616633316562373233393065 Jan 14 00:07:40.956875 systemd[1]: Started cri-containerd-f4e1bb910e9ead011550006449a851c3c10e04a29594f69723d313440c15cd0c.scope - libcontainer container f4e1bb910e9ead011550006449a851c3c10e04a29594f69723d313440c15cd0c. 
Jan 14 00:07:40.967727 systemd[1]: Started cri-containerd-f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7.scope - libcontainer container f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7. Jan 14 00:07:40.981000 audit: BPF prog-id=101 op=LOAD Jan 14 00:07:40.982000 audit: BPF prog-id=102 op=LOAD Jan 14 00:07:40.982000 audit[2655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2541 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.982000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634653162623931306539656164303131353530303036343439613835 Jan 14 00:07:40.982000 audit: BPF prog-id=102 op=UNLOAD Jan 14 00:07:40.982000 audit[2655]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2541 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.982000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634653162623931306539656164303131353530303036343439613835 Jan 14 00:07:40.983000 audit: BPF prog-id=103 op=LOAD Jan 14 00:07:40.983000 audit[2655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2541 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.983000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634653162623931306539656164303131353530303036343439613835 Jan 14 00:07:40.983000 audit: BPF prog-id=104 op=LOAD Jan 14 00:07:40.983000 audit[2655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2541 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.983000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634653162623931306539656164303131353530303036343439613835 Jan 14 00:07:40.983000 audit: BPF prog-id=104 op=UNLOAD Jan 14 00:07:40.983000 audit[2655]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2541 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.983000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634653162623931306539656164303131353530303036343439613835 Jan 14 00:07:40.984000 audit: BPF prog-id=103 op=UNLOAD Jan 14 00:07:40.984000 audit[2655]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2541 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634653162623931306539656164303131353530303036343439613835 Jan 14 00:07:40.984000 audit: BPF prog-id=105 op=LOAD Jan 14 00:07:40.984000 audit[2655]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2541 pid=2655 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.984000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6634653162623931306539656164303131353530303036343439613835 Jan 14 00:07:40.991599 containerd[1590]: time="2026-01-14T00:07:40.991487997Z" level=info msg="StartContainer for \"a48336692ce359755af31eb72390eaed09cfa4c74e52f92a54aae43335e2310b\" returns successfully" Jan 14 00:07:40.996000 audit: BPF prog-id=106 op=LOAD Jan 14 00:07:40.997000 audit: BPF prog-id=107 op=LOAD Jan 14 00:07:40.997000 audit[2656]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2555 pid=2656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632616534343733353835303163316665633261663637643436313663 Jan 14 00:07:40.997000 audit: BPF prog-id=107 op=UNLOAD Jan 14 00:07:40.997000 audit[2656]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2555 pid=2656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632616534343733353835303163316665633261663637643436313663 Jan 14 00:07:40.997000 audit: BPF prog-id=108 op=LOAD Jan 14 00:07:40.997000 audit[2656]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2555 pid=2656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632616534343733353835303163316665633261663637643436313663 Jan 14 00:07:40.997000 audit: BPF prog-id=109 op=LOAD Jan 14 00:07:40.997000 audit[2656]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2555 pid=2656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632616534343733353835303163316665633261663637643436313663 Jan 14 00:07:40.997000 audit: BPF prog-id=109 op=UNLOAD Jan 14 00:07:40.997000 audit[2656]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2555 pid=2656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632616534343733353835303163316665633261663637643436313663 Jan 14 00:07:40.997000 audit: BPF prog-id=108 op=UNLOAD Jan 14 00:07:40.997000 audit[2656]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2555 pid=2656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632616534343733353835303163316665633261663637643436313663 Jan 14 00:07:40.997000 audit: BPF prog-id=110 op=LOAD Jan 14 00:07:40.997000 audit[2656]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2555 pid=2656 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:40.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632616534343733353835303163316665633261663637643436313663 Jan 14 00:07:41.021847 kubelet[2461]: I0114 00:07:41.021020 2461 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:41.021847 kubelet[2461]: E0114 00:07:41.021336 2461 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://46.224.77.139:6443/api/v1/nodes\": dial tcp 46.224.77.139:6443: connect: connection refused" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:41.039513 
containerd[1590]: time="2026-01-14T00:07:41.039469771Z" level=info msg="StartContainer for \"f4e1bb910e9ead011550006449a851c3c10e04a29594f69723d313440c15cd0c\" returns successfully" Jan 14 00:07:41.047234 containerd[1590]: time="2026-01-14T00:07:41.047171390Z" level=info msg="StartContainer for \"f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7\" returns successfully" Jan 14 00:07:41.084840 kubelet[2461]: E0114 00:07:41.084791 2461 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://46.224.77.139:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 46.224.77.139:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 14 00:07:41.273292 kubelet[2461]: E0114 00:07:41.273194 2461 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:41.280846 kubelet[2461]: E0114 00:07:41.280690 2461 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:41.282764 kubelet[2461]: E0114 00:07:41.282744 2461 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:41.823632 kubelet[2461]: I0114 00:07:41.823550 2461 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:42.285892 kubelet[2461]: E0114 00:07:42.285827 2461 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:42.286192 kubelet[2461]: E0114 00:07:42.286180 2461 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:43.200754 kubelet[2461]: I0114 00:07:43.200472 2461 apiserver.go:52] "Watching apiserver" Jan 14 00:07:43.280487 kubelet[2461]: E0114 00:07:43.280417 2461 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:43.288057 kubelet[2461]: E0114 00:07:43.288017 2461 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:43.322741 kubelet[2461]: I0114 00:07:43.322697 2461 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 14 00:07:43.385196 kubelet[2461]: I0114 00:07:43.385147 2461 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:43.421876 kubelet[2461]: I0114 00:07:43.421822 2461 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:43.443207 kubelet[2461]: E0114 00:07:43.443150 2461 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-fb1a601aa4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-n-fb1a601aa4" Jan 14 
00:07:43.443207 kubelet[2461]: I0114 00:07:43.443189 2461 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:43.455383 kubelet[2461]: E0114 00:07:43.455239 2461 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-fb1a601aa4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:43.455383 kubelet[2461]: I0114 00:07:43.455276 2461 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:43.465584 kubelet[2461]: E0114 00:07:43.465012 2461 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:45.626196 systemd[1]: Reload requested from client PID 2740 ('systemctl') (unit session-8.scope)... Jan 14 00:07:45.626229 systemd[1]: Reloading... Jan 14 00:07:45.741558 zram_generator::config[2790]: No configuration found. Jan 14 00:07:45.951123 systemd[1]: Reloading finished in 324 ms. Jan 14 00:07:45.990837 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 14 00:07:46.009019 systemd[1]: kubelet.service: Deactivated successfully. Jan 14 00:07:46.009861 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:07:46.008000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:46.011579 kernel: kauditd_printk_skb: 158 callbacks suppressed Jan 14 00:07:46.011682 kernel: audit: type=1131 audit(1768349266.008:392): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:07:46.013146 systemd[1]: kubelet.service: Consumed 1.949s CPU time, 127.2M memory peak. Jan 14 00:07:46.015746 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 14 00:07:46.018688 kernel: audit: type=1334 audit(1768349266.015:393): prog-id=111 op=LOAD Jan 14 00:07:46.018776 kernel: audit: type=1334 audit(1768349266.015:394): prog-id=71 op=UNLOAD Jan 14 00:07:46.015000 audit: BPF prog-id=111 op=LOAD Jan 14 00:07:46.015000 audit: BPF prog-id=71 op=UNLOAD Jan 14 00:07:46.020839 kernel: audit: type=1334 audit(1768349266.015:395): prog-id=112 op=LOAD Jan 14 00:07:46.020999 kernel: audit: type=1334 audit(1768349266.015:396): prog-id=113 op=LOAD Jan 14 00:07:46.015000 audit: BPF prog-id=112 op=LOAD Jan 14 00:07:46.015000 audit: BPF prog-id=113 op=LOAD Jan 14 00:07:46.027046 kernel: audit: type=1334 audit(1768349266.015:397): prog-id=72 op=UNLOAD Jan 14 00:07:46.027144 kernel: audit: type=1334 audit(1768349266.015:398): prog-id=73 op=UNLOAD Jan 14 00:07:46.027165 kernel: audit: type=1334 audit(1768349266.016:399): prog-id=114 op=LOAD Jan 14 00:07:46.027182 kernel: audit: type=1334 audit(1768349266.016:400): prog-id=62 op=UNLOAD Jan 14 00:07:46.027210 kernel: audit: type=1334 audit(1768349266.017:401): prog-id=115 op=LOAD Jan 14 00:07:46.015000 audit: BPF prog-id=72 op=UNLOAD Jan 14 00:07:46.015000 audit: BPF prog-id=73 op=UNLOAD Jan 14 00:07:46.016000 audit: BPF prog-id=114 op=LOAD Jan 14 00:07:46.016000 audit: BPF prog-id=62 op=UNLOAD Jan 14 00:07:46.017000 audit: BPF prog-id=115 op=LOAD Jan 14 00:07:46.017000 audit: BPF prog-id=116 op=LOAD Jan 14 00:07:46.017000 audit: BPF prog-id=63 op=UNLOAD Jan 14 00:07:46.017000 audit: BPF prog-id=64 op=UNLOAD Jan 14 00:07:46.017000 audit: BPF prog-id=117 op=LOAD Jan 14 00:07:46.017000 audit: BPF prog-id=77 op=UNLOAD Jan 14 00:07:46.019000 audit: BPF prog-id=118 op=LOAD Jan 14 00:07:46.019000 audit: BPF prog-id=66 op=UNLOAD Jan 14 00:07:46.019000 audit: BPF prog-id=119 op=LOAD Jan 14 00:07:46.019000 audit: BPF prog-id=120 op=LOAD Jan 14 00:07:46.019000 audit: BPF prog-id=67 op=UNLOAD Jan 14 00:07:46.019000 audit: BPF prog-id=68 op=UNLOAD Jan 14 00:07:46.020000 audit: BPF prog-id=121 op=LOAD Jan 14 00:07:46.020000 audit: BPF prog-id=122 op=LOAD Jan 14 00:07:46.020000 audit: BPF prog-id=69 op=UNLOAD Jan 14 00:07:46.020000 audit: BPF prog-id=70 op=UNLOAD Jan 14 00:07:46.022000 audit: BPF prog-id=123 op=LOAD Jan 14 00:07:46.022000 audit: BPF prog-id=78 op=UNLOAD Jan 14 00:07:46.022000 audit: BPF prog-id=124 op=LOAD Jan 14 00:07:46.022000 audit: BPF prog-id=125 op=LOAD Jan 14 00:07:46.022000 audit: BPF prog-id=79 op=UNLOAD Jan 14 00:07:46.022000 audit: BPF prog-id=80 op=UNLOAD Jan 14 00:07:46.023000 audit: BPF prog-id=126 op=LOAD Jan 14 00:07:46.023000 audit: BPF prog-id=61 op=UNLOAD Jan 14 00:07:46.024000 audit: BPF prog-id=127 op=LOAD Jan 14 00:07:46.024000 audit: BPF prog-id=74 op=UNLOAD Jan 14 00:07:46.024000 audit: BPF prog-id=128 op=LOAD Jan 14 00:07:46.025000 audit: BPF prog-id=129 op=LOAD Jan 14 00:07:46.025000 audit: BPF prog-id=75 op=UNLOAD Jan 14 00:07:46.025000 audit: BPF prog-id=76 op=UNLOAD Jan 14 00:07:46.026000 audit: BPF prog-id=130 op=LOAD Jan 14 00:07:46.026000 audit: BPF prog-id=65 op=UNLOAD Jan 14 00:07:46.178630 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 14 00:07:46.178000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:07:46.192184 (kubelet)[2832]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 14 00:07:46.243330 kubelet[2832]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 00:07:46.243330 kubelet[2832]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 14 00:07:46.243330 kubelet[2832]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 14 00:07:46.243819 kubelet[2832]: I0114 00:07:46.243312 2832 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 14 00:07:46.257502 kubelet[2832]: I0114 00:07:46.257421 2832 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Jan 14 00:07:46.257502 kubelet[2832]: I0114 00:07:46.257454 2832 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 14 00:07:46.257717 kubelet[2832]: I0114 00:07:46.257698 2832 server.go:956] "Client rotation is on, will bootstrap in background" Jan 14 00:07:46.259071 kubelet[2832]: I0114 00:07:46.259028 2832 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 14 00:07:46.261497 kubelet[2832]: I0114 00:07:46.261450 2832 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 14 00:07:46.272591 kubelet[2832]: I0114 00:07:46.270776 2832 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 14 00:07:46.273583 kubelet[2832]: I0114 00:07:46.273515 2832 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jan 14 00:07:46.273988 kubelet[2832]: I0114 00:07:46.273939 2832 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 14 00:07:46.274244 kubelet[2832]: I0114 00:07:46.274067 2832 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-n-fb1a601aa4","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 14 00:07:46.274438 kubelet[2832]: I0114 00:07:46.274405 2832 topology_manager.go:138] "Creating topology manager with none policy" Jan 14 00:07:46.274497 kubelet[2832]: I0114 00:07:46.274489 2832 container_manager_linux.go:303] "Creating device plugin manager" Jan 14 00:07:46.274618 kubelet[2832]: I0114 00:07:46.274608 2832 state_mem.go:36] "Initialized new in-memory state store" Jan 14 00:07:46.274863 kubelet[2832]: I0114 00:07:46.274847 2832 kubelet.go:480] "Attempting to sync node with API server" Jan 14 00:07:46.274946 kubelet[2832]: I0114 00:07:46.274934 2832 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 14 00:07:46.275034 kubelet[2832]: I0114 00:07:46.275025 2832 kubelet.go:386] "Adding apiserver pod source" Jan 14 00:07:46.275109 kubelet[2832]: I0114 00:07:46.275099 2832 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 14 00:07:46.277782 kubelet[2832]: I0114 00:07:46.277749 2832 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 14 00:07:46.278596 kubelet[2832]: I0114 00:07:46.278573 2832 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 14 00:07:46.287543 kubelet[2832]: I0114 00:07:46.286911 2832 watchdog_linux.go:99] "Systemd watchdog is not enabled" Jan 14 00:07:46.287543 kubelet[2832]: I0114 00:07:46.286964 2832 server.go:1289] "Started kubelet" Jan 14 00:07:46.291565 kubelet[2832]: I0114 00:07:46.289837 2832 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 14 00:07:46.301655 kubelet[2832]: I0114 
00:07:46.301585 2832 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 14 00:07:46.304715 kubelet[2832]: I0114 00:07:46.304318 2832 server.go:317] "Adding debug handlers to kubelet server" Jan 14 00:07:46.311490 kubelet[2832]: I0114 00:07:46.311421 2832 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 14 00:07:46.311806 kubelet[2832]: I0114 00:07:46.311789 2832 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 14 00:07:46.312115 kubelet[2832]: I0114 00:07:46.312096 2832 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 14 00:07:46.314457 kubelet[2832]: I0114 00:07:46.314420 2832 volume_manager.go:297] "Starting Kubelet Volume Manager" Jan 14 00:07:46.314831 kubelet[2832]: E0114 00:07:46.314808 2832 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-n-fb1a601aa4\" not found" Jan 14 00:07:46.324483 kubelet[2832]: I0114 00:07:46.324402 2832 factory.go:223] Registration of the systemd container factory successfully Jan 14 00:07:46.324612 kubelet[2832]: I0114 00:07:46.324586 2832 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 14 00:07:46.325954 kubelet[2832]: I0114 00:07:46.325933 2832 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Jan 14 00:07:46.327229 kubelet[2832]: E0114 00:07:46.326214 2832 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 14 00:07:46.327434 kubelet[2832]: I0114 00:07:46.327420 2832 reconciler.go:26] "Reconciler: start to sync state" Jan 14 00:07:46.330083 kubelet[2832]: I0114 00:07:46.328866 2832 factory.go:223] Registration of the containerd container factory successfully Jan 14 00:07:46.330342 kubelet[2832]: I0114 00:07:46.330280 2832 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Jan 14 00:07:46.332183 kubelet[2832]: I0114 00:07:46.331494 2832 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Jan 14 00:07:46.332183 kubelet[2832]: I0114 00:07:46.331529 2832 status_manager.go:230] "Starting to sync pod status with apiserver" Jan 14 00:07:46.332183 kubelet[2832]: I0114 00:07:46.331560 2832 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
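The deprecation warnings at 00:07:46 (container-runtime-endpoint, pod-infra-container-image, volume-plugin-dir) flag settings that kubelet now expects in its configuration file rather than on the command line. The effective configuration of the running kubelet can be read back through the node's configz endpoint; a sketch assuming kubectl access to this cluster and that the node name seen in the log is still registered:

    import json
    import subprocess

    node = "ci-4547-0-0-n-fb1a601aa4"  # node name as it appears in the log
    raw = subprocess.run(
        ["kubectl", "get", "--raw", f"/api/v1/nodes/{node}/proxy/configz"],
        check=True, capture_output=True, text=True,
    ).stdout
    # The configz response wraps the KubeletConfiguration under "kubeletconfig".
    cfg = json.loads(raw)["kubeletconfig"]
    print("cgroupDriver:", cfg.get("cgroupDriver"))
    print("staticPodPath:", cfg.get("staticPodPath"))
    print("containerRuntimeEndpoint:", cfg.get("containerRuntimeEndpoint"))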
Jan 14 00:07:46.332183 kubelet[2832]: I0114 00:07:46.331567 2832 kubelet.go:2436] "Starting kubelet main sync loop" Jan 14 00:07:46.332183 kubelet[2832]: E0114 00:07:46.331608 2832 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 14 00:07:46.383375 kubelet[2832]: I0114 00:07:46.383343 2832 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 14 00:07:46.383557 kubelet[2832]: I0114 00:07:46.383540 2832 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 14 00:07:46.383622 kubelet[2832]: I0114 00:07:46.383615 2832 state_mem.go:36] "Initialized new in-memory state store" Jan 14 00:07:46.383869 kubelet[2832]: I0114 00:07:46.383847 2832 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 14 00:07:46.383987 kubelet[2832]: I0114 00:07:46.383956 2832 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 14 00:07:46.384062 kubelet[2832]: I0114 00:07:46.384052 2832 policy_none.go:49] "None policy: Start" Jan 14 00:07:46.384129 kubelet[2832]: I0114 00:07:46.384120 2832 memory_manager.go:186] "Starting memorymanager" policy="None" Jan 14 00:07:46.384201 kubelet[2832]: I0114 00:07:46.384191 2832 state_mem.go:35] "Initializing new in-memory state store" Jan 14 00:07:46.384589 kubelet[2832]: I0114 00:07:46.384466 2832 state_mem.go:75] "Updated machine memory state" Jan 14 00:07:46.389858 kubelet[2832]: E0114 00:07:46.389834 2832 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 14 00:07:46.390240 kubelet[2832]: I0114 00:07:46.390227 2832 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 14 00:07:46.390368 kubelet[2832]: I0114 00:07:46.390331 2832 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 14 00:07:46.390933 kubelet[2832]: I0114 00:07:46.390916 2832 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 14 00:07:46.393950 kubelet[2832]: E0114 00:07:46.393899 2832 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 14 00:07:46.433570 kubelet[2832]: I0114 00:07:46.433065 2832 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.435120 kubelet[2832]: I0114 00:07:46.435085 2832 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.435456 kubelet[2832]: I0114 00:07:46.435084 2832 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.495732 kubelet[2832]: I0114 00:07:46.495579 2832 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.509988 kubelet[2832]: I0114 00:07:46.509612 2832 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.509988 kubelet[2832]: I0114 00:07:46.509704 2832 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.528907 kubelet[2832]: I0114 00:07:46.528852 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/08b119e21354abecec3569c2fc59abfa-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-n-fb1a601aa4\" (UID: \"08b119e21354abecec3569c2fc59abfa\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.529221 kubelet[2832]: I0114 00:07:46.529179 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/08b119e21354abecec3569c2fc59abfa-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-n-fb1a601aa4\" (UID: \"08b119e21354abecec3569c2fc59abfa\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.529312 kubelet[2832]: I0114 00:07:46.529295 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: \"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.529481 kubelet[2832]: I0114 00:07:46.529448 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: \"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.529737 kubelet[2832]: I0114 00:07:46.529680 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/08b119e21354abecec3569c2fc59abfa-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-n-fb1a601aa4\" (UID: \"08b119e21354abecec3569c2fc59abfa\") " pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.529892 kubelet[2832]: I0114 00:07:46.529834 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: 
\"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.530055 kubelet[2832]: I0114 00:07:46.529868 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: \"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.530055 kubelet[2832]: I0114 00:07:46.530007 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0eec82d12248a2d372141dae549c6f8e-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-n-fb1a601aa4\" (UID: \"0eec82d12248a2d372141dae549c6f8e\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:46.530384 kubelet[2832]: I0114 00:07:46.530347 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/ddea395bf63b2d8c86b065d7e97bf509-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-n-fb1a601aa4\" (UID: \"ddea395bf63b2d8c86b065d7e97bf509\") " pod="kube-system/kube-scheduler-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:47.276759 kubelet[2832]: I0114 00:07:47.276709 2832 apiserver.go:52] "Watching apiserver" Jan 14 00:07:47.327196 kubelet[2832]: I0114 00:07:47.327123 2832 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Jan 14 00:07:47.369588 kubelet[2832]: I0114 00:07:47.368450 2832 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:47.369588 kubelet[2832]: I0114 00:07:47.368970 2832 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:47.385501 kubelet[2832]: E0114 00:07:47.385463 2832 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-n-fb1a601aa4\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:47.386365 kubelet[2832]: E0114 00:07:47.386045 2832 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-n-fb1a601aa4\" already exists" pod="kube-system/kube-scheduler-ci-4547-0-0-n-fb1a601aa4" Jan 14 00:07:47.401343 kubelet[2832]: I0114 00:07:47.401257 2832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-n-fb1a601aa4" podStartSLOduration=1.401239179 podStartE2EDuration="1.401239179s" podCreationTimestamp="2026-01-14 00:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 00:07:47.400560326 +0000 UTC m=+1.202971633" watchObservedRunningTime="2026-01-14 00:07:47.401239179 +0000 UTC m=+1.203650486" Jan 14 00:07:47.435566 kubelet[2832]: I0114 00:07:47.433971 2832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-n-fb1a601aa4" podStartSLOduration=1.433945838 podStartE2EDuration="1.433945838s" podCreationTimestamp="2026-01-14 00:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 00:07:47.41881457 +0000 UTC m=+1.221225877" 
watchObservedRunningTime="2026-01-14 00:07:47.433945838 +0000 UTC m=+1.236357185" Jan 14 00:07:47.455444 kubelet[2832]: I0114 00:07:47.455307 2832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-n-fb1a601aa4" podStartSLOduration=1.455282633 podStartE2EDuration="1.455282633s" podCreationTimestamp="2026-01-14 00:07:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 00:07:47.436111193 +0000 UTC m=+1.238522500" watchObservedRunningTime="2026-01-14 00:07:47.455282633 +0000 UTC m=+1.257694020" Jan 14 00:07:51.838936 kubelet[2832]: I0114 00:07:51.838877 2832 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 14 00:07:51.839378 containerd[1590]: time="2026-01-14T00:07:51.839204627Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 14 00:07:51.839736 kubelet[2832]: I0114 00:07:51.839387 2832 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 14 00:07:52.560500 systemd[1]: Created slice kubepods-besteffort-pod50895e1c_ac3b_40b8_802b_592fc0469b8d.slice - libcontainer container kubepods-besteffort-pod50895e1c_ac3b_40b8_802b_592fc0469b8d.slice. Jan 14 00:07:52.573867 kubelet[2832]: I0114 00:07:52.573815 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/50895e1c-ac3b-40b8-802b-592fc0469b8d-xtables-lock\") pod \"kube-proxy-djh8f\" (UID: \"50895e1c-ac3b-40b8-802b-592fc0469b8d\") " pod="kube-system/kube-proxy-djh8f" Jan 14 00:07:52.573867 kubelet[2832]: I0114 00:07:52.573860 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd9pt\" (UniqueName: \"kubernetes.io/projected/50895e1c-ac3b-40b8-802b-592fc0469b8d-kube-api-access-jd9pt\") pod \"kube-proxy-djh8f\" (UID: \"50895e1c-ac3b-40b8-802b-592fc0469b8d\") " pod="kube-system/kube-proxy-djh8f" Jan 14 00:07:52.574026 kubelet[2832]: I0114 00:07:52.573924 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/50895e1c-ac3b-40b8-802b-592fc0469b8d-kube-proxy\") pod \"kube-proxy-djh8f\" (UID: \"50895e1c-ac3b-40b8-802b-592fc0469b8d\") " pod="kube-system/kube-proxy-djh8f" Jan 14 00:07:52.574026 kubelet[2832]: I0114 00:07:52.573962 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/50895e1c-ac3b-40b8-802b-592fc0469b8d-lib-modules\") pod \"kube-proxy-djh8f\" (UID: \"50895e1c-ac3b-40b8-802b-592fc0469b8d\") " pod="kube-system/kube-proxy-djh8f" Jan 14 00:07:52.773661 systemd[1]: Created slice kubepods-besteffort-pod67487d81_7ef0_446e_9b49_669d47cd83cc.slice - libcontainer container kubepods-besteffort-pod67487d81_7ef0_446e_9b49_669d47cd83cc.slice. 
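With the kube-proxy-djh8f pod created above, the NETFILTER_CFG audit records that follow show kube-proxy registering its KUBE-* chains (KUBE-PROXY-CANARY, KUBE-EXTERNAL-SERVICES, KUBE-NODEPORTS, KUBE-SERVICES, KUBE-FORWARD, KUBE-PROXY-FIREWALL, KUBE-POSTROUTING, ...) in the mangle, nat and filter tables via xtables-nft-multi. A quick way to confirm those chains on the node itself, assuming root access and the same iptables-nft backend; a sketch, not part of the captured log:

    import subprocess

    # List the KUBE-* chains kube-proxy has programmed (run as root on the node).
    for table in ("nat", "filter"):
        dump = subprocess.run(
            ["iptables-save", "-t", table],
            check=True, capture_output=True, text=True,
        ).stdout
        # Chain declarations in iptables-save output start with ":<chain-name>".
        chains = sorted(line.split()[0].lstrip(":")
                        for line in dump.splitlines()
                        if line.startswith(":KUBE-"))
        print(f"{table}: {', '.join(chains)}")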
Jan 14 00:07:52.872423 containerd[1590]: time="2026-01-14T00:07:52.872106734Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-djh8f,Uid:50895e1c-ac3b-40b8-802b-592fc0469b8d,Namespace:kube-system,Attempt:0,}" Jan 14 00:07:52.876063 kubelet[2832]: I0114 00:07:52.875948 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrbq2\" (UniqueName: \"kubernetes.io/projected/67487d81-7ef0-446e-9b49-669d47cd83cc-kube-api-access-qrbq2\") pod \"tigera-operator-7dcd859c48-h5t2k\" (UID: \"67487d81-7ef0-446e-9b49-669d47cd83cc\") " pod="tigera-operator/tigera-operator-7dcd859c48-h5t2k" Jan 14 00:07:52.876063 kubelet[2832]: I0114 00:07:52.876010 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/67487d81-7ef0-446e-9b49-669d47cd83cc-var-lib-calico\") pod \"tigera-operator-7dcd859c48-h5t2k\" (UID: \"67487d81-7ef0-446e-9b49-669d47cd83cc\") " pod="tigera-operator/tigera-operator-7dcd859c48-h5t2k" Jan 14 00:07:52.899062 containerd[1590]: time="2026-01-14T00:07:52.898704659Z" level=info msg="connecting to shim 35208f41759db0500ae0aa177be3e8109eda432e46c6515eec5c744db5099999" address="unix:///run/containerd/s/3f796bd9799a912caf8ee23338b155b0a70a070afbe7d1653bb43c03d67df648" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:07:52.929915 systemd[1]: Started cri-containerd-35208f41759db0500ae0aa177be3e8109eda432e46c6515eec5c744db5099999.scope - libcontainer container 35208f41759db0500ae0aa177be3e8109eda432e46c6515eec5c744db5099999. Jan 14 00:07:52.943693 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 14 00:07:52.943782 kernel: audit: type=1334 audit(1768349272.941:434): prog-id=131 op=LOAD Jan 14 00:07:52.941000 audit: BPF prog-id=131 op=LOAD Jan 14 00:07:52.946548 kernel: audit: type=1334 audit(1768349272.942:435): prog-id=132 op=LOAD Jan 14 00:07:52.946671 kernel: audit: type=1300 audit(1768349272.942:435): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.942000 audit: BPF prog-id=132 op=LOAD Jan 14 00:07:52.942000 audit[2901]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.942000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.948860 kernel: audit: type=1327 audit(1768349272.942:435): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.942000 audit: BPF prog-id=132 op=UNLOAD Jan 14 00:07:52.949824 kernel: audit: type=1334 audit(1768349272.942:436): prog-id=132 op=UNLOAD Jan 14 00:07:52.942000 audit[2901]: SYSCALL arch=c00000b7 syscall=57 
success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.952475 kernel: audit: type=1300 audit(1768349272.942:436): arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.942000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.955001 kernel: audit: type=1327 audit(1768349272.942:436): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.942000 audit: BPF prog-id=133 op=LOAD Jan 14 00:07:52.956188 kernel: audit: type=1334 audit(1768349272.942:437): prog-id=133 op=LOAD Jan 14 00:07:52.942000 audit[2901]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.958547 kernel: audit: type=1300 audit(1768349272.942:437): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.942000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.961094 kernel: audit: type=1327 audit(1768349272.942:437): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.943000 audit: BPF prog-id=134 op=LOAD Jan 14 00:07:52.943000 audit[2901]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.943000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.945000 audit: BPF prog-id=134 op=UNLOAD Jan 14 00:07:52.945000 audit[2901]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 
a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.945000 audit: BPF prog-id=133 op=UNLOAD Jan 14 00:07:52.945000 audit[2901]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.945000 audit: BPF prog-id=135 op=LOAD Jan 14 00:07:52.945000 audit[2901]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=2890 pid=2901 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:52.945000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3335323038663431373539646230353030616530616131373762653365 Jan 14 00:07:52.981156 containerd[1590]: time="2026-01-14T00:07:52.981113720Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-djh8f,Uid:50895e1c-ac3b-40b8-802b-592fc0469b8d,Namespace:kube-system,Attempt:0,} returns sandbox id \"35208f41759db0500ae0aa177be3e8109eda432e46c6515eec5c744db5099999\"" Jan 14 00:07:52.992767 containerd[1590]: time="2026-01-14T00:07:52.992712516Z" level=info msg="CreateContainer within sandbox \"35208f41759db0500ae0aa177be3e8109eda432e46c6515eec5c744db5099999\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 14 00:07:53.018762 containerd[1590]: time="2026-01-14T00:07:53.018692306Z" level=info msg="Container db97c2a3acc1d308fd1d3b43dbc819c77f659ba68d45242dc0ebe7241dc5a9af: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:07:53.029302 containerd[1590]: time="2026-01-14T00:07:53.029224039Z" level=info msg="CreateContainer within sandbox \"35208f41759db0500ae0aa177be3e8109eda432e46c6515eec5c744db5099999\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"db97c2a3acc1d308fd1d3b43dbc819c77f659ba68d45242dc0ebe7241dc5a9af\"" Jan 14 00:07:53.031676 containerd[1590]: time="2026-01-14T00:07:53.031630172Z" level=info msg="StartContainer for \"db97c2a3acc1d308fd1d3b43dbc819c77f659ba68d45242dc0ebe7241dc5a9af\"" Jan 14 00:07:53.034336 containerd[1590]: time="2026-01-14T00:07:53.034296900Z" level=info msg="connecting to shim db97c2a3acc1d308fd1d3b43dbc819c77f659ba68d45242dc0ebe7241dc5a9af" address="unix:///run/containerd/s/3f796bd9799a912caf8ee23338b155b0a70a070afbe7d1653bb43c03d67df648" protocol=ttrpc version=3 Jan 14 00:07:53.054766 systemd[1]: Started 
cri-containerd-db97c2a3acc1d308fd1d3b43dbc819c77f659ba68d45242dc0ebe7241dc5a9af.scope - libcontainer container db97c2a3acc1d308fd1d3b43dbc819c77f659ba68d45242dc0ebe7241dc5a9af. Jan 14 00:07:53.079766 containerd[1590]: time="2026-01-14T00:07:53.079643482Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-h5t2k,Uid:67487d81-7ef0-446e-9b49-669d47cd83cc,Namespace:tigera-operator,Attempt:0,}" Jan 14 00:07:53.102000 audit: BPF prog-id=136 op=LOAD Jan 14 00:07:53.102000 audit[2928]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2890 pid=2928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.102000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462393763326133616363316433303866643164336234336462633831 Jan 14 00:07:53.102000 audit: BPF prog-id=137 op=LOAD Jan 14 00:07:53.102000 audit[2928]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2890 pid=2928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.102000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462393763326133616363316433303866643164336234336462633831 Jan 14 00:07:53.102000 audit: BPF prog-id=137 op=UNLOAD Jan 14 00:07:53.102000 audit[2928]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2890 pid=2928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.102000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462393763326133616363316433303866643164336234336462633831 Jan 14 00:07:53.103000 audit: BPF prog-id=136 op=UNLOAD Jan 14 00:07:53.103000 audit[2928]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2890 pid=2928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.103000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462393763326133616363316433303866643164336234336462633831 Jan 14 00:07:53.103000 audit: BPF prog-id=138 op=LOAD Jan 14 00:07:53.103000 audit[2928]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2890 pid=2928 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.103000 audit: 
PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462393763326133616363316433303866643164336234336462633831 Jan 14 00:07:53.105692 containerd[1590]: time="2026-01-14T00:07:53.105589246Z" level=info msg="connecting to shim e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495" address="unix:///run/containerd/s/efa8e772016b0eb7b52b1c2214aef417240b316ade27f4c8e890824fd30a31ce" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:07:53.143362 containerd[1590]: time="2026-01-14T00:07:53.142216169Z" level=info msg="StartContainer for \"db97c2a3acc1d308fd1d3b43dbc819c77f659ba68d45242dc0ebe7241dc5a9af\" returns successfully" Jan 14 00:07:53.149765 systemd[1]: Started cri-containerd-e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495.scope - libcontainer container e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495. Jan 14 00:07:53.161000 audit: BPF prog-id=139 op=LOAD Jan 14 00:07:53.162000 audit: BPF prog-id=140 op=LOAD Jan 14 00:07:53.162000 audit[2976]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2956 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.162000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533316237306336643765333730333665303439656133353631353831 Jan 14 00:07:53.162000 audit: BPF prog-id=140 op=UNLOAD Jan 14 00:07:53.162000 audit[2976]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2956 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.162000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533316237306336643765333730333665303439656133353631353831 Jan 14 00:07:53.165000 audit: BPF prog-id=141 op=LOAD Jan 14 00:07:53.165000 audit[2976]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2956 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.165000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533316237306336643765333730333665303439656133353631353831 Jan 14 00:07:53.165000 audit: BPF prog-id=142 op=LOAD Jan 14 00:07:53.165000 audit[2976]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=2956 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.165000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533316237306336643765333730333665303439656133353631353831 Jan 14 00:07:53.165000 audit: BPF prog-id=142 op=UNLOAD Jan 14 00:07:53.165000 audit[2976]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2956 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.165000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533316237306336643765333730333665303439656133353631353831 Jan 14 00:07:53.165000 audit: BPF prog-id=141 op=UNLOAD Jan 14 00:07:53.165000 audit[2976]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2956 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.165000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533316237306336643765333730333665303439656133353631353831 Jan 14 00:07:53.165000 audit: BPF prog-id=143 op=LOAD Jan 14 00:07:53.165000 audit[2976]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=2956 pid=2976 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.165000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6533316237306336643765333730333665303439656133353631353831 Jan 14 00:07:53.200297 containerd[1590]: time="2026-01-14T00:07:53.200255350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-h5t2k,Uid:67487d81-7ef0-446e-9b49-669d47cd83cc,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495\"" Jan 14 00:07:53.203545 containerd[1590]: time="2026-01-14T00:07:53.203465996Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 14 00:07:53.335000 audit[3040]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3040 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.335000 audit[3040]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc3afc6d0 a2=0 a3=1 items=0 ppid=2942 pid=3040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.335000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 00:07:53.338000 audit[3044]: NETFILTER_CFG table=nat:55 family=2 entries=1 op=nft_register_chain 
pid=3044 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.338000 audit[3044]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdc4c7030 a2=0 a3=1 items=0 ppid=2942 pid=3044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.338000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 00:07:53.341000 audit[3045]: NETFILTER_CFG table=filter:56 family=2 entries=1 op=nft_register_chain pid=3045 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.341000 audit[3045]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffc6c5330 a2=0 a3=1 items=0 ppid=2942 pid=3045 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.341000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 00:07:53.345000 audit[3046]: NETFILTER_CFG table=mangle:57 family=10 entries=1 op=nft_register_chain pid=3046 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.345000 audit[3046]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffee4bd340 a2=0 a3=1 items=0 ppid=2942 pid=3046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.345000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 14 00:07:53.350000 audit[3047]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3047 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.350000 audit[3047]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe373eb60 a2=0 a3=1 items=0 ppid=2942 pid=3047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.350000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 14 00:07:53.352000 audit[3048]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3048 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.352000 audit[3048]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffd9a0790 a2=0 a3=1 items=0 ppid=2942 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.352000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 14 00:07:53.402483 kubelet[2832]: I0114 00:07:53.402303 2832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-djh8f" podStartSLOduration=1.402279235 podStartE2EDuration="1.402279235s" podCreationTimestamp="2026-01-14 00:07:52 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 00:07:53.402193037 +0000 UTC m=+7.204604385" watchObservedRunningTime="2026-01-14 00:07:53.402279235 +0000 UTC m=+7.204690582" Jan 14 00:07:53.442000 audit[3049]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3049 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.442000 audit[3049]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffcb742150 a2=0 a3=1 items=0 ppid=2942 pid=3049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.442000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 00:07:53.445000 audit[3051]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.445000 audit[3051]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd5ae3820 a2=0 a3=1 items=0 ppid=2942 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.445000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Jan 14 00:07:53.449000 audit[3054]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.449000 audit[3054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd90ccac0 a2=0 a3=1 items=0 ppid=2942 pid=3054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.449000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Jan 14 00:07:53.451000 audit[3055]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3055 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.451000 audit[3055]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffeb3c0030 a2=0 a3=1 items=0 ppid=2942 pid=3055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.451000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 00:07:53.455000 audit[3057]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.455000 audit[3057]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffff793490 a2=0 a3=1 items=0 ppid=2942 pid=3057 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.455000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 00:07:53.457000 audit[3058]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3058 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.457000 audit[3058]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdc4ba250 a2=0 a3=1 items=0 ppid=2942 pid=3058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.457000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 00:07:53.461000 audit[3060]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3060 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.461000 audit[3060]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffffca0f0c0 a2=0 a3=1 items=0 ppid=2942 pid=3060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.461000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 14 00:07:53.466000 audit[3063]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3063 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.466000 audit[3063]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffc6b5e6a0 a2=0 a3=1 items=0 ppid=2942 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.466000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Jan 14 00:07:53.467000 audit[3064]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3064 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.467000 audit[3064]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc797b930 a2=0 a3=1 items=0 ppid=2942 pid=3064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.467000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 00:07:53.470000 audit[3066]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3066 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.470000 audit[3066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffcd0fb440 a2=0 a3=1 items=0 ppid=2942 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.470000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 00:07:53.471000 audit[3067]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3067 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.471000 audit[3067]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffce13faa0 a2=0 a3=1 items=0 ppid=2942 pid=3067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.471000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 14 00:07:53.474000 audit[3069]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3069 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.474000 audit[3069]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffc36a9850 a2=0 a3=1 items=0 ppid=2942 pid=3069 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.474000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 00:07:53.478000 audit[3072]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3072 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.478000 audit[3072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffee323930 a2=0 a3=1 items=0 ppid=2942 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.478000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 00:07:53.483000 audit[3075]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3075 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.483000 audit[3075]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffec97d720 a2=0 a3=1 items=0 ppid=2942 pid=3075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.483000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 14 00:07:53.484000 audit[3076]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3076 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.484000 audit[3076]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffec4005a0 a2=0 a3=1 items=0 ppid=2942 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.484000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 00:07:53.487000 audit[3078]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3078 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.487000 audit[3078]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=fffffdd31890 a2=0 a3=1 items=0 ppid=2942 pid=3078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.487000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 00:07:53.490000 audit[3081]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3081 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.490000 audit[3081]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc0c169c0 a2=0 a3=1 items=0 ppid=2942 pid=3081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.490000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 00:07:53.492000 audit[3082]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3082 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.492000 audit[3082]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc38ef520 a2=0 a3=1 items=0 ppid=2942 pid=3082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.492000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 00:07:53.496000 audit[3084]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3084 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 14 00:07:53.496000 audit[3084]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffe5027a60 a2=0 a3=1 items=0 ppid=2942 pid=3084 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.496000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 00:07:53.520000 audit[3090]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3090 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:07:53.520000 audit[3090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe7c2f5c0 a2=0 a3=1 items=0 ppid=2942 pid=3090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.520000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:07:53.526000 audit[3090]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3090 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:07:53.526000 audit[3090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffe7c2f5c0 a2=0 a3=1 items=0 ppid=2942 pid=3090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.526000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:07:53.528000 audit[3095]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3095 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.528000 audit[3095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffe33a2da0 a2=0 a3=1 items=0 ppid=2942 pid=3095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.528000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 14 00:07:53.532000 audit[3097]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.532000 audit[3097]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=fffff9ea21a0 a2=0 a3=1 items=0 ppid=2942 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.532000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Jan 14 00:07:53.536000 audit[3100]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.536000 audit[3100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffeb05ad60 a2=0 a3=1 items=0 ppid=2942 pid=3100 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.536000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Jan 14 00:07:53.538000 audit[3101]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3101 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.538000 audit[3101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd26d10b0 a2=0 a3=1 items=0 ppid=2942 pid=3101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.538000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 14 00:07:53.540000 audit[3103]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3103 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.540000 audit[3103]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdc91e690 a2=0 a3=1 items=0 ppid=2942 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.540000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 14 00:07:53.541000 audit[3104]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3104 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.541000 audit[3104]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc28a2dc0 a2=0 a3=1 items=0 ppid=2942 pid=3104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.541000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Jan 14 00:07:53.544000 audit[3106]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3106 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.544000 audit[3106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffef337490 a2=0 a3=1 items=0 ppid=2942 pid=3106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.544000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Jan 14 00:07:53.547000 audit[3109]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain 
pid=3109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.547000 audit[3109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffd5ed9080 a2=0 a3=1 items=0 ppid=2942 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.547000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Jan 14 00:07:53.548000 audit[3110]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3110 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.548000 audit[3110]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffed513a00 a2=0 a3=1 items=0 ppid=2942 pid=3110 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.548000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Jan 14 00:07:53.550000 audit[3112]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.550000 audit[3112]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdc7fa760 a2=0 a3=1 items=0 ppid=2942 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.550000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 14 00:07:53.551000 audit[3113]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3113 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.551000 audit[3113]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd3ccdac0 a2=0 a3=1 items=0 ppid=2942 pid=3113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.551000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 14 00:07:53.554000 audit[3115]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3115 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.554000 audit[3115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffca6a450 a2=0 a3=1 items=0 ppid=2942 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.554000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Jan 14 00:07:53.557000 audit[3118]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3118 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.557000 audit[3118]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd7bdb950 a2=0 a3=1 items=0 ppid=2942 pid=3118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.557000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Jan 14 00:07:53.561000 audit[3121]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.561000 audit[3121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcb276ec0 a2=0 a3=1 items=0 ppid=2942 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.561000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Jan 14 00:07:53.562000 audit[3122]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3122 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.562000 audit[3122]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffcebadbb0 a2=0 a3=1 items=0 ppid=2942 pid=3122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.562000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Jan 14 00:07:53.564000 audit[3124]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.564000 audit[3124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffdc24ff50 a2=0 a3=1 items=0 ppid=2942 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.564000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 00:07:53.567000 audit[3127]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3127 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.567000 audit[3127]: SYSCALL arch=c00000b7 
syscall=211 success=yes exit=528 a0=3 a1=ffffc78abdc0 a2=0 a3=1 items=0 ppid=2942 pid=3127 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.567000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 14 00:07:53.569000 audit[3128]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.569000 audit[3128]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc8699310 a2=0 a3=1 items=0 ppid=2942 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.569000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 14 00:07:53.571000 audit[3130]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3130 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.571000 audit[3130]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffef24e7e0 a2=0 a3=1 items=0 ppid=2942 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.571000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 14 00:07:53.572000 audit[3131]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3131 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.572000 audit[3131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc89571a0 a2=0 a3=1 items=0 ppid=2942 pid=3131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.572000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 14 00:07:53.575000 audit[3133]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3133 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 00:07:53.575000 audit[3133]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd189e700 a2=0 a3=1 items=0 ppid=2942 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.575000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 00:07:53.580000 audit[3136]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3136 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 14 
00:07:53.580000 audit[3136]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=fffff6a480f0 a2=0 a3=1 items=0 ppid=2942 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.580000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 14 00:07:53.585000 audit[3138]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3138 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 00:07:53.585000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffe45cb2d0 a2=0 a3=1 items=0 ppid=2942 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.585000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:07:53.586000 audit[3138]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 14 00:07:53.586000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffe45cb2d0 a2=0 a3=1 items=0 ppid=2942 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:53.586000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:07:55.122126 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3918897326.mount: Deactivated successfully. 
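The PROCTITLE values in the audit records above are hex-encoded, NUL-separated argv strings; decoding them recovers the exact commands kube-proxy ran while building its KUBE-* chains, for example `ip6tables -w 5 -W 100000 -N KUBE-FIREWALL -t filter` and, for the batched updates, `iptables-restore -w 5 -W 100000 --noflush --counters`. A minimal decoding sketch in Python, using one value copied from the ip6tables-restore record above:

```python
#!/usr/bin/env python3
# Decode an audit PROCTITLE field: the process argv, NUL-separated and hex-encoded.
def decode_proctitle(hexstr: str) -> str:
    argv = bytes.fromhex(hexstr).split(b"\x00")
    return " ".join(arg.decode("utf-8", "replace") for arg in argv if arg)

# Value taken from the ip6tables-restore audit record above.
sample = ("6970367461626C65732D726573746F7265002D770035002D5700"
          "313030303030002D2D6E6F666C757368002D2D636F756E74657273")
print(decode_proctitle(sample))
# -> ip6tables-restore -w 5 -W 100000 --noflush --counters
```

Note that the audit subsystem truncates long command lines, which is why some proctitle values above end mid-argument.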
Jan 14 00:07:55.519615 containerd[1590]: time="2026-01-14T00:07:55.519481360Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:55.521983 containerd[1590]: time="2026-01-14T00:07:55.521830967Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=0" Jan 14 00:07:55.523592 containerd[1590]: time="2026-01-14T00:07:55.523264293Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:55.526997 containerd[1590]: time="2026-01-14T00:07:55.526962552Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.323452937s" Jan 14 00:07:55.527220 containerd[1590]: time="2026-01-14T00:07:55.527096685Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 14 00:07:55.527626 containerd[1590]: time="2026-01-14T00:07:55.527594802Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:07:55.533554 containerd[1590]: time="2026-01-14T00:07:55.533483926Z" level=info msg="CreateContainer within sandbox \"e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 14 00:07:55.545964 containerd[1590]: time="2026-01-14T00:07:55.545380661Z" level=info msg="Container 400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:07:55.563219 containerd[1590]: time="2026-01-14T00:07:55.563172163Z" level=info msg="CreateContainer within sandbox \"e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b\"" Jan 14 00:07:55.564635 containerd[1590]: time="2026-01-14T00:07:55.564511211Z" level=info msg="StartContainer for \"400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b\"" Jan 14 00:07:55.566202 containerd[1590]: time="2026-01-14T00:07:55.566159622Z" level=info msg="connecting to shim 400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b" address="unix:///run/containerd/s/efa8e772016b0eb7b52b1c2214aef417240b316ade27f4c8e890824fd30a31ce" protocol=ttrpc version=3 Jan 14 00:07:55.593937 systemd[1]: Started cri-containerd-400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b.scope - libcontainer container 400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b. 
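Per the containerd entries above, the tigera-operator image was pulled by digest in roughly 2.3 s and a container was created and started through the runtime shim. A sketch along the following lines can cross-check the stored image against the digest logged above; it assumes crictl is installed on the node and pointed at this containerd instance, and the JSON field names (`status`, `repoDigests`) reflect typical crictl output and may vary by version:

```python
import json
import subprocess

# Image reference and digest as recorded in the containerd log above.
image = "quay.io/tigera/operator:v1.38.7"
logged_digest = "sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e"

# Ask the CRI runtime for the image status and compare against the logged digest.
raw = subprocess.run(["crictl", "inspecti", "-o", "json", image],
                     capture_output=True, text=True, check=True).stdout
status = json.loads(raw).get("status", {})
print(status.get("repoDigests"))   # expect the digest above to appear here
print(logged_digest in raw)        # crude cross-check against the raw JSON
```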
Jan 14 00:07:55.607000 audit: BPF prog-id=144 op=LOAD Jan 14 00:07:55.608000 audit: BPF prog-id=145 op=LOAD Jan 14 00:07:55.608000 audit[3147]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2956 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:55.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430303832316136663161386633666563656130323561393466663639 Jan 14 00:07:55.608000 audit: BPF prog-id=145 op=UNLOAD Jan 14 00:07:55.608000 audit[3147]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2956 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:55.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430303832316136663161386633666563656130323561393466663639 Jan 14 00:07:55.608000 audit: BPF prog-id=146 op=LOAD Jan 14 00:07:55.608000 audit[3147]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2956 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:55.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430303832316136663161386633666563656130323561393466663639 Jan 14 00:07:55.608000 audit: BPF prog-id=147 op=LOAD Jan 14 00:07:55.608000 audit[3147]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2956 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:55.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430303832316136663161386633666563656130323561393466663639 Jan 14 00:07:55.608000 audit: BPF prog-id=147 op=UNLOAD Jan 14 00:07:55.608000 audit[3147]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2956 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:55.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430303832316136663161386633666563656130323561393466663639 Jan 14 00:07:55.608000 audit: BPF prog-id=146 op=UNLOAD Jan 14 00:07:55.608000 audit[3147]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2956 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:55.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430303832316136663161386633666563656130323561393466663639 Jan 14 00:07:55.608000 audit: BPF prog-id=148 op=LOAD Jan 14 00:07:55.608000 audit[3147]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2956 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:07:55.608000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3430303832316136663161386633666563656130323561393466663639 Jan 14 00:07:55.634389 containerd[1590]: time="2026-01-14T00:07:55.634353175Z" level=info msg="StartContainer for \"400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b\" returns successfully" Jan 14 00:07:56.442312 kubelet[2832]: I0114 00:07:56.442145 2832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-h5t2k" podStartSLOduration=2.116384896 podStartE2EDuration="4.442122887s" podCreationTimestamp="2026-01-14 00:07:52 +0000 UTC" firstStartedPulling="2026-01-14 00:07:53.202775814 +0000 UTC m=+7.005187121" lastFinishedPulling="2026-01-14 00:07:55.528513805 +0000 UTC m=+9.330925112" observedRunningTime="2026-01-14 00:07:56.442062104 +0000 UTC m=+10.244473411" watchObservedRunningTime="2026-01-14 00:07:56.442122887 +0000 UTC m=+10.244534194" Jan 14 00:08:01.722971 sudo[1870]: pam_unix(sudo:session): session closed for user root Jan 14 00:08:01.723000 audit[1870]: USER_END pid=1870 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 00:08:01.726580 kernel: kauditd_printk_skb: 224 callbacks suppressed Jan 14 00:08:01.726690 kernel: audit: type=1106 audit(1768349281.723:514): pid=1870 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 00:08:01.723000 audit[1870]: CRED_DISP pid=1870 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 14 00:08:01.728394 kernel: audit: type=1104 audit(1768349281.723:515): pid=1870 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 14 00:08:01.823867 sshd[1869]: Connection closed by 4.153.228.146 port 59768 Jan 14 00:08:01.825079 sshd-session[1865]: pam_unix(sshd:session): session closed for user core Jan 14 00:08:01.826000 audit[1865]: USER_END pid=1865 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:08:01.826000 audit[1865]: CRED_DISP pid=1865 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:08:01.833508 kernel: audit: type=1106 audit(1768349281.826:516): pid=1865 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:08:01.833653 kernel: audit: type=1104 audit(1768349281.826:517): pid=1865 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:08:01.834125 systemd[1]: sshd@6-46.224.77.139:22-4.153.228.146:59768.service: Deactivated successfully. Jan 14 00:08:01.838874 kernel: audit: type=1131 audit(1768349281.833:518): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-46.224.77.139:22-4.153.228.146:59768 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:08:01.833000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-46.224.77.139:22-4.153.228.146:59768 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:08:01.839140 systemd[1]: session-8.scope: Deactivated successfully. Jan 14 00:08:01.847630 systemd[1]: session-8.scope: Consumed 6.335s CPU time, 223M memory peak. Jan 14 00:08:01.849056 systemd-logind[1545]: Session 8 logged out. Waiting for processes to exit. Jan 14 00:08:01.855975 systemd-logind[1545]: Removed session 8. 
Jan 14 00:08:03.250000 audit[3226]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:03.250000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffed6c18d0 a2=0 a3=1 items=0 ppid=2942 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:03.253815 kernel: audit: type=1325 audit(1768349283.250:519): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:03.258176 kernel: audit: type=1300 audit(1768349283.250:519): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffed6c18d0 a2=0 a3=1 items=0 ppid=2942 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:03.258257 kernel: audit: type=1327 audit(1768349283.250:519): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:03.250000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:03.258000 audit[3226]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:03.260337 kernel: audit: type=1325 audit(1768349283.258:520): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:03.258000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffed6c18d0 a2=0 a3=1 items=0 ppid=2942 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:03.258000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:03.263964 kernel: audit: type=1300 audit(1768349283.258:520): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffed6c18d0 a2=0 a3=1 items=0 ppid=2942 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:03.273000 audit[3228]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3228 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:03.273000 audit[3228]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=fffff728c4a0 a2=0 a3=1 items=0 ppid=2942 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:03.273000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:03.278000 audit[3228]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3228 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Jan 14 00:08:03.278000 audit[3228]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff728c4a0 a2=0 a3=1 items=0 ppid=2942 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:03.278000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:09.334000 audit[3230]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3230 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:09.338164 kernel: kauditd_printk_skb: 7 callbacks suppressed Jan 14 00:08:09.338221 kernel: audit: type=1325 audit(1768349289.334:523): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3230 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:09.334000 audit[3230]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffeb443bd0 a2=0 a3=1 items=0 ppid=2942 pid=3230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:09.340885 kernel: audit: type=1300 audit(1768349289.334:523): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffeb443bd0 a2=0 a3=1 items=0 ppid=2942 pid=3230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:09.334000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:09.342162 kernel: audit: type=1327 audit(1768349289.334:523): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:09.341000 audit[3230]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3230 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:09.341000 audit[3230]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffeb443bd0 a2=0 a3=1 items=0 ppid=2942 pid=3230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:09.346805 kernel: audit: type=1325 audit(1768349289.341:524): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3230 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:09.346857 kernel: audit: type=1300 audit(1768349289.341:524): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffeb443bd0 a2=0 a3=1 items=0 ppid=2942 pid=3230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:09.341000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:09.348053 kernel: audit: type=1327 audit(1768349289.341:524): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:09.404000 audit[3232]: NETFILTER_CFG 
table=filter:111 family=2 entries=19 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:09.404000 audit[3232]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffef7f58c0 a2=0 a3=1 items=0 ppid=2942 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:09.409752 kernel: audit: type=1325 audit(1768349289.404:525): table=filter:111 family=2 entries=19 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:09.410656 kernel: audit: type=1300 audit(1768349289.404:525): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffef7f58c0 a2=0 a3=1 items=0 ppid=2942 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:09.404000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:09.412415 kernel: audit: type=1327 audit(1768349289.404:525): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:09.425000 audit[3232]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:09.429590 kernel: audit: type=1325 audit(1768349289.425:526): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3232 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:09.425000 audit[3232]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffef7f58c0 a2=0 a3=1 items=0 ppid=2942 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:09.425000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:10.509000 audit[3234]: NETFILTER_CFG table=filter:113 family=2 entries=20 op=nft_register_rule pid=3234 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:10.509000 audit[3234]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffea1ccba0 a2=0 a3=1 items=0 ppid=2942 pid=3234 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:10.509000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:10.513000 audit[3234]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3234 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:10.513000 audit[3234]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffea1ccba0 a2=0 a3=1 items=0 ppid=2942 pid=3234 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:10.513000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:12.246000 audit[3236]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3236 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:12.246000 audit[3236]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffc9ad2690 a2=0 a3=1 items=0 ppid=2942 pid=3236 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.246000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:12.256000 audit[3236]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3236 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:12.256000 audit[3236]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc9ad2690 a2=0 a3=1 items=0 ppid=2942 pid=3236 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.256000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:12.306243 systemd[1]: Created slice kubepods-besteffort-pod8ecf96b0_2402_4018_ae55_5051bab3b041.slice - libcontainer container kubepods-besteffort-pod8ecf96b0_2402_4018_ae55_5051bab3b041.slice. Jan 14 00:08:12.368000 audit[3238]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=3238 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:12.368000 audit[3238]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffc7233ba0 a2=0 a3=1 items=0 ppid=2942 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.368000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:12.372000 audit[3238]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=3238 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:12.372000 audit[3238]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc7233ba0 a2=0 a3=1 items=0 ppid=2942 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.372000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:12.410172 kubelet[2832]: I0114 00:08:12.409956 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8ecf96b0-2402-4018-ae55-5051bab3b041-tigera-ca-bundle\") pod \"calico-typha-6b486c66d5-cdrrg\" (UID: \"8ecf96b0-2402-4018-ae55-5051bab3b041\") " pod="calico-system/calico-typha-6b486c66d5-cdrrg" Jan 14 00:08:12.410172 kubelet[2832]: I0114 00:08:12.410031 2832 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kv9k5\" (UniqueName: \"kubernetes.io/projected/8ecf96b0-2402-4018-ae55-5051bab3b041-kube-api-access-kv9k5\") pod \"calico-typha-6b486c66d5-cdrrg\" (UID: \"8ecf96b0-2402-4018-ae55-5051bab3b041\") " pod="calico-system/calico-typha-6b486c66d5-cdrrg" Jan 14 00:08:12.410172 kubelet[2832]: I0114 00:08:12.410076 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/8ecf96b0-2402-4018-ae55-5051bab3b041-typha-certs\") pod \"calico-typha-6b486c66d5-cdrrg\" (UID: \"8ecf96b0-2402-4018-ae55-5051bab3b041\") " pod="calico-system/calico-typha-6b486c66d5-cdrrg" Jan 14 00:08:12.532636 systemd[1]: Created slice kubepods-besteffort-pod3dd1c9f2_420a_40e7_a12e_10dfdcdad9d6.slice - libcontainer container kubepods-besteffort-pod3dd1c9f2_420a_40e7_a12e_10dfdcdad9d6.slice. Jan 14 00:08:12.610626 kubelet[2832]: I0114 00:08:12.610597 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-cni-log-dir\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.611053 kubelet[2832]: I0114 00:08:12.611013 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-flexvol-driver-host\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.611153 kubelet[2832]: I0114 00:08:12.611116 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-lib-modules\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.611270 kubelet[2832]: I0114 00:08:12.611140 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-tigera-ca-bundle\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.611270 kubelet[2832]: I0114 00:08:12.611213 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-var-run-calico\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.611270 kubelet[2832]: I0114 00:08:12.611233 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-node-certs\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.611270 kubelet[2832]: I0114 00:08:12.611252 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-cni-bin-dir\") pod \"calico-node-v4mx8\" (UID: 
\"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.611551 kubelet[2832]: I0114 00:08:12.611414 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-cni-net-dir\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.611605 containerd[1590]: time="2026-01-14T00:08:12.611571087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b486c66d5-cdrrg,Uid:8ecf96b0-2402-4018-ae55-5051bab3b041,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:12.612193 kubelet[2832]: I0114 00:08:12.611957 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-xtables-lock\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.612193 kubelet[2832]: I0114 00:08:12.612070 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2xft\" (UniqueName: \"kubernetes.io/projected/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-kube-api-access-s2xft\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.612193 kubelet[2832]: I0114 00:08:12.612095 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-policysync\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.612193 kubelet[2832]: I0114 00:08:12.612109 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6-var-lib-calico\") pod \"calico-node-v4mx8\" (UID: \"3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6\") " pod="calico-system/calico-node-v4mx8" Jan 14 00:08:12.635554 containerd[1590]: time="2026-01-14T00:08:12.635494395Z" level=info msg="connecting to shim ede786f4624018a6dcfa25a7228636a76cd57c7e9702092be0e753776a732cd1" address="unix:///run/containerd/s/b7bdc95f0e16aea68ef70c805a67bab6654c0dc557cb15468c830cf2c0c82c40" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:12.668779 systemd[1]: Started cri-containerd-ede786f4624018a6dcfa25a7228636a76cd57c7e9702092be0e753776a732cd1.scope - libcontainer container ede786f4624018a6dcfa25a7228636a76cd57c7e9702092be0e753776a732cd1. 
Jan 14 00:08:12.706217 kubelet[2832]: E0114 00:08:12.706050 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:08:12.714507 kubelet[2832]: E0114 00:08:12.714398 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.714507 kubelet[2832]: W0114 00:08:12.714439 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.714507 kubelet[2832]: E0114 00:08:12.714479 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.716555 kubelet[2832]: E0114 00:08:12.715635 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.716555 kubelet[2832]: W0114 00:08:12.715652 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.716555 kubelet[2832]: E0114 00:08:12.715667 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.718719 kubelet[2832]: E0114 00:08:12.718327 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.718719 kubelet[2832]: W0114 00:08:12.718547 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.718719 kubelet[2832]: E0114 00:08:12.718570 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.721548 kubelet[2832]: E0114 00:08:12.721285 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.721548 kubelet[2832]: W0114 00:08:12.721305 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.721548 kubelet[2832]: E0114 00:08:12.721330 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.722524 kubelet[2832]: E0114 00:08:12.722050 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.722524 kubelet[2832]: W0114 00:08:12.722465 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.722524 kubelet[2832]: E0114 00:08:12.722488 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.724000 audit: BPF prog-id=149 op=LOAD Jan 14 00:08:12.726411 kubelet[2832]: E0114 00:08:12.726256 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.726411 kubelet[2832]: W0114 00:08:12.726321 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.726411 kubelet[2832]: E0114 00:08:12.726344 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.726000 audit: BPF prog-id=150 op=LOAD Jan 14 00:08:12.727675 kubelet[2832]: E0114 00:08:12.727509 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.727675 kubelet[2832]: W0114 00:08:12.727540 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.727675 kubelet[2832]: E0114 00:08:12.727558 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.726000 audit[3262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3250 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.726000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564653738366634363234303138613664636661323561373232383633 Jan 14 00:08:12.730647 kubelet[2832]: E0114 00:08:12.728349 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.730647 kubelet[2832]: W0114 00:08:12.728363 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.730647 kubelet[2832]: E0114 00:08:12.728376 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.730000 audit: BPF prog-id=150 op=UNLOAD Jan 14 00:08:12.730000 audit[3262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3250 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564653738366634363234303138613664636661323561373232383633 Jan 14 00:08:12.730000 audit: BPF prog-id=151 op=LOAD Jan 14 00:08:12.730000 audit[3262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3250 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564653738366634363234303138613664636661323561373232383633 Jan 14 00:08:12.730000 audit: BPF prog-id=152 op=LOAD Jan 14 00:08:12.730000 audit[3262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3250 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564653738366634363234303138613664636661323561373232383633 Jan 14 00:08:12.730000 audit: BPF prog-id=152 op=UNLOAD Jan 14 00:08:12.730000 audit[3262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3250 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564653738366634363234303138613664636661323561373232383633 Jan 14 00:08:12.730000 audit: BPF prog-id=151 op=UNLOAD Jan 14 00:08:12.730000 audit[3262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3250 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564653738366634363234303138613664636661323561373232383633 Jan 14 00:08:12.730000 audit: BPF prog-id=153 op=LOAD Jan 14 00:08:12.730000 audit[3262]: SYSCALL arch=c00000b7 syscall=280 
success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3250 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.730000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6564653738366634363234303138613664636661323561373232383633 Jan 14 00:08:12.735160 kubelet[2832]: E0114 00:08:12.732029 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.735160 kubelet[2832]: W0114 00:08:12.732050 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.735160 kubelet[2832]: E0114 00:08:12.732068 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.737016 kubelet[2832]: E0114 00:08:12.736986 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.737016 kubelet[2832]: W0114 00:08:12.737010 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.737189 kubelet[2832]: E0114 00:08:12.737032 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.739842 kubelet[2832]: E0114 00:08:12.739726 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.739842 kubelet[2832]: W0114 00:08:12.739751 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.740661 kubelet[2832]: E0114 00:08:12.739772 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.740821 kubelet[2832]: E0114 00:08:12.740752 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.740821 kubelet[2832]: W0114 00:08:12.740767 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.740943 kubelet[2832]: E0114 00:08:12.740906 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.741323 kubelet[2832]: E0114 00:08:12.741309 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.741565 kubelet[2832]: W0114 00:08:12.741398 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.741823 kubelet[2832]: E0114 00:08:12.741649 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.742167 kubelet[2832]: E0114 00:08:12.742119 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.742398 kubelet[2832]: W0114 00:08:12.742329 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.742398 kubelet[2832]: E0114 00:08:12.742349 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.744208 kubelet[2832]: E0114 00:08:12.744161 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.744374 kubelet[2832]: W0114 00:08:12.744195 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.744374 kubelet[2832]: E0114 00:08:12.744308 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.744785 kubelet[2832]: E0114 00:08:12.744709 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.744785 kubelet[2832]: W0114 00:08:12.744723 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.744785 kubelet[2832]: E0114 00:08:12.744735 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.745154 kubelet[2832]: E0114 00:08:12.745079 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.745154 kubelet[2832]: W0114 00:08:12.745093 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.745154 kubelet[2832]: E0114 00:08:12.745104 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.745664 kubelet[2832]: E0114 00:08:12.745649 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.745861 kubelet[2832]: W0114 00:08:12.745733 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.745861 kubelet[2832]: E0114 00:08:12.745762 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.746087 kubelet[2832]: E0114 00:08:12.746075 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.746148 kubelet[2832]: W0114 00:08:12.746138 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.746253 kubelet[2832]: E0114 00:08:12.746197 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.746405 kubelet[2832]: E0114 00:08:12.746395 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.746533 kubelet[2832]: W0114 00:08:12.746460 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.746533 kubelet[2832]: E0114 00:08:12.746473 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.746846 kubelet[2832]: E0114 00:08:12.746728 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.746846 kubelet[2832]: W0114 00:08:12.746739 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.746846 kubelet[2832]: E0114 00:08:12.746749 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.747578 kubelet[2832]: E0114 00:08:12.747553 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.747578 kubelet[2832]: W0114 00:08:12.747570 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.747652 kubelet[2832]: E0114 00:08:12.747585 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.747891 kubelet[2832]: E0114 00:08:12.747868 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.747891 kubelet[2832]: W0114 00:08:12.747883 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.748002 kubelet[2832]: E0114 00:08:12.747894 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.748603 kubelet[2832]: E0114 00:08:12.748582 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.748603 kubelet[2832]: W0114 00:08:12.748600 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.748678 kubelet[2832]: E0114 00:08:12.748613 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.749104 kubelet[2832]: E0114 00:08:12.749074 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.749104 kubelet[2832]: W0114 00:08:12.749092 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.749193 kubelet[2832]: E0114 00:08:12.749169 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.749451 kubelet[2832]: E0114 00:08:12.749382 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.749451 kubelet[2832]: W0114 00:08:12.749400 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.749451 kubelet[2832]: E0114 00:08:12.749411 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.749664 kubelet[2832]: E0114 00:08:12.749648 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.749664 kubelet[2832]: W0114 00:08:12.749660 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.749734 kubelet[2832]: E0114 00:08:12.749670 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.750306 kubelet[2832]: E0114 00:08:12.750279 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.750306 kubelet[2832]: W0114 00:08:12.750296 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.750306 kubelet[2832]: E0114 00:08:12.750308 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.751135 kubelet[2832]: E0114 00:08:12.751112 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.751135 kubelet[2832]: W0114 00:08:12.751131 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.751386 kubelet[2832]: E0114 00:08:12.751144 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.752393 kubelet[2832]: E0114 00:08:12.752373 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.752393 kubelet[2832]: W0114 00:08:12.752391 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.752622 kubelet[2832]: E0114 00:08:12.752408 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.752733 kubelet[2832]: E0114 00:08:12.752713 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.752773 kubelet[2832]: W0114 00:08:12.752727 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.752773 kubelet[2832]: E0114 00:08:12.752761 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.752773 kubelet[2832]: E0114 00:08:12.752996 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.752773 kubelet[2832]: W0114 00:08:12.753006 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.752773 kubelet[2832]: E0114 00:08:12.753017 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.753306 kubelet[2832]: E0114 00:08:12.753184 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.753306 kubelet[2832]: W0114 00:08:12.753193 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.753306 kubelet[2832]: E0114 00:08:12.753204 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.754708 kubelet[2832]: E0114 00:08:12.754677 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.754826 kubelet[2832]: W0114 00:08:12.754798 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.754895 kubelet[2832]: E0114 00:08:12.754884 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.770465 kubelet[2832]: E0114 00:08:12.770441 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.770720 kubelet[2832]: W0114 00:08:12.770653 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.770720 kubelet[2832]: E0114 00:08:12.770678 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.786779 kubelet[2832]: E0114 00:08:12.784592 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.786779 kubelet[2832]: W0114 00:08:12.784617 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.786779 kubelet[2832]: E0114 00:08:12.784639 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.787579 kubelet[2832]: E0114 00:08:12.787559 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.788013 kubelet[2832]: W0114 00:08:12.787672 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.788013 kubelet[2832]: E0114 00:08:12.787750 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.788855 kubelet[2832]: E0114 00:08:12.788722 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.788855 kubelet[2832]: W0114 00:08:12.788738 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.788855 kubelet[2832]: E0114 00:08:12.788752 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.789921 containerd[1590]: time="2026-01-14T00:08:12.789016008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b486c66d5-cdrrg,Uid:8ecf96b0-2402-4018-ae55-5051bab3b041,Namespace:calico-system,Attempt:0,} returns sandbox id \"ede786f4624018a6dcfa25a7228636a76cd57c7e9702092be0e753776a732cd1\"" Jan 14 00:08:12.790003 kubelet[2832]: E0114 00:08:12.789451 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.790003 kubelet[2832]: W0114 00:08:12.789463 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.790003 kubelet[2832]: E0114 00:08:12.789474 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.791481 kubelet[2832]: E0114 00:08:12.791438 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.792790 kubelet[2832]: W0114 00:08:12.792769 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.792985 kubelet[2832]: E0114 00:08:12.792968 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.793443 kubelet[2832]: E0114 00:08:12.793424 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.793537 kubelet[2832]: W0114 00:08:12.793504 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.793643 kubelet[2832]: E0114 00:08:12.793630 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.794409 kubelet[2832]: E0114 00:08:12.794390 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.794561 kubelet[2832]: W0114 00:08:12.794546 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.794622 kubelet[2832]: E0114 00:08:12.794610 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.795224 kubelet[2832]: E0114 00:08:12.794878 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.795224 kubelet[2832]: W0114 00:08:12.794891 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.795224 kubelet[2832]: E0114 00:08:12.794902 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.795441 kubelet[2832]: E0114 00:08:12.795427 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.795613 kubelet[2832]: W0114 00:08:12.795494 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.795613 kubelet[2832]: E0114 00:08:12.795541 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.795772 kubelet[2832]: E0114 00:08:12.795761 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.796011 kubelet[2832]: W0114 00:08:12.795840 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.796011 kubelet[2832]: E0114 00:08:12.795855 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.796177 kubelet[2832]: E0114 00:08:12.796166 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.796236 kubelet[2832]: W0114 00:08:12.796226 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.796293 kubelet[2832]: E0114 00:08:12.796282 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.796942 kubelet[2832]: E0114 00:08:12.796651 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.796942 kubelet[2832]: W0114 00:08:12.796663 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.796942 kubelet[2832]: E0114 00:08:12.796674 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.797156 kubelet[2832]: E0114 00:08:12.797144 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.797208 kubelet[2832]: W0114 00:08:12.797198 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.798670 containerd[1590]: time="2026-01-14T00:08:12.798610784Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 14 00:08:12.799023 kubelet[2832]: E0114 00:08:12.798977 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.802954 kubelet[2832]: E0114 00:08:12.801747 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.802954 kubelet[2832]: W0114 00:08:12.801784 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.802954 kubelet[2832]: E0114 00:08:12.802046 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.805237 kubelet[2832]: E0114 00:08:12.805195 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.805353 kubelet[2832]: W0114 00:08:12.805246 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.805353 kubelet[2832]: E0114 00:08:12.805299 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.806933 kubelet[2832]: E0114 00:08:12.806906 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.807953 kubelet[2832]: W0114 00:08:12.807913 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.807953 kubelet[2832]: E0114 00:08:12.807956 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.810014 kubelet[2832]: E0114 00:08:12.809984 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.810014 kubelet[2832]: W0114 00:08:12.810007 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.810272 kubelet[2832]: E0114 00:08:12.810029 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.810398 kubelet[2832]: E0114 00:08:12.810379 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.810398 kubelet[2832]: W0114 00:08:12.810395 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.810467 kubelet[2832]: E0114 00:08:12.810407 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.810766 kubelet[2832]: E0114 00:08:12.810741 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.810766 kubelet[2832]: W0114 00:08:12.810763 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.810870 kubelet[2832]: E0114 00:08:12.810776 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.811210 kubelet[2832]: E0114 00:08:12.811187 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.811210 kubelet[2832]: W0114 00:08:12.811207 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.811512 kubelet[2832]: E0114 00:08:12.811219 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.814800 kubelet[2832]: E0114 00:08:12.814644 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.814800 kubelet[2832]: W0114 00:08:12.814674 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.814800 kubelet[2832]: E0114 00:08:12.814690 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.815304 kubelet[2832]: I0114 00:08:12.815117 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rt5nl\" (UniqueName: \"kubernetes.io/projected/6c288445-910a-4d1d-9b62-12f5155b11be-kube-api-access-rt5nl\") pod \"csi-node-driver-8jmff\" (UID: \"6c288445-910a-4d1d-9b62-12f5155b11be\") " pod="calico-system/csi-node-driver-8jmff" Jan 14 00:08:12.815993 kubelet[2832]: E0114 00:08:12.815977 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.816272 kubelet[2832]: W0114 00:08:12.816157 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.816272 kubelet[2832]: E0114 00:08:12.816179 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.816835 kubelet[2832]: E0114 00:08:12.816645 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.816835 kubelet[2832]: W0114 00:08:12.816659 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.816835 kubelet[2832]: E0114 00:08:12.816672 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.817349 kubelet[2832]: E0114 00:08:12.817243 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.817443 kubelet[2832]: W0114 00:08:12.817428 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.817535 kubelet[2832]: E0114 00:08:12.817494 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.817885 kubelet[2832]: I0114 00:08:12.817618 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/6c288445-910a-4d1d-9b62-12f5155b11be-registration-dir\") pod \"csi-node-driver-8jmff\" (UID: \"6c288445-910a-4d1d-9b62-12f5155b11be\") " pod="calico-system/csi-node-driver-8jmff" Jan 14 00:08:12.818618 kubelet[2832]: E0114 00:08:12.818597 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.818775 kubelet[2832]: W0114 00:08:12.818695 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.818775 kubelet[2832]: E0114 00:08:12.818716 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.818775 kubelet[2832]: I0114 00:08:12.818749 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6c288445-910a-4d1d-9b62-12f5155b11be-kubelet-dir\") pod \"csi-node-driver-8jmff\" (UID: \"6c288445-910a-4d1d-9b62-12f5155b11be\") " pod="calico-system/csi-node-driver-8jmff" Jan 14 00:08:12.821552 kubelet[2832]: E0114 00:08:12.820467 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.821552 kubelet[2832]: W0114 00:08:12.820802 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.821552 kubelet[2832]: E0114 00:08:12.820836 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.823544 kubelet[2832]: E0114 00:08:12.823145 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.823544 kubelet[2832]: W0114 00:08:12.823166 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.823544 kubelet[2832]: E0114 00:08:12.823183 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.824657 kubelet[2832]: E0114 00:08:12.823708 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.824723 kubelet[2832]: W0114 00:08:12.824662 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.824723 kubelet[2832]: E0114 00:08:12.824689 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.824847 kubelet[2832]: I0114 00:08:12.824811 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/6c288445-910a-4d1d-9b62-12f5155b11be-varrun\") pod \"csi-node-driver-8jmff\" (UID: \"6c288445-910a-4d1d-9b62-12f5155b11be\") " pod="calico-system/csi-node-driver-8jmff" Jan 14 00:08:12.825268 kubelet[2832]: E0114 00:08:12.825093 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.825268 kubelet[2832]: W0114 00:08:12.825114 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.825268 kubelet[2832]: E0114 00:08:12.825127 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.825397 kubelet[2832]: E0114 00:08:12.825289 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.825397 kubelet[2832]: W0114 00:08:12.825298 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.825397 kubelet[2832]: E0114 00:08:12.825308 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.825464 kubelet[2832]: E0114 00:08:12.825433 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.825464 kubelet[2832]: W0114 00:08:12.825441 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.825464 kubelet[2832]: E0114 00:08:12.825448 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.826605 kubelet[2832]: I0114 00:08:12.825466 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/6c288445-910a-4d1d-9b62-12f5155b11be-socket-dir\") pod \"csi-node-driver-8jmff\" (UID: \"6c288445-910a-4d1d-9b62-12f5155b11be\") " pod="calico-system/csi-node-driver-8jmff" Jan 14 00:08:12.826605 kubelet[2832]: E0114 00:08:12.826246 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.826605 kubelet[2832]: W0114 00:08:12.826268 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.826605 kubelet[2832]: E0114 00:08:12.826283 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.827182 kubelet[2832]: E0114 00:08:12.827151 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.827182 kubelet[2832]: W0114 00:08:12.827175 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.827484 kubelet[2832]: E0114 00:08:12.827193 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.828947 kubelet[2832]: E0114 00:08:12.828910 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.828947 kubelet[2832]: W0114 00:08:12.828930 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.828947 kubelet[2832]: E0114 00:08:12.828945 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.829739 kubelet[2832]: E0114 00:08:12.829713 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.829739 kubelet[2832]: W0114 00:08:12.829730 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.829739 kubelet[2832]: E0114 00:08:12.829744 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.839602 containerd[1590]: time="2026-01-14T00:08:12.839514794Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v4mx8,Uid:3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:12.868862 containerd[1590]: time="2026-01-14T00:08:12.867557984Z" level=info msg="connecting to shim 9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5" address="unix:///run/containerd/s/621e46b795ba93a712d09a908cef29e1ca946d09b47cec9fd28e90c2c6d919d9" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:12.893782 systemd[1]: Started cri-containerd-9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5.scope - libcontainer container 9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5. 
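The repeated kubelet messages above come from FlexVolume plugin probing: the kubelet sees the directory nodeagent~uds under /opt/libexec/kubernetes/kubelet-plugins/volume/exec/, cannot run the missing uds executable, and then fails to unmarshal the empty output of the "init" call as JSON. As a hedged illustration only (this stub is hypothetical and not part of the logged system), a FlexVolume driver is expected to answer "init" with a JSON status object on stdout, which is exactly what the empty output here never provides:

```python
#!/usr/bin/env python3
# Hypothetical minimal FlexVolume driver stub (illustration only, not part of
# the logged system). It shows the kind of JSON reply the kubelet tries to
# unmarshal after calling "<driver> init"; the missing nodeagent~uds/uds
# binary produces no output at all, hence "unexpected end of JSON input".
import json
import sys

def main() -> int:
    command = sys.argv[1] if len(sys.argv) > 1 else ""
    if command == "init":
        # "init" is expected to print a JSON status object on stdout.
        print(json.dumps({"status": "Success", "capabilities": {"attach": False}}))
        return 0
    # Calls the driver does not implement should still answer with JSON.
    print(json.dumps({"status": "Not supported", "message": "unsupported call: " + command}))
    return 1

if __name__ == "__main__":
    sys.exit(main())
```

The audit records interleaved with these errors (the BPF prog-id LOAD/UNLOAD events for pids 3262 and 3385) carry the invoking command line as a hex-encoded PROCTITLE field. A small sketch, assuming only the standard Linux audit encoding (hex of /proc/<pid>/cmdline with NUL-separated arguments), decodes the leading portion of the value logged above back into the runc argv:

```python
# Sketch for decoding an audit PROCTITLE value (assumes the standard Linux
# audit encoding: hex dump of /proc/<pid>/cmdline, arguments separated by NUL).
def decode_proctitle(hex_blob: str) -> list:
    """Return the argv list encoded in an audit PROCTITLE field."""
    raw = bytes.fromhex(hex_blob)
    return [arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00") if arg]

if __name__ == "__main__":
    # Leading portion of the PROCTITLE value from the audit records above.
    sample = ("72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E63"
              "2F6B38732E696F002D2D6C6F67")
    print(decode_proctitle(sample))
    # ['runc', '--root', '/run/containerd/runc/k8s.io', '--log']
```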
Jan 14 00:08:12.907000 audit: BPF prog-id=154 op=LOAD Jan 14 00:08:12.907000 audit: BPF prog-id=155 op=LOAD Jan 14 00:08:12.907000 audit[3385]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3374 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936393839333238323365346333613033663836336432363564306262 Jan 14 00:08:12.907000 audit: BPF prog-id=155 op=UNLOAD Jan 14 00:08:12.907000 audit[3385]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936393839333238323365346333613033663836336432363564306262 Jan 14 00:08:12.908000 audit: BPF prog-id=156 op=LOAD Jan 14 00:08:12.908000 audit[3385]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3374 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936393839333238323365346333613033663836336432363564306262 Jan 14 00:08:12.908000 audit: BPF prog-id=157 op=LOAD Jan 14 00:08:12.908000 audit[3385]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3374 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936393839333238323365346333613033663836336432363564306262 Jan 14 00:08:12.908000 audit: BPF prog-id=157 op=UNLOAD Jan 14 00:08:12.908000 audit[3385]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936393839333238323365346333613033663836336432363564306262 Jan 14 00:08:12.908000 audit: BPF prog-id=156 op=UNLOAD Jan 14 00:08:12.908000 audit[3385]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936393839333238323365346333613033663836336432363564306262 Jan 14 00:08:12.908000 audit: BPF prog-id=158 op=LOAD Jan 14 00:08:12.908000 audit[3385]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3374 pid=3385 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:12.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936393839333238323365346333613033663836336432363564306262 Jan 14 00:08:12.926764 kubelet[2832]: E0114 00:08:12.926725 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.926764 kubelet[2832]: W0114 00:08:12.926754 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.927039 kubelet[2832]: E0114 00:08:12.926777 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.928301 kubelet[2832]: E0114 00:08:12.928277 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.928301 kubelet[2832]: W0114 00:08:12.928299 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.928607 kubelet[2832]: E0114 00:08:12.928317 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.929376 kubelet[2832]: E0114 00:08:12.929353 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.929376 kubelet[2832]: W0114 00:08:12.929371 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.929626 kubelet[2832]: E0114 00:08:12.929386 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.930286 kubelet[2832]: E0114 00:08:12.930253 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.930286 kubelet[2832]: W0114 00:08:12.930273 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.930347 kubelet[2832]: E0114 00:08:12.930289 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.930544 kubelet[2832]: E0114 00:08:12.930469 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.930544 kubelet[2832]: W0114 00:08:12.930483 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.930544 kubelet[2832]: E0114 00:08:12.930495 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.930919 kubelet[2832]: E0114 00:08:12.930897 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.930919 kubelet[2832]: W0114 00:08:12.930916 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.930994 kubelet[2832]: E0114 00:08:12.930929 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.931219 kubelet[2832]: E0114 00:08:12.931081 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.931219 kubelet[2832]: W0114 00:08:12.931127 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.931219 kubelet[2832]: E0114 00:08:12.931138 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.931707 kubelet[2832]: E0114 00:08:12.931688 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.931707 kubelet[2832]: W0114 00:08:12.931705 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.931793 kubelet[2832]: E0114 00:08:12.931718 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.931957 containerd[1590]: time="2026-01-14T00:08:12.931917456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-v4mx8,Uid:3dd1c9f2-420a-40e7-a12e-10dfdcdad9d6,Namespace:calico-system,Attempt:0,} returns sandbox id \"9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5\"" Jan 14 00:08:12.932130 kubelet[2832]: E0114 00:08:12.932094 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.932130 kubelet[2832]: W0114 00:08:12.932130 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.932206 kubelet[2832]: E0114 00:08:12.932145 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.932605 kubelet[2832]: E0114 00:08:12.932584 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.932664 kubelet[2832]: W0114 00:08:12.932631 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.932664 kubelet[2832]: E0114 00:08:12.932646 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.933411 kubelet[2832]: E0114 00:08:12.932932 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.933411 kubelet[2832]: W0114 00:08:12.933270 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.933411 kubelet[2832]: E0114 00:08:12.933319 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.933978 kubelet[2832]: E0114 00:08:12.933571 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.933978 kubelet[2832]: W0114 00:08:12.933607 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.933978 kubelet[2832]: E0114 00:08:12.933617 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.933978 kubelet[2832]: E0114 00:08:12.933758 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.933978 kubelet[2832]: W0114 00:08:12.933766 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.933978 kubelet[2832]: E0114 00:08:12.933774 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.934198 kubelet[2832]: E0114 00:08:12.934137 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.934198 kubelet[2832]: W0114 00:08:12.934150 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.934198 kubelet[2832]: E0114 00:08:12.934177 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.934579 kubelet[2832]: E0114 00:08:12.934308 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.934579 kubelet[2832]: W0114 00:08:12.934324 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.934579 kubelet[2832]: E0114 00:08:12.934332 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.934579 kubelet[2832]: E0114 00:08:12.934539 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.934579 kubelet[2832]: W0114 00:08:12.934549 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.934579 kubelet[2832]: E0114 00:08:12.934557 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.935111 kubelet[2832]: E0114 00:08:12.935006 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.935111 kubelet[2832]: W0114 00:08:12.935026 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.935111 kubelet[2832]: E0114 00:08:12.935038 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.935602 kubelet[2832]: E0114 00:08:12.935299 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.935602 kubelet[2832]: W0114 00:08:12.935309 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.935602 kubelet[2832]: E0114 00:08:12.935321 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.935602 kubelet[2832]: E0114 00:08:12.935460 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.935602 kubelet[2832]: W0114 00:08:12.935468 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.935602 kubelet[2832]: E0114 00:08:12.935511 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.936113 kubelet[2832]: E0114 00:08:12.935726 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.936113 kubelet[2832]: W0114 00:08:12.935737 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.936113 kubelet[2832]: E0114 00:08:12.935746 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.936113 kubelet[2832]: E0114 00:08:12.936054 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.936113 kubelet[2832]: W0114 00:08:12.936065 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.936113 kubelet[2832]: E0114 00:08:12.936074 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.936621 kubelet[2832]: E0114 00:08:12.936236 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.936621 kubelet[2832]: W0114 00:08:12.936244 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.936621 kubelet[2832]: E0114 00:08:12.936252 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:12.936621 kubelet[2832]: E0114 00:08:12.936356 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.936621 kubelet[2832]: W0114 00:08:12.936363 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.936621 kubelet[2832]: E0114 00:08:12.936370 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.936621 kubelet[2832]: E0114 00:08:12.936465 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.936621 kubelet[2832]: W0114 00:08:12.936470 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.936621 kubelet[2832]: E0114 00:08:12.936477 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.937440 kubelet[2832]: E0114 00:08:12.936642 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.937440 kubelet[2832]: W0114 00:08:12.936651 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.937440 kubelet[2832]: E0114 00:08:12.936659 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:12.950686 kubelet[2832]: E0114 00:08:12.950652 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:12.950938 kubelet[2832]: W0114 00:08:12.950859 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:12.950938 kubelet[2832]: E0114 00:08:12.950899 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:14.334139 kubelet[2832]: E0114 00:08:14.334066 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:08:14.426330 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount44743403.mount: Deactivated successfully. 
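[editor's note] The repeated driver-call.go / plugins.go messages above come from the kubelet probing the FlexVolume plugin directory and finding no executable at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds; the call therefore returns empty output and the JSON unmarshal fails with "unexpected end of JSON input". As a hedged illustration only (not part of this log, and not the actual nodeagent~uds driver), a minimal FlexVolume-style responder for the "init" call-out might look like the Python sketch below; the exact success-payload shape is an assumption based on the documented FlexVolume call-out convention.

    #!/usr/bin/env python3
    # Hypothetical stand-in for the missing driver executable referenced in the
    # kubelet messages above (nodeagent~uds/uds). Illustrative sketch only.
    import json
    import sys

    def main() -> int:
        # The kubelet invokes the driver as: <driver> init
        op = sys.argv[1] if len(sys.argv) > 1 else ""
        if op == "init":
            # A JSON status on stdout is what driver-call.go tries to unmarshal;
            # empty stdout is what produces "unexpected end of JSON input" above.
            print(json.dumps({"status": "Success",
                              "capabilities": {"attach": False}}))
            return 0
        # Any call-out this sketch does not handle is reported as unsupported.
        print(json.dumps({"status": "Not supported",
                          "message": "operation %r not implemented" % op}))
        return 1

    if __name__ == "__main__":
        sys.exit(main())

An executable along these lines in the plugin directory would satisfy the probe; whether the nodeagent~uds driver is actually wanted on this node is a separate question the log does not answer.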
Jan 14 00:08:15.142987 containerd[1590]: time="2026-01-14T00:08:15.142923796Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:15.144192 containerd[1590]: time="2026-01-14T00:08:15.143976051Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Jan 14 00:08:15.145320 containerd[1590]: time="2026-01-14T00:08:15.145280069Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:15.148167 containerd[1590]: time="2026-01-14T00:08:15.148067493Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:15.149213 containerd[1590]: time="2026-01-14T00:08:15.148626546Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.349953351s" Jan 14 00:08:15.149213 containerd[1590]: time="2026-01-14T00:08:15.148663472Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 14 00:08:15.150136 containerd[1590]: time="2026-01-14T00:08:15.150111113Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 14 00:08:15.166265 containerd[1590]: time="2026-01-14T00:08:15.166211635Z" level=info msg="CreateContainer within sandbox \"ede786f4624018a6dcfa25a7228636a76cd57c7e9702092be0e753776a732cd1\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 14 00:08:15.177787 containerd[1590]: time="2026-01-14T00:08:15.177732035Z" level=info msg="Container 23467040d4d683f0e80e50c33d132cc1af6b30ac939ca3872475c34dadd547da: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:08:15.191462 containerd[1590]: time="2026-01-14T00:08:15.191419755Z" level=info msg="CreateContainer within sandbox \"ede786f4624018a6dcfa25a7228636a76cd57c7e9702092be0e753776a732cd1\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"23467040d4d683f0e80e50c33d132cc1af6b30ac939ca3872475c34dadd547da\"" Jan 14 00:08:15.193854 containerd[1590]: time="2026-01-14T00:08:15.192367193Z" level=info msg="StartContainer for \"23467040d4d683f0e80e50c33d132cc1af6b30ac939ca3872475c34dadd547da\"" Jan 14 00:08:15.194056 containerd[1590]: time="2026-01-14T00:08:15.193510343Z" level=info msg="connecting to shim 23467040d4d683f0e80e50c33d132cc1af6b30ac939ca3872475c34dadd547da" address="unix:///run/containerd/s/b7bdc95f0e16aea68ef70c805a67bab6654c0dc557cb15468c830cf2c0c82c40" protocol=ttrpc version=3 Jan 14 00:08:15.220836 systemd[1]: Started cri-containerd-23467040d4d683f0e80e50c33d132cc1af6b30ac939ca3872475c34dadd547da.scope - libcontainer container 23467040d4d683f0e80e50c33d132cc1af6b30ac939ca3872475c34dadd547da. 
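[editor's note] The audit PROCTITLE entries in this log (at 00:08:12 above and 00:08:15 below) carry the runc command line as a hex-encoded, NUL-separated proctitle field. As a small reference sketch, the Python below decodes such a value back into argv; the sample string is just the leading portion of one of the proctitle values in this log, so the trailing task path is intentionally omitted.

    #!/usr/bin/env python3
    # Decode an audit PROCTITLE value (hex-encoded argv with NUL separators)
    # into a readable command line. The sample is the leading portion of a
    # proctitle field from this log ("runc --root /run/containerd/runc/k8s.io ...").
    PROCTITLE_HEX = (
        "72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F"
    )

    def decode_proctitle(hex_value: str) -> list:
        raw = bytes.fromhex(hex_value)
        # argv elements are separated by NUL bytes; audit truncates long values,
        # so the last element may be cut short.
        return [part.decode("utf-8", "replace") for part in raw.split(b"\x00") if part]

    if __name__ == "__main__":
        print(decode_proctitle(PROCTITLE_HEX))
        # -> ['runc', '--root', '/run/containerd/runc/k8s.io']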
Jan 14 00:08:15.242000 audit: BPF prog-id=159 op=LOAD Jan 14 00:08:15.244786 kernel: kauditd_printk_skb: 64 callbacks suppressed Jan 14 00:08:15.244828 kernel: audit: type=1334 audit(1768349295.242:549): prog-id=159 op=LOAD Jan 14 00:08:15.243000 audit: BPF prog-id=160 op=LOAD Jan 14 00:08:15.245830 kernel: audit: type=1334 audit(1768349295.243:550): prog-id=160 op=LOAD Jan 14 00:08:15.243000 audit[3448]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.248141 kernel: audit: type=1300 audit(1768349295.243:550): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.243000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.250813 kernel: audit: type=1327 audit(1768349295.243:550): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.244000 audit: BPF prog-id=160 op=UNLOAD Jan 14 00:08:15.251727 kernel: audit: type=1334 audit(1768349295.244:551): prog-id=160 op=UNLOAD Jan 14 00:08:15.251786 kernel: audit: type=1300 audit(1768349295.244:551): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.244000 audit[3448]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.256086 kernel: audit: type=1327 audit(1768349295.244:551): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.244000 audit: BPF prog-id=161 op=LOAD Jan 14 00:08:15.256838 kernel: audit: type=1334 audit(1768349295.244:552): prog-id=161 op=LOAD Jan 14 00:08:15.256952 kernel: audit: type=1300 audit(1768349295.244:552): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.244000 audit[3448]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.261349 kernel: audit: type=1327 audit(1768349295.244:552): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.244000 audit: BPF prog-id=162 op=LOAD Jan 14 00:08:15.244000 audit[3448]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.244000 audit: BPF prog-id=162 op=UNLOAD Jan 14 00:08:15.244000 audit[3448]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.244000 audit: BPF prog-id=161 op=UNLOAD Jan 14 00:08:15.244000 audit[3448]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.244000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.244000 audit: BPF prog-id=163 op=LOAD Jan 14 00:08:15.244000 audit[3448]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3250 pid=3448 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:15.244000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233343637303430643464363833663065383065353063333364313332 Jan 14 00:08:15.292705 containerd[1590]: time="2026-01-14T00:08:15.292643857Z" level=info msg="StartContainer for \"23467040d4d683f0e80e50c33d132cc1af6b30ac939ca3872475c34dadd547da\" returns successfully" Jan 14 00:08:15.531637 kubelet[2832]: E0114 00:08:15.530179 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.531637 kubelet[2832]: W0114 00:08:15.530638 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.531637 kubelet[2832]: E0114 00:08:15.531543 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.532425 kubelet[2832]: E0114 00:08:15.532330 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.532425 kubelet[2832]: W0114 00:08:15.532360 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.532425 kubelet[2832]: E0114 00:08:15.532374 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.532801 kubelet[2832]: E0114 00:08:15.532756 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.533649 kubelet[2832]: W0114 00:08:15.532860 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.533649 kubelet[2832]: E0114 00:08:15.533593 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.533944 kubelet[2832]: E0114 00:08:15.533931 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.534099 kubelet[2832]: W0114 00:08:15.534012 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.534099 kubelet[2832]: E0114 00:08:15.534026 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:15.534325 kubelet[2832]: E0114 00:08:15.534304 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.534451 kubelet[2832]: W0114 00:08:15.534400 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.534451 kubelet[2832]: E0114 00:08:15.534416 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.534759 kubelet[2832]: E0114 00:08:15.534704 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.534759 kubelet[2832]: W0114 00:08:15.534716 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.534759 kubelet[2832]: E0114 00:08:15.534727 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.536870 kubelet[2832]: E0114 00:08:15.536743 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.537067 kubelet[2832]: W0114 00:08:15.537042 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.537275 kubelet[2832]: E0114 00:08:15.537256 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.538078 kubelet[2832]: E0114 00:08:15.537949 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.538078 kubelet[2832]: W0114 00:08:15.537993 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.538078 kubelet[2832]: E0114 00:08:15.538010 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.538495 kubelet[2832]: E0114 00:08:15.538460 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.538745 kubelet[2832]: W0114 00:08:15.538600 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.538745 kubelet[2832]: E0114 00:08:15.538617 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:15.539764 kubelet[2832]: E0114 00:08:15.539647 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.539764 kubelet[2832]: W0114 00:08:15.539674 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.539764 kubelet[2832]: E0114 00:08:15.539687 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.540597 kubelet[2832]: E0114 00:08:15.540368 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.540597 kubelet[2832]: W0114 00:08:15.540437 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.540597 kubelet[2832]: E0114 00:08:15.540449 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.541687 kubelet[2832]: E0114 00:08:15.541655 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.541855 kubelet[2832]: W0114 00:08:15.541769 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.541855 kubelet[2832]: E0114 00:08:15.541792 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.542364 kubelet[2832]: E0114 00:08:15.542348 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.542441 kubelet[2832]: W0114 00:08:15.542428 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.542828 kubelet[2832]: E0114 00:08:15.542515 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.543234 kubelet[2832]: E0114 00:08:15.543160 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.543234 kubelet[2832]: W0114 00:08:15.543177 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.543234 kubelet[2832]: E0114 00:08:15.543188 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:15.544287 kubelet[2832]: E0114 00:08:15.543663 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.544287 kubelet[2832]: W0114 00:08:15.543687 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.544287 kubelet[2832]: E0114 00:08:15.543703 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.553772 kubelet[2832]: E0114 00:08:15.553744 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.554020 kubelet[2832]: W0114 00:08:15.553920 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.554020 kubelet[2832]: E0114 00:08:15.553962 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.555024 kubelet[2832]: E0114 00:08:15.554969 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.555024 kubelet[2832]: W0114 00:08:15.554988 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.555024 kubelet[2832]: E0114 00:08:15.555006 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.555563 kubelet[2832]: E0114 00:08:15.555507 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.555563 kubelet[2832]: W0114 00:08:15.555537 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.555563 kubelet[2832]: E0114 00:08:15.555549 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.556671 kubelet[2832]: E0114 00:08:15.556654 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.556827 kubelet[2832]: W0114 00:08:15.556753 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.556827 kubelet[2832]: E0114 00:08:15.556769 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:15.557099 kubelet[2832]: E0114 00:08:15.557085 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.557182 kubelet[2832]: W0114 00:08:15.557155 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.557182 kubelet[2832]: E0114 00:08:15.557169 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.557579 kubelet[2832]: E0114 00:08:15.557564 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.557651 kubelet[2832]: W0114 00:08:15.557640 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.557699 kubelet[2832]: E0114 00:08:15.557690 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.558266 kubelet[2832]: E0114 00:08:15.558203 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.558266 kubelet[2832]: W0114 00:08:15.558218 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.558266 kubelet[2832]: E0114 00:08:15.558241 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.558602 kubelet[2832]: E0114 00:08:15.558574 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.558602 kubelet[2832]: W0114 00:08:15.558590 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.558602 kubelet[2832]: E0114 00:08:15.558603 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.559120 kubelet[2832]: E0114 00:08:15.559097 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.559120 kubelet[2832]: W0114 00:08:15.559113 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.559558 kubelet[2832]: E0114 00:08:15.559125 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:15.559714 kubelet[2832]: E0114 00:08:15.559694 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.559714 kubelet[2832]: W0114 00:08:15.559709 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.560308 kubelet[2832]: E0114 00:08:15.559722 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.560624 kubelet[2832]: E0114 00:08:15.560603 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.560624 kubelet[2832]: W0114 00:08:15.560620 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.560624 kubelet[2832]: E0114 00:08:15.560633 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.560806 kubelet[2832]: E0114 00:08:15.560791 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.560806 kubelet[2832]: W0114 00:08:15.560799 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.560806 kubelet[2832]: E0114 00:08:15.560807 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.560942 kubelet[2832]: E0114 00:08:15.560923 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.560942 kubelet[2832]: W0114 00:08:15.560931 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.560942 kubelet[2832]: E0114 00:08:15.560941 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.561164 kubelet[2832]: E0114 00:08:15.561140 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.561164 kubelet[2832]: W0114 00:08:15.561150 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.561164 kubelet[2832]: E0114 00:08:15.561158 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:15.561742 kubelet[2832]: E0114 00:08:15.561720 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.561742 kubelet[2832]: W0114 00:08:15.561736 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.562028 kubelet[2832]: E0114 00:08:15.561749 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.562688 kubelet[2832]: E0114 00:08:15.562669 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.562688 kubelet[2832]: W0114 00:08:15.562689 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.562779 kubelet[2832]: E0114 00:08:15.562703 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.562991 kubelet[2832]: E0114 00:08:15.562972 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.562991 kubelet[2832]: W0114 00:08:15.562986 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.563060 kubelet[2832]: E0114 00:08:15.562997 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:15.564534 kubelet[2832]: E0114 00:08:15.564494 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:15.564534 kubelet[2832]: W0114 00:08:15.564515 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:15.564643 kubelet[2832]: E0114 00:08:15.564546 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:16.336508 kubelet[2832]: E0114 00:08:16.336362 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:08:16.452232 kubelet[2832]: I0114 00:08:16.451961 2832 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 00:08:16.551021 kubelet[2832]: E0114 00:08:16.550943 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.551973 kubelet[2832]: W0114 00:08:16.551602 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.551973 kubelet[2832]: E0114 00:08:16.551694 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.552480 kubelet[2832]: E0114 00:08:16.552152 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.552480 kubelet[2832]: W0114 00:08:16.552168 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.552480 kubelet[2832]: E0114 00:08:16.552198 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.553049 kubelet[2832]: E0114 00:08:16.553023 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.553366 kubelet[2832]: W0114 00:08:16.553178 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.553366 kubelet[2832]: E0114 00:08:16.553207 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.554098 kubelet[2832]: E0114 00:08:16.553821 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.554098 kubelet[2832]: W0114 00:08:16.553842 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.554098 kubelet[2832]: E0114 00:08:16.553857 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:16.554404 kubelet[2832]: E0114 00:08:16.554260 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.554404 kubelet[2832]: W0114 00:08:16.554277 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.554404 kubelet[2832]: E0114 00:08:16.554291 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.554831 kubelet[2832]: E0114 00:08:16.554690 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.554831 kubelet[2832]: W0114 00:08:16.554709 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.554831 kubelet[2832]: E0114 00:08:16.554724 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.555037 kubelet[2832]: E0114 00:08:16.555020 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.555277 kubelet[2832]: W0114 00:08:16.555132 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.555277 kubelet[2832]: E0114 00:08:16.555154 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.555480 kubelet[2832]: E0114 00:08:16.555449 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.555790 kubelet[2832]: W0114 00:08:16.555602 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.555790 kubelet[2832]: E0114 00:08:16.555622 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.555939 kubelet[2832]: E0114 00:08:16.555926 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.556070 kubelet[2832]: W0114 00:08:16.556056 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.556122 kubelet[2832]: E0114 00:08:16.556113 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:16.556362 kubelet[2832]: E0114 00:08:16.556349 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.556553 kubelet[2832]: W0114 00:08:16.556421 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.556553 kubelet[2832]: E0114 00:08:16.556435 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.556766 kubelet[2832]: E0114 00:08:16.556750 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.556830 kubelet[2832]: W0114 00:08:16.556820 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.556977 kubelet[2832]: E0114 00:08:16.556879 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.557093 kubelet[2832]: E0114 00:08:16.557082 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.557240 kubelet[2832]: W0114 00:08:16.557136 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.557240 kubelet[2832]: E0114 00:08:16.557149 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.557362 kubelet[2832]: E0114 00:08:16.557352 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.557411 kubelet[2832]: W0114 00:08:16.557401 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.557494 kubelet[2832]: E0114 00:08:16.557454 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.557726 kubelet[2832]: E0114 00:08:16.557713 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.557785 kubelet[2832]: W0114 00:08:16.557774 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.557833 kubelet[2832]: E0114 00:08:16.557823 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:16.558130 kubelet[2832]: E0114 00:08:16.558034 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.558130 kubelet[2832]: W0114 00:08:16.558049 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.558130 kubelet[2832]: E0114 00:08:16.558059 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.563436 kubelet[2832]: E0114 00:08:16.563411 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.563436 kubelet[2832]: W0114 00:08:16.563423 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.563436 kubelet[2832]: E0114 00:08:16.563435 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.563859 kubelet[2832]: E0114 00:08:16.563759 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.563859 kubelet[2832]: W0114 00:08:16.563772 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.563859 kubelet[2832]: E0114 00:08:16.563782 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.563987 kubelet[2832]: E0114 00:08:16.563947 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.563987 kubelet[2832]: W0114 00:08:16.563954 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.563987 kubelet[2832]: E0114 00:08:16.563962 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.564152 kubelet[2832]: E0114 00:08:16.564136 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.564152 kubelet[2832]: W0114 00:08:16.564146 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.564152 kubelet[2832]: E0114 00:08:16.564154 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:16.564570 kubelet[2832]: E0114 00:08:16.564315 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.564570 kubelet[2832]: W0114 00:08:16.564323 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.564570 kubelet[2832]: E0114 00:08:16.564334 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.564570 kubelet[2832]: E0114 00:08:16.564506 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.564570 kubelet[2832]: W0114 00:08:16.564515 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.565014 kubelet[2832]: E0114 00:08:16.564533 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.565014 kubelet[2832]: E0114 00:08:16.564815 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.565014 kubelet[2832]: W0114 00:08:16.564825 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.565014 kubelet[2832]: E0114 00:08:16.564834 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.565014 kubelet[2832]: E0114 00:08:16.564978 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.565014 kubelet[2832]: W0114 00:08:16.564985 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.565014 kubelet[2832]: E0114 00:08:16.564992 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.565269 kubelet[2832]: E0114 00:08:16.565101 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.565269 kubelet[2832]: W0114 00:08:16.565107 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.565269 kubelet[2832]: E0114 00:08:16.565114 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:16.565269 kubelet[2832]: E0114 00:08:16.565212 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.565269 kubelet[2832]: W0114 00:08:16.565217 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.565269 kubelet[2832]: E0114 00:08:16.565224 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.565510 kubelet[2832]: E0114 00:08:16.565312 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.565510 kubelet[2832]: W0114 00:08:16.565319 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.565510 kubelet[2832]: E0114 00:08:16.565326 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.565510 kubelet[2832]: E0114 00:08:16.565431 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.565510 kubelet[2832]: W0114 00:08:16.565437 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.565510 kubelet[2832]: E0114 00:08:16.565444 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.567655 kubelet[2832]: E0114 00:08:16.565865 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.567655 kubelet[2832]: W0114 00:08:16.565875 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.567655 kubelet[2832]: E0114 00:08:16.565885 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.567655 kubelet[2832]: E0114 00:08:16.566136 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.567655 kubelet[2832]: W0114 00:08:16.566145 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.567655 kubelet[2832]: E0114 00:08:16.566170 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:16.567655 kubelet[2832]: E0114 00:08:16.566340 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.567655 kubelet[2832]: W0114 00:08:16.566349 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.567655 kubelet[2832]: E0114 00:08:16.566357 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.567655 kubelet[2832]: E0114 00:08:16.566509 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.567839 kubelet[2832]: W0114 00:08:16.566540 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.567839 kubelet[2832]: E0114 00:08:16.566552 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.567839 kubelet[2832]: E0114 00:08:16.566713 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.567839 kubelet[2832]: W0114 00:08:16.566721 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.567839 kubelet[2832]: E0114 00:08:16.566730 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 14 00:08:16.567839 kubelet[2832]: E0114 00:08:16.567268 2832 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 14 00:08:16.567839 kubelet[2832]: W0114 00:08:16.567291 2832 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 14 00:08:16.567839 kubelet[2832]: E0114 00:08:16.567301 2832 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 14 00:08:16.743126 containerd[1590]: time="2026-01-14T00:08:16.742342192Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:16.745187 containerd[1590]: time="2026-01-14T00:08:16.745101996Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:16.746383 containerd[1590]: time="2026-01-14T00:08:16.746124361Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:16.748326 containerd[1590]: time="2026-01-14T00:08:16.748272867Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:16.749169 containerd[1590]: time="2026-01-14T00:08:16.748763787Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.598491686s" Jan 14 00:08:16.749169 containerd[1590]: time="2026-01-14T00:08:16.748796872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 14 00:08:16.754088 containerd[1590]: time="2026-01-14T00:08:16.754045278Z" level=info msg="CreateContainer within sandbox \"9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 14 00:08:16.768396 containerd[1590]: time="2026-01-14T00:08:16.766749045Z" level=info msg="Container a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:08:16.774681 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3840116224.mount: Deactivated successfully. Jan 14 00:08:16.781898 containerd[1590]: time="2026-01-14T00:08:16.781831116Z" level=info msg="CreateContainer within sandbox \"9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2\"" Jan 14 00:08:16.783552 containerd[1590]: time="2026-01-14T00:08:16.782887687Z" level=info msg="StartContainer for \"a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2\"" Jan 14 00:08:16.785441 containerd[1590]: time="2026-01-14T00:08:16.785411613Z" level=info msg="connecting to shim a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2" address="unix:///run/containerd/s/621e46b795ba93a712d09a908cef29e1ca946d09b47cec9fd28e90c2c6d919d9" protocol=ttrpc version=3 Jan 14 00:08:16.809753 systemd[1]: Started cri-containerd-a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2.scope - libcontainer container a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2. 
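The burst of kubelet errors above is the FlexVolume prober repeatedly running /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the init argument before that binary exists; with no executable there is no output, so kubelet's JSON unmarshal fails with "unexpected end of JSON input". The noise stops once the flexvol-driver init container pulled and started just above installs the driver. For context, a FlexVolume driver answers init with a small JSON status object on stdout. The sketch below is illustrative only (it is not Calico's nodeagent~uds binary); it just shows the shape of a reply kubelet can parse:

    // flexvol_init_sketch.go: illustrative FlexVolume-style driver stub, not Calico's uds binary.
    // It prints the kind of JSON status object kubelet expects back from the "init" call.
    package main

    import (
        "encoding/json"
        "fmt"
        "os"
    )

    type driverStatus struct {
        Status       string          `json:"status"`
        Message      string          `json:"message,omitempty"`
        Capabilities map[string]bool `json:"capabilities,omitempty"`
    }

    func main() {
        if len(os.Args) > 1 && os.Args[1] == "init" {
            out, _ := json.Marshal(driverStatus{Status: "Success", Capabilities: map[string]bool{"attach": false}})
            fmt.Println(string(out)) // e.g. {"status":"Success","capabilities":{"attach":false}}
            return
        }
        // Unhandled calls still answer in JSON so the caller has something to parse.
        out, _ := json.Marshal(driverStatus{Status: "Not supported"})
        fmt.Println(string(out))
    }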
Jan 14 00:08:16.874000 audit: BPF prog-id=164 op=LOAD Jan 14 00:08:16.874000 audit[3556]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3374 pid=3556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:16.874000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135326631326362353766656662383234356132376130326163313232 Jan 14 00:08:16.875000 audit: BPF prog-id=165 op=LOAD Jan 14 00:08:16.875000 audit[3556]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3374 pid=3556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:16.875000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135326631326362353766656662383234356132376130326163313232 Jan 14 00:08:16.875000 audit: BPF prog-id=165 op=UNLOAD Jan 14 00:08:16.875000 audit[3556]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:16.875000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135326631326362353766656662383234356132376130326163313232 Jan 14 00:08:16.875000 audit: BPF prog-id=164 op=UNLOAD Jan 14 00:08:16.875000 audit[3556]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:16.875000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135326631326362353766656662383234356132376130326163313232 Jan 14 00:08:16.876000 audit: BPF prog-id=166 op=LOAD Jan 14 00:08:16.876000 audit[3556]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3374 pid=3556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:16.876000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6135326631326362353766656662383234356132376130326163313232 Jan 14 00:08:16.902168 containerd[1590]: time="2026-01-14T00:08:16.902130666Z" level=info msg="StartContainer for 
\"a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2\" returns successfully" Jan 14 00:08:16.920703 systemd[1]: cri-containerd-a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2.scope: Deactivated successfully. Jan 14 00:08:16.924000 audit: BPF prog-id=166 op=UNLOAD Jan 14 00:08:16.927299 containerd[1590]: time="2026-01-14T00:08:16.927219950Z" level=info msg="received container exit event container_id:\"a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2\" id:\"a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2\" pid:3569 exited_at:{seconds:1768349296 nanos:926585727}" Jan 14 00:08:16.951597 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2-rootfs.mount: Deactivated successfully. Jan 14 00:08:17.458334 containerd[1590]: time="2026-01-14T00:08:17.458293907Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 14 00:08:17.481772 kubelet[2832]: I0114 00:08:17.481433 2832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b486c66d5-cdrrg" podStartSLOduration=3.126391606 podStartE2EDuration="5.481412076s" podCreationTimestamp="2026-01-14 00:08:12 +0000 UTC" firstStartedPulling="2026-01-14 00:08:12.794585119 +0000 UTC m=+26.596996426" lastFinishedPulling="2026-01-14 00:08:15.149605589 +0000 UTC m=+28.952016896" observedRunningTime="2026-01-14 00:08:15.492369649 +0000 UTC m=+29.294781076" watchObservedRunningTime="2026-01-14 00:08:17.481412076 +0000 UTC m=+31.283823343" Jan 14 00:08:18.336579 kubelet[2832]: E0114 00:08:18.336157 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:08:20.332350 kubelet[2832]: E0114 00:08:20.332304 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:08:20.878544 containerd[1590]: time="2026-01-14T00:08:20.878299196Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:20.879708 containerd[1590]: time="2026-01-14T00:08:20.879640307Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 14 00:08:20.880569 containerd[1590]: time="2026-01-14T00:08:20.880457144Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:20.883365 containerd[1590]: time="2026-01-14T00:08:20.883230500Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:20.884284 containerd[1590]: time="2026-01-14T00:08:20.884099704Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag 
\"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.42576215s" Jan 14 00:08:20.884284 containerd[1590]: time="2026-01-14T00:08:20.884148431Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 14 00:08:20.890412 containerd[1590]: time="2026-01-14T00:08:20.890375839Z" level=info msg="CreateContainer within sandbox \"9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 14 00:08:20.905513 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2384129664.mount: Deactivated successfully. Jan 14 00:08:20.905893 containerd[1590]: time="2026-01-14T00:08:20.905774637Z" level=info msg="Container 326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:08:20.918146 containerd[1590]: time="2026-01-14T00:08:20.918070872Z" level=info msg="CreateContainer within sandbox \"9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b\"" Jan 14 00:08:20.920539 containerd[1590]: time="2026-01-14T00:08:20.918790815Z" level=info msg="StartContainer for \"326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b\"" Jan 14 00:08:20.921075 containerd[1590]: time="2026-01-14T00:08:20.921045256Z" level=info msg="connecting to shim 326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b" address="unix:///run/containerd/s/621e46b795ba93a712d09a908cef29e1ca946d09b47cec9fd28e90c2c6d919d9" protocol=ttrpc version=3 Jan 14 00:08:20.943784 systemd[1]: Started cri-containerd-326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b.scope - libcontainer container 326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b. 
Jan 14 00:08:20.997000 audit: BPF prog-id=167 op=LOAD Jan 14 00:08:20.999802 kernel: kauditd_printk_skb: 28 callbacks suppressed Jan 14 00:08:20.999859 kernel: audit: type=1334 audit(1768349300.997:563): prog-id=167 op=LOAD Jan 14 00:08:20.997000 audit[3614]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3374 pid=3614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:21.002490 kernel: audit: type=1300 audit(1768349300.997:563): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3374 pid=3614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:20.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332366433376466653730613130373238353862396232376464353563 Jan 14 00:08:21.005441 kernel: audit: type=1327 audit(1768349300.997:563): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332366433376466653730613130373238353862396232376464353563 Jan 14 00:08:21.006208 kernel: audit: type=1334 audit(1768349300.997:564): prog-id=168 op=LOAD Jan 14 00:08:21.006276 kernel: audit: type=1300 audit(1768349300.997:564): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3374 pid=3614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:20.997000 audit: BPF prog-id=168 op=LOAD Jan 14 00:08:20.997000 audit[3614]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3374 pid=3614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:21.010663 kernel: audit: type=1327 audit(1768349300.997:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332366433376466653730613130373238353862396232376464353563 Jan 14 00:08:21.010812 kernel: audit: type=1334 audit(1768349300.999:565): prog-id=168 op=UNLOAD Jan 14 00:08:20.997000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332366433376466653730613130373238353862396232376464353563 Jan 14 00:08:20.999000 audit: BPF prog-id=168 op=UNLOAD Jan 14 00:08:20.999000 audit[3614]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:21.013231 kernel: audit: type=1300 
audit(1768349300.999:565): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:20.999000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332366433376466653730613130373238353862396232376464353563 Jan 14 00:08:21.015463 kernel: audit: type=1327 audit(1768349300.999:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332366433376466653730613130373238353862396232376464353563 Jan 14 00:08:20.999000 audit: BPF prog-id=167 op=UNLOAD Jan 14 00:08:21.016566 kernel: audit: type=1334 audit(1768349300.999:566): prog-id=167 op=UNLOAD Jan 14 00:08:20.999000 audit[3614]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:20.999000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332366433376466653730613130373238353862396232376464353563 Jan 14 00:08:20.999000 audit: BPF prog-id=169 op=LOAD Jan 14 00:08:20.999000 audit[3614]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3374 pid=3614 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:20.999000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332366433376466653730613130373238353862396232376464353563 Jan 14 00:08:21.038011 containerd[1590]: time="2026-01-14T00:08:21.037963083Z" level=info msg="StartContainer for \"326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b\" returns successfully" Jan 14 00:08:21.670800 containerd[1590]: time="2026-01-14T00:08:21.670732908Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 14 00:08:21.674061 systemd[1]: cri-containerd-326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b.scope: Deactivated successfully. Jan 14 00:08:21.674430 systemd[1]: cri-containerd-326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b.scope: Consumed 536ms CPU time, 187.6M memory peak, 165.9M written to disk. 
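The audit records interleaved above are emitted while runc (pid 3614, child of the containerd shim, ppid 3374) sets up the new container: arch=c00000b7 is AArch64, syscall 280 is bpf(2) on arm64, and the LOAD entries are runc loading its cgroup v2 device-filter programs (the matching UNLOADs are logged when the returned program fds are closed). The PROCTITLE field is the process's argv, hex-encoded with NUL separators; the value repeated above decodes to roughly runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/<truncated container id>. A small stand-alone decoder (not part of auditd, just a convenience) could look like:

    // proctitle_decode.go: decode an audit PROCTITLE hex value into a readable command line.
    package main

    import (
        "encoding/hex"
        "fmt"
        "os"
        "strings"
    )

    func main() {
        if len(os.Args) != 2 {
            fmt.Fprintln(os.Stderr, "usage: proctitle_decode <hex>")
            os.Exit(1)
        }
        raw, err := hex.DecodeString(os.Args[1])
        if err != nil {
            fmt.Fprintln(os.Stderr, "bad hex:", err)
            os.Exit(1)
        }
        // argv elements are separated by NUL bytes; join them with spaces for display.
        fmt.Println(strings.Join(strings.Split(string(raw), "\x00"), " "))
    }

Example: go run proctitle_decode.go 72756E63002D2D726F6F74... prints the runc command line shown above.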
Jan 14 00:08:21.677000 audit: BPF prog-id=169 op=UNLOAD Jan 14 00:08:21.679191 containerd[1590]: time="2026-01-14T00:08:21.678938327Z" level=info msg="received container exit event container_id:\"326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b\" id:\"326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b\" pid:3626 exited_at:{seconds:1768349301 nanos:678102491}" Jan 14 00:08:21.701154 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b-rootfs.mount: Deactivated successfully. Jan 14 00:08:21.734598 kubelet[2832]: I0114 00:08:21.734551 2832 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Jan 14 00:08:21.792840 systemd[1]: Created slice kubepods-burstable-podbf272ec4_219f_4fe3_8789_4c7d13ac9aa1.slice - libcontainer container kubepods-burstable-podbf272ec4_219f_4fe3_8789_4c7d13ac9aa1.slice. Jan 14 00:08:21.802699 systemd[1]: Created slice kubepods-burstable-pod56a52ed7_5f85_4305_9063_0c678613578e.slice - libcontainer container kubepods-burstable-pod56a52ed7_5f85_4305_9063_0c678613578e.slice. Jan 14 00:08:21.809934 kubelet[2832]: I0114 00:08:21.809897 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4g4j\" (UniqueName: \"kubernetes.io/projected/56a52ed7-5f85-4305-9063-0c678613578e-kube-api-access-h4g4j\") pod \"coredns-674b8bbfcf-hjxff\" (UID: \"56a52ed7-5f85-4305-9063-0c678613578e\") " pod="kube-system/coredns-674b8bbfcf-hjxff" Jan 14 00:08:21.809934 kubelet[2832]: I0114 00:08:21.809938 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5hzh\" (UniqueName: \"kubernetes.io/projected/bf272ec4-219f-4fe3-8789-4c7d13ac9aa1-kube-api-access-l5hzh\") pod \"coredns-674b8bbfcf-dj86q\" (UID: \"bf272ec4-219f-4fe3-8789-4c7d13ac9aa1\") " pod="kube-system/coredns-674b8bbfcf-dj86q" Jan 14 00:08:21.810955 kubelet[2832]: I0114 00:08:21.809961 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bf272ec4-219f-4fe3-8789-4c7d13ac9aa1-config-volume\") pod \"coredns-674b8bbfcf-dj86q\" (UID: \"bf272ec4-219f-4fe3-8789-4c7d13ac9aa1\") " pod="kube-system/coredns-674b8bbfcf-dj86q" Jan 14 00:08:21.810955 kubelet[2832]: I0114 00:08:21.809980 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/56a52ed7-5f85-4305-9063-0c678613578e-config-volume\") pod \"coredns-674b8bbfcf-hjxff\" (UID: \"56a52ed7-5f85-4305-9063-0c678613578e\") " pod="kube-system/coredns-674b8bbfcf-hjxff" Jan 14 00:08:21.817885 systemd[1]: Created slice kubepods-besteffort-pod5ee70bb0_55b7_4a80_b5cb_3133091615ae.slice - libcontainer container kubepods-besteffort-pod5ee70bb0_55b7_4a80_b5cb_3133091615ae.slice. Jan 14 00:08:21.839024 systemd[1]: Created slice kubepods-besteffort-pod1e4bec8e_a684_46cb_852e_ae05ed7b56d7.slice - libcontainer container kubepods-besteffort-pod1e4bec8e_a684_46cb_852e_ae05ed7b56d7.slice. Jan 14 00:08:21.850973 systemd[1]: Created slice kubepods-besteffort-poddd5a4796_a0a6_48e1_9d05_262e15688b90.slice - libcontainer container kubepods-besteffort-poddd5a4796_a0a6_48e1_9d05_262e15688b90.slice. 
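Two related conditions drive the errors around this point: containerd cannot reload a CNI network config because install-cni has not yet written one to /etc/cni/net.d, and every RunPodSandbox attempt below fails with stat /var/lib/calico/nodename because calico-node is not running yet. Both clear on their own once the calico-node container starts. Purely as an illustration of those two checks (this is not part of Calico or kubelet), a node-side probe might look like:

    // node_net_check.go: tiny diagnostic matching the two conditions the surrounding
    // errors complain about: a CNI conflist in /etc/cni/net.d and Calico's nodename file.
    package main

    import (
        "fmt"
        "os"
        "path/filepath"
    )

    func main() {
        confs, err := filepath.Glob("/etc/cni/net.d/*.conflist")
        if err != nil || len(confs) == 0 {
            fmt.Println("no CNI conflist in /etc/cni/net.d yet (install-cni has not finished)")
        } else {
            fmt.Println("CNI config present:", confs)
        }

        if _, err := os.Stat("/var/lib/calico/nodename"); err != nil {
            fmt.Println("/var/lib/calico/nodename missing: calico-node is not running/ready yet")
        } else {
            fmt.Println("/var/lib/calico/nodename present")
        }
    }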
Jan 14 00:08:21.862631 systemd[1]: Created slice kubepods-besteffort-pod1e53bd66_4746_482e_bb2b_bfd29a1ef20e.slice - libcontainer container kubepods-besteffort-pod1e53bd66_4746_482e_bb2b_bfd29a1ef20e.slice. Jan 14 00:08:21.870837 systemd[1]: Created slice kubepods-besteffort-pod3e7b6341_e94e_4a0e_be63_0d1b1ff1d4c4.slice - libcontainer container kubepods-besteffort-pod3e7b6341_e94e_4a0e_be63_0d1b1ff1d4c4.slice. Jan 14 00:08:21.911248 kubelet[2832]: I0114 00:08:21.911189 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4-calico-apiserver-certs\") pod \"calico-apiserver-6f67969d8d-7bt2c\" (UID: \"3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4\") " pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" Jan 14 00:08:21.911248 kubelet[2832]: I0114 00:08:21.911242 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1e53bd66-4746-482e-bb2b-bfd29a1ef20e-config\") pod \"goldmane-666569f655-hrn72\" (UID: \"1e53bd66-4746-482e-bb2b-bfd29a1ef20e\") " pod="calico-system/goldmane-666569f655-hrn72" Jan 14 00:08:21.911460 kubelet[2832]: I0114 00:08:21.911300 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dd5a4796-a0a6-48e1-9d05-262e15688b90-whisker-backend-key-pair\") pod \"whisker-54764fcb-v4q5r\" (UID: \"dd5a4796-a0a6-48e1-9d05-262e15688b90\") " pod="calico-system/whisker-54764fcb-v4q5r" Jan 14 00:08:21.911460 kubelet[2832]: I0114 00:08:21.911323 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1e4bec8e-a684-46cb-852e-ae05ed7b56d7-calico-apiserver-certs\") pod \"calico-apiserver-6f67969d8d-vdxqm\" (UID: \"1e4bec8e-a684-46cb-852e-ae05ed7b56d7\") " pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" Jan 14 00:08:21.911460 kubelet[2832]: I0114 00:08:21.911340 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/1e53bd66-4746-482e-bb2b-bfd29a1ef20e-goldmane-key-pair\") pod \"goldmane-666569f655-hrn72\" (UID: \"1e53bd66-4746-482e-bb2b-bfd29a1ef20e\") " pod="calico-system/goldmane-666569f655-hrn72" Jan 14 00:08:21.911460 kubelet[2832]: I0114 00:08:21.911371 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnzjn\" (UniqueName: \"kubernetes.io/projected/dd5a4796-a0a6-48e1-9d05-262e15688b90-kube-api-access-cnzjn\") pod \"whisker-54764fcb-v4q5r\" (UID: \"dd5a4796-a0a6-48e1-9d05-262e15688b90\") " pod="calico-system/whisker-54764fcb-v4q5r" Jan 14 00:08:21.911460 kubelet[2832]: I0114 00:08:21.911390 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5a4796-a0a6-48e1-9d05-262e15688b90-whisker-ca-bundle\") pod \"whisker-54764fcb-v4q5r\" (UID: \"dd5a4796-a0a6-48e1-9d05-262e15688b90\") " pod="calico-system/whisker-54764fcb-v4q5r" Jan 14 00:08:21.912076 kubelet[2832]: I0114 00:08:21.911408 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1e53bd66-4746-482e-bb2b-bfd29a1ef20e-goldmane-ca-bundle\") pod \"goldmane-666569f655-hrn72\" (UID: \"1e53bd66-4746-482e-bb2b-bfd29a1ef20e\") " pod="calico-system/goldmane-666569f655-hrn72" Jan 14 00:08:21.912076 kubelet[2832]: I0114 00:08:21.911451 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md5mt\" (UniqueName: \"kubernetes.io/projected/1e4bec8e-a684-46cb-852e-ae05ed7b56d7-kube-api-access-md5mt\") pod \"calico-apiserver-6f67969d8d-vdxqm\" (UID: \"1e4bec8e-a684-46cb-852e-ae05ed7b56d7\") " pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" Jan 14 00:08:21.912076 kubelet[2832]: I0114 00:08:21.911483 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ee70bb0-55b7-4a80-b5cb-3133091615ae-tigera-ca-bundle\") pod \"calico-kube-controllers-b44cc6f4-gxl6c\" (UID: \"5ee70bb0-55b7-4a80-b5cb-3133091615ae\") " pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" Jan 14 00:08:21.912076 kubelet[2832]: I0114 00:08:21.911506 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p28dx\" (UniqueName: \"kubernetes.io/projected/1e53bd66-4746-482e-bb2b-bfd29a1ef20e-kube-api-access-p28dx\") pod \"goldmane-666569f655-hrn72\" (UID: \"1e53bd66-4746-482e-bb2b-bfd29a1ef20e\") " pod="calico-system/goldmane-666569f655-hrn72" Jan 14 00:08:21.912369 kubelet[2832]: I0114 00:08:21.912328 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r74wh\" (UniqueName: \"kubernetes.io/projected/5ee70bb0-55b7-4a80-b5cb-3133091615ae-kube-api-access-r74wh\") pod \"calico-kube-controllers-b44cc6f4-gxl6c\" (UID: \"5ee70bb0-55b7-4a80-b5cb-3133091615ae\") " pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" Jan 14 00:08:21.912571 kubelet[2832]: I0114 00:08:21.912415 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cr65d\" (UniqueName: \"kubernetes.io/projected/3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4-kube-api-access-cr65d\") pod \"calico-apiserver-6f67969d8d-7bt2c\" (UID: \"3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4\") " pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" Jan 14 00:08:22.099477 containerd[1590]: time="2026-01-14T00:08:22.099360801Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dj86q,Uid:bf272ec4-219f-4fe3-8789-4c7d13ac9aa1,Namespace:kube-system,Attempt:0,}" Jan 14 00:08:22.109549 containerd[1590]: time="2026-01-14T00:08:22.109469087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hjxff,Uid:56a52ed7-5f85-4305-9063-0c678613578e,Namespace:kube-system,Attempt:0,}" Jan 14 00:08:22.125326 containerd[1590]: time="2026-01-14T00:08:22.125245419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b44cc6f4-gxl6c,Uid:5ee70bb0-55b7-4a80-b5cb-3133091615ae,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:22.149020 containerd[1590]: time="2026-01-14T00:08:22.148767677Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f67969d8d-vdxqm,Uid:1e4bec8e-a684-46cb-852e-ae05ed7b56d7,Namespace:calico-apiserver,Attempt:0,}" Jan 14 00:08:22.156589 containerd[1590]: time="2026-01-14T00:08:22.156510523Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-54764fcb-v4q5r,Uid:dd5a4796-a0a6-48e1-9d05-262e15688b90,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:22.167269 containerd[1590]: time="2026-01-14T00:08:22.167199168Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-hrn72,Uid:1e53bd66-4746-482e-bb2b-bfd29a1ef20e,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:22.175281 containerd[1590]: time="2026-01-14T00:08:22.175241134Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f67969d8d-7bt2c,Uid:3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4,Namespace:calico-apiserver,Attempt:0,}" Jan 14 00:08:22.273027 containerd[1590]: time="2026-01-14T00:08:22.272900010Z" level=error msg="Failed to destroy network for sandbox \"f589e3d54e6447286ec101b79c8f98500dca67c464d6d93829e17b3190d95fbd\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.281451 containerd[1590]: time="2026-01-14T00:08:22.280572967Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hjxff,Uid:56a52ed7-5f85-4305-9063-0c678613578e,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f589e3d54e6447286ec101b79c8f98500dca67c464d6d93829e17b3190d95fbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.281641 kubelet[2832]: E0114 00:08:22.281362 2832 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f589e3d54e6447286ec101b79c8f98500dca67c464d6d93829e17b3190d95fbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.281641 kubelet[2832]: E0114 00:08:22.281465 2832 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f589e3d54e6447286ec101b79c8f98500dca67c464d6d93829e17b3190d95fbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hjxff" Jan 14 00:08:22.281641 kubelet[2832]: E0114 00:08:22.281492 2832 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f589e3d54e6447286ec101b79c8f98500dca67c464d6d93829e17b3190d95fbd\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-hjxff" Jan 14 00:08:22.281740 kubelet[2832]: E0114 00:08:22.281565 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-hjxff_kube-system(56a52ed7-5f85-4305-9063-0c678613578e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-hjxff_kube-system(56a52ed7-5f85-4305-9063-0c678613578e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f589e3d54e6447286ec101b79c8f98500dca67c464d6d93829e17b3190d95fbd\\\": plugin 
type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-hjxff" podUID="56a52ed7-5f85-4305-9063-0c678613578e" Jan 14 00:08:22.347398 systemd[1]: Created slice kubepods-besteffort-pod6c288445_910a_4d1d_9b62_12f5155b11be.slice - libcontainer container kubepods-besteffort-pod6c288445_910a_4d1d_9b62_12f5155b11be.slice. Jan 14 00:08:22.359610 containerd[1590]: time="2026-01-14T00:08:22.359221914Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8jmff,Uid:6c288445-910a-4d1d-9b62-12f5155b11be,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:22.367402 containerd[1590]: time="2026-01-14T00:08:22.367004006Z" level=error msg="Failed to destroy network for sandbox \"5b7cbd8f7a5ec6c0d98e460ed0a9c14bb4b25030a4eaa81b33d01aa064b0b160\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.382013 containerd[1590]: time="2026-01-14T00:08:22.381904179Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dj86q,Uid:bf272ec4-219f-4fe3-8789-4c7d13ac9aa1,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b7cbd8f7a5ec6c0d98e460ed0a9c14bb4b25030a4eaa81b33d01aa064b0b160\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.382633 kubelet[2832]: E0114 00:08:22.382600 2832 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b7cbd8f7a5ec6c0d98e460ed0a9c14bb4b25030a4eaa81b33d01aa064b0b160\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.382722 kubelet[2832]: E0114 00:08:22.382675 2832 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b7cbd8f7a5ec6c0d98e460ed0a9c14bb4b25030a4eaa81b33d01aa064b0b160\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dj86q" Jan 14 00:08:22.382722 kubelet[2832]: E0114 00:08:22.382696 2832 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5b7cbd8f7a5ec6c0d98e460ed0a9c14bb4b25030a4eaa81b33d01aa064b0b160\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-dj86q" Jan 14 00:08:22.382795 kubelet[2832]: E0114 00:08:22.382761 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-dj86q_kube-system(bf272ec4-219f-4fe3-8789-4c7d13ac9aa1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-dj86q_kube-system(bf272ec4-219f-4fe3-8789-4c7d13ac9aa1)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox 
\\\"5b7cbd8f7a5ec6c0d98e460ed0a9c14bb4b25030a4eaa81b33d01aa064b0b160\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-dj86q" podUID="bf272ec4-219f-4fe3-8789-4c7d13ac9aa1" Jan 14 00:08:22.401214 containerd[1590]: time="2026-01-14T00:08:22.401082210Z" level=error msg="Failed to destroy network for sandbox \"324ac3c41f68d71a0d5edba9bcf0673e1f250fc1305c0a6b28c1cbf33e11aaba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.407717 containerd[1590]: time="2026-01-14T00:08:22.407665140Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b44cc6f4-gxl6c,Uid:5ee70bb0-55b7-4a80-b5cb-3133091615ae,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"324ac3c41f68d71a0d5edba9bcf0673e1f250fc1305c0a6b28c1cbf33e11aaba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.408370 kubelet[2832]: E0114 00:08:22.408285 2832 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"324ac3c41f68d71a0d5edba9bcf0673e1f250fc1305c0a6b28c1cbf33e11aaba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.408370 kubelet[2832]: E0114 00:08:22.408361 2832 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"324ac3c41f68d71a0d5edba9bcf0673e1f250fc1305c0a6b28c1cbf33e11aaba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" Jan 14 00:08:22.408482 kubelet[2832]: E0114 00:08:22.408384 2832 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"324ac3c41f68d71a0d5edba9bcf0673e1f250fc1305c0a6b28c1cbf33e11aaba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" Jan 14 00:08:22.408482 kubelet[2832]: E0114 00:08:22.408438 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-b44cc6f4-gxl6c_calico-system(5ee70bb0-55b7-4a80-b5cb-3133091615ae)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-b44cc6f4-gxl6c_calico-system(5ee70bb0-55b7-4a80-b5cb-3133091615ae)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"324ac3c41f68d71a0d5edba9bcf0673e1f250fc1305c0a6b28c1cbf33e11aaba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" 
pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:08:22.415471 containerd[1590]: time="2026-01-14T00:08:22.415307092Z" level=error msg="Failed to destroy network for sandbox \"dd38219e776a5bd98baa1ac34dfefae0b5a3bd8ed6d363a4d4484c352321039d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.419902 containerd[1590]: time="2026-01-14T00:08:22.419848826Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f67969d8d-vdxqm,Uid:1e4bec8e-a684-46cb-852e-ae05ed7b56d7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd38219e776a5bd98baa1ac34dfefae0b5a3bd8ed6d363a4d4484c352321039d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.420397 kubelet[2832]: E0114 00:08:22.420354 2832 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd38219e776a5bd98baa1ac34dfefae0b5a3bd8ed6d363a4d4484c352321039d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.420474 kubelet[2832]: E0114 00:08:22.420412 2832 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd38219e776a5bd98baa1ac34dfefae0b5a3bd8ed6d363a4d4484c352321039d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" Jan 14 00:08:22.420474 kubelet[2832]: E0114 00:08:22.420432 2832 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dd38219e776a5bd98baa1ac34dfefae0b5a3bd8ed6d363a4d4484c352321039d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" Jan 14 00:08:22.420575 kubelet[2832]: E0114 00:08:22.420487 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f67969d8d-vdxqm_calico-apiserver(1e4bec8e-a684-46cb-852e-ae05ed7b56d7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f67969d8d-vdxqm_calico-apiserver(1e4bec8e-a684-46cb-852e-ae05ed7b56d7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dd38219e776a5bd98baa1ac34dfefae0b5a3bd8ed6d363a4d4484c352321039d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:08:22.438808 containerd[1590]: time="2026-01-14T00:08:22.438665529Z" level=error msg="Failed to destroy network for sandbox 
\"43f1081ac782b951f282c4bad196f4e1c40ce13642d6a01b9d0ab2d486a489fb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.441655 containerd[1590]: time="2026-01-14T00:08:22.441604966Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-54764fcb-v4q5r,Uid:dd5a4796-a0a6-48e1-9d05-262e15688b90,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"43f1081ac782b951f282c4bad196f4e1c40ce13642d6a01b9d0ab2d486a489fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.442094 kubelet[2832]: E0114 00:08:22.442047 2832 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43f1081ac782b951f282c4bad196f4e1c40ce13642d6a01b9d0ab2d486a489fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.442164 kubelet[2832]: E0114 00:08:22.442108 2832 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43f1081ac782b951f282c4bad196f4e1c40ce13642d6a01b9d0ab2d486a489fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54764fcb-v4q5r" Jan 14 00:08:22.442164 kubelet[2832]: E0114 00:08:22.442129 2832 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"43f1081ac782b951f282c4bad196f4e1c40ce13642d6a01b9d0ab2d486a489fb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-54764fcb-v4q5r" Jan 14 00:08:22.442226 kubelet[2832]: E0114 00:08:22.442188 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-54764fcb-v4q5r_calico-system(dd5a4796-a0a6-48e1-9d05-262e15688b90)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-54764fcb-v4q5r_calico-system(dd5a4796-a0a6-48e1-9d05-262e15688b90)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"43f1081ac782b951f282c4bad196f4e1c40ce13642d6a01b9d0ab2d486a489fb\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-54764fcb-v4q5r" podUID="dd5a4796-a0a6-48e1-9d05-262e15688b90" Jan 14 00:08:22.448330 containerd[1590]: time="2026-01-14T00:08:22.448259305Z" level=error msg="Failed to destroy network for sandbox \"bae60a5182051637e00993b012fd39e58f0ea30c0fd01811910512ec605d1b08\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.454139 containerd[1590]: time="2026-01-14T00:08:22.454075571Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-hrn72,Uid:1e53bd66-4746-482e-bb2b-bfd29a1ef20e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bae60a5182051637e00993b012fd39e58f0ea30c0fd01811910512ec605d1b08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.454664 kubelet[2832]: E0114 00:08:22.454614 2832 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bae60a5182051637e00993b012fd39e58f0ea30c0fd01811910512ec605d1b08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.454753 kubelet[2832]: E0114 00:08:22.454679 2832 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bae60a5182051637e00993b012fd39e58f0ea30c0fd01811910512ec605d1b08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-hrn72" Jan 14 00:08:22.454753 kubelet[2832]: E0114 00:08:22.454699 2832 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bae60a5182051637e00993b012fd39e58f0ea30c0fd01811910512ec605d1b08\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-hrn72" Jan 14 00:08:22.454797 kubelet[2832]: E0114 00:08:22.454743 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-hrn72_calico-system(1e53bd66-4746-482e-bb2b-bfd29a1ef20e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-hrn72_calico-system(1e53bd66-4746-482e-bb2b-bfd29a1ef20e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bae60a5182051637e00993b012fd39e58f0ea30c0fd01811910512ec605d1b08\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:08:22.460592 containerd[1590]: time="2026-01-14T00:08:22.460543885Z" level=error msg="Failed to destroy network for sandbox \"2708646fe2349ca476910a32bdc07c5a2dcc20d8bf5957c723fedac67c4b1d36\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.463843 containerd[1590]: time="2026-01-14T00:08:22.463673588Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f67969d8d-7bt2c,Uid:3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2708646fe2349ca476910a32bdc07c5a2dcc20d8bf5957c723fedac67c4b1d36\": plugin type=\"calico\" 
failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.464183 kubelet[2832]: E0114 00:08:22.464142 2832 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2708646fe2349ca476910a32bdc07c5a2dcc20d8bf5957c723fedac67c4b1d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.464450 kubelet[2832]: E0114 00:08:22.464380 2832 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2708646fe2349ca476910a32bdc07c5a2dcc20d8bf5957c723fedac67c4b1d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" Jan 14 00:08:22.464450 kubelet[2832]: E0114 00:08:22.464414 2832 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2708646fe2349ca476910a32bdc07c5a2dcc20d8bf5957c723fedac67c4b1d36\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" Jan 14 00:08:22.464853 kubelet[2832]: E0114 00:08:22.464669 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6f67969d8d-7bt2c_calico-apiserver(3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6f67969d8d-7bt2c_calico-apiserver(3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2708646fe2349ca476910a32bdc07c5a2dcc20d8bf5957c723fedac67c4b1d36\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:08:22.488430 containerd[1590]: time="2026-01-14T00:08:22.487496167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 14 00:08:22.495926 containerd[1590]: time="2026-01-14T00:08:22.495874859Z" level=error msg="Failed to destroy network for sandbox \"4da0e33b06cde3f39a5f079029e8bcb8126cd2cceb244fc3b0b6143efbbabc00\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.498662 containerd[1590]: time="2026-01-14T00:08:22.498611229Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8jmff,Uid:6c288445-910a-4d1d-9b62-12f5155b11be,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4da0e33b06cde3f39a5f079029e8bcb8126cd2cceb244fc3b0b6143efbbabc00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" Jan 14 00:08:22.500920 kubelet[2832]: E0114 00:08:22.499096 2832 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4da0e33b06cde3f39a5f079029e8bcb8126cd2cceb244fc3b0b6143efbbabc00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 14 00:08:22.500920 kubelet[2832]: E0114 00:08:22.499160 2832 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4da0e33b06cde3f39a5f079029e8bcb8126cd2cceb244fc3b0b6143efbbabc00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8jmff" Jan 14 00:08:22.500920 kubelet[2832]: E0114 00:08:22.499194 2832 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4da0e33b06cde3f39a5f079029e8bcb8126cd2cceb244fc3b0b6143efbbabc00\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-8jmff" Jan 14 00:08:22.501100 kubelet[2832]: E0114 00:08:22.499246 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4da0e33b06cde3f39a5f079029e8bcb8126cd2cceb244fc3b0b6143efbbabc00\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:08:30.152956 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2865274290.mount: Deactivated successfully. 
Jan 14 00:08:30.175865 containerd[1590]: time="2026-01-14T00:08:30.175796522Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:30.177293 containerd[1590]: time="2026-01-14T00:08:30.177154395Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 14 00:08:30.178602 containerd[1590]: time="2026-01-14T00:08:30.178383694Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:30.180840 containerd[1590]: time="2026-01-14T00:08:30.180780524Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 14 00:08:30.181798 containerd[1590]: time="2026-01-14T00:08:30.181656303Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 7.693495527s" Jan 14 00:08:30.181798 containerd[1590]: time="2026-01-14T00:08:30.181692947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 14 00:08:30.205756 containerd[1590]: time="2026-01-14T00:08:30.205504516Z" level=info msg="CreateContainer within sandbox \"9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 14 00:08:30.222619 containerd[1590]: time="2026-01-14T00:08:30.222572003Z" level=info msg="Container bd67ebed549388e0f56e5a453e43e855e29a3416e724d84afe08846d49715c39: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:08:30.242168 containerd[1590]: time="2026-01-14T00:08:30.242105488Z" level=info msg="CreateContainer within sandbox \"9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"bd67ebed549388e0f56e5a453e43e855e29a3416e724d84afe08846d49715c39\"" Jan 14 00:08:30.245553 containerd[1590]: time="2026-01-14T00:08:30.244592169Z" level=info msg="StartContainer for \"bd67ebed549388e0f56e5a453e43e855e29a3416e724d84afe08846d49715c39\"" Jan 14 00:08:30.247500 containerd[1590]: time="2026-01-14T00:08:30.247452012Z" level=info msg="connecting to shim bd67ebed549388e0f56e5a453e43e855e29a3416e724d84afe08846d49715c39" address="unix:///run/containerd/s/621e46b795ba93a712d09a908cef29e1ca946d09b47cec9fd28e90c2c6d919d9" protocol=ttrpc version=3 Jan 14 00:08:30.277964 systemd[1]: Started cri-containerd-bd67ebed549388e0f56e5a453e43e855e29a3416e724d84afe08846d49715c39.scope - libcontainer container bd67ebed549388e0f56e5a453e43e855e29a3416e724d84afe08846d49715c39. 
Jan 14 00:08:30.343787 kernel: kauditd_printk_skb: 6 callbacks suppressed Jan 14 00:08:30.343907 kernel: audit: type=1334 audit(1768349310.341:569): prog-id=170 op=LOAD Jan 14 00:08:30.341000 audit: BPF prog-id=170 op=LOAD Jan 14 00:08:30.346597 kernel: audit: type=1300 audit(1768349310.341:569): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3374 pid=3878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:30.341000 audit[3878]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3374 pid=3878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:30.341000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264363765626564353439333838653066353665356134353365343365 Jan 14 00:08:30.349067 kernel: audit: type=1327 audit(1768349310.341:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264363765626564353439333838653066353665356134353365343365 Jan 14 00:08:30.345000 audit: BPF prog-id=171 op=LOAD Jan 14 00:08:30.349769 kernel: audit: type=1334 audit(1768349310.345:570): prog-id=171 op=LOAD Jan 14 00:08:30.352383 kernel: audit: type=1300 audit(1768349310.345:570): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3374 pid=3878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:30.345000 audit[3878]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3374 pid=3878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:30.345000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264363765626564353439333838653066353665356134353365343365 Jan 14 00:08:30.356580 kernel: audit: type=1327 audit(1768349310.345:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264363765626564353439333838653066353665356134353365343365 Jan 14 00:08:30.345000 audit: BPF prog-id=171 op=UNLOAD Jan 14 00:08:30.358450 kernel: audit: type=1334 audit(1768349310.345:571): prog-id=171 op=UNLOAD Jan 14 00:08:30.361821 kernel: audit: type=1300 audit(1768349310.345:571): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:30.345000 
audit[3878]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:30.345000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264363765626564353439333838653066353665356134353365343365 Jan 14 00:08:30.365600 kernel: audit: type=1327 audit(1768349310.345:571): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264363765626564353439333838653066353665356134353365343365 Jan 14 00:08:30.345000 audit: BPF prog-id=170 op=UNLOAD Jan 14 00:08:30.366945 kernel: audit: type=1334 audit(1768349310.345:572): prog-id=170 op=UNLOAD Jan 14 00:08:30.345000 audit[3878]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3374 pid=3878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:30.345000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264363765626564353439333838653066353665356134353365343365 Jan 14 00:08:30.345000 audit: BPF prog-id=172 op=LOAD Jan 14 00:08:30.345000 audit[3878]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3374 pid=3878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:30.345000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6264363765626564353439333838653066353665356134353365343365 Jan 14 00:08:30.388563 containerd[1590]: time="2026-01-14T00:08:30.388420967Z" level=info msg="StartContainer for \"bd67ebed549388e0f56e5a453e43e855e29a3416e724d84afe08846d49715c39\" returns successfully" Jan 14 00:08:30.557051 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 14 00:08:30.557174 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jan 14 00:08:30.569580 kubelet[2832]: I0114 00:08:30.568886 2832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-v4mx8" podStartSLOduration=1.319701884 podStartE2EDuration="18.568826576s" podCreationTimestamp="2026-01-14 00:08:12 +0000 UTC" firstStartedPulling="2026-01-14 00:08:12.933656738 +0000 UTC m=+26.736068045" lastFinishedPulling="2026-01-14 00:08:30.18278143 +0000 UTC m=+43.985192737" observedRunningTime="2026-01-14 00:08:30.55974355 +0000 UTC m=+44.362154857" watchObservedRunningTime="2026-01-14 00:08:30.568826576 +0000 UTC m=+44.371237883" Jan 14 00:08:30.785545 kubelet[2832]: I0114 00:08:30.785256 2832 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dd5a4796-a0a6-48e1-9d05-262e15688b90-whisker-backend-key-pair\") pod \"dd5a4796-a0a6-48e1-9d05-262e15688b90\" (UID: \"dd5a4796-a0a6-48e1-9d05-262e15688b90\") " Jan 14 00:08:30.785545 kubelet[2832]: I0114 00:08:30.785307 2832 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cnzjn\" (UniqueName: \"kubernetes.io/projected/dd5a4796-a0a6-48e1-9d05-262e15688b90-kube-api-access-cnzjn\") pod \"dd5a4796-a0a6-48e1-9d05-262e15688b90\" (UID: \"dd5a4796-a0a6-48e1-9d05-262e15688b90\") " Jan 14 00:08:30.785545 kubelet[2832]: I0114 00:08:30.785335 2832 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5a4796-a0a6-48e1-9d05-262e15688b90-whisker-ca-bundle\") pod \"dd5a4796-a0a6-48e1-9d05-262e15688b90\" (UID: \"dd5a4796-a0a6-48e1-9d05-262e15688b90\") " Jan 14 00:08:30.791586 kubelet[2832]: I0114 00:08:30.791481 2832 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd5a4796-a0a6-48e1-9d05-262e15688b90-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "dd5a4796-a0a6-48e1-9d05-262e15688b90" (UID: "dd5a4796-a0a6-48e1-9d05-262e15688b90"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 14 00:08:30.793808 kubelet[2832]: I0114 00:08:30.793735 2832 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd5a4796-a0a6-48e1-9d05-262e15688b90-kube-api-access-cnzjn" (OuterVolumeSpecName: "kube-api-access-cnzjn") pod "dd5a4796-a0a6-48e1-9d05-262e15688b90" (UID: "dd5a4796-a0a6-48e1-9d05-262e15688b90"). InnerVolumeSpecName "kube-api-access-cnzjn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 14 00:08:30.794097 kubelet[2832]: I0114 00:08:30.794073 2832 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd5a4796-a0a6-48e1-9d05-262e15688b90-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "dd5a4796-a0a6-48e1-9d05-262e15688b90" (UID: "dd5a4796-a0a6-48e1-9d05-262e15688b90"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 14 00:08:30.887548 kubelet[2832]: I0114 00:08:30.886510 2832 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/dd5a4796-a0a6-48e1-9d05-262e15688b90-whisker-backend-key-pair\") on node \"ci-4547-0-0-n-fb1a601aa4\" DevicePath \"\"" Jan 14 00:08:30.887548 kubelet[2832]: I0114 00:08:30.887483 2832 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cnzjn\" (UniqueName: \"kubernetes.io/projected/dd5a4796-a0a6-48e1-9d05-262e15688b90-kube-api-access-cnzjn\") on node \"ci-4547-0-0-n-fb1a601aa4\" DevicePath \"\"" Jan 14 00:08:30.887548 kubelet[2832]: I0114 00:08:30.887502 2832 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd5a4796-a0a6-48e1-9d05-262e15688b90-whisker-ca-bundle\") on node \"ci-4547-0-0-n-fb1a601aa4\" DevicePath \"\"" Jan 14 00:08:31.152051 systemd[1]: var-lib-kubelet-pods-dd5a4796\x2da0a6\x2d48e1\x2d9d05\x2d262e15688b90-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dcnzjn.mount: Deactivated successfully. Jan 14 00:08:31.152220 systemd[1]: var-lib-kubelet-pods-dd5a4796\x2da0a6\x2d48e1\x2d9d05\x2d262e15688b90-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jan 14 00:08:31.530762 systemd[1]: Removed slice kubepods-besteffort-poddd5a4796_a0a6_48e1_9d05_262e15688b90.slice - libcontainer container kubepods-besteffort-poddd5a4796_a0a6_48e1_9d05_262e15688b90.slice. Jan 14 00:08:31.609944 systemd[1]: Created slice kubepods-besteffort-pod5ea780f2_7146_4be4_95de_faccba85fdbd.slice - libcontainer container kubepods-besteffort-pod5ea780f2_7146_4be4_95de_faccba85fdbd.slice. Jan 14 00:08:31.694136 kubelet[2832]: I0114 00:08:31.694075 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5ea780f2-7146-4be4-95de-faccba85fdbd-whisker-ca-bundle\") pod \"whisker-684bfd8c46-zxdr6\" (UID: \"5ea780f2-7146-4be4-95de-faccba85fdbd\") " pod="calico-system/whisker-684bfd8c46-zxdr6" Jan 14 00:08:31.694136 kubelet[2832]: I0114 00:08:31.694126 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2vgh\" (UniqueName: \"kubernetes.io/projected/5ea780f2-7146-4be4-95de-faccba85fdbd-kube-api-access-z2vgh\") pod \"whisker-684bfd8c46-zxdr6\" (UID: \"5ea780f2-7146-4be4-95de-faccba85fdbd\") " pod="calico-system/whisker-684bfd8c46-zxdr6" Jan 14 00:08:31.694136 kubelet[2832]: I0114 00:08:31.694150 2832 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/5ea780f2-7146-4be4-95de-faccba85fdbd-whisker-backend-key-pair\") pod \"whisker-684bfd8c46-zxdr6\" (UID: \"5ea780f2-7146-4be4-95de-faccba85fdbd\") " pod="calico-system/whisker-684bfd8c46-zxdr6" Jan 14 00:08:31.914710 containerd[1590]: time="2026-01-14T00:08:31.914616640Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-684bfd8c46-zxdr6,Uid:5ea780f2-7146-4be4-95de-faccba85fdbd,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:32.176446 systemd-networkd[1476]: cali3e281ddd5ba: Link UP Jan 14 00:08:32.181195 systemd-networkd[1476]: cali3e281ddd5ba: Gained carrier Jan 14 00:08:32.221438 containerd[1590]: 2026-01-14 00:08:31.945 [INFO][3989] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 
00:08:32.221438 containerd[1590]: 2026-01-14 00:08:32.007 [INFO][3989] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0 whisker-684bfd8c46- calico-system 5ea780f2-7146-4be4-95de-faccba85fdbd 926 0 2026-01-14 00:08:31 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:684bfd8c46 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-n-fb1a601aa4 whisker-684bfd8c46-zxdr6 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali3e281ddd5ba [] [] }} ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Namespace="calico-system" Pod="whisker-684bfd8c46-zxdr6" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-" Jan 14 00:08:32.221438 containerd[1590]: 2026-01-14 00:08:32.008 [INFO][3989] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Namespace="calico-system" Pod="whisker-684bfd8c46-zxdr6" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" Jan 14 00:08:32.221438 containerd[1590]: 2026-01-14 00:08:32.083 [INFO][4009] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" HandleID="k8s-pod-network.fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" Jan 14 00:08:32.221760 containerd[1590]: 2026-01-14 00:08:32.083 [INFO][4009] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" HandleID="k8s-pod-network.fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000231600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-fb1a601aa4", "pod":"whisker-684bfd8c46-zxdr6", "timestamp":"2026-01-14 00:08:32.083490679 +0000 UTC"}, Hostname:"ci-4547-0-0-n-fb1a601aa4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 00:08:32.221760 containerd[1590]: 2026-01-14 00:08:32.083 [INFO][4009] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 00:08:32.221760 containerd[1590]: 2026-01-14 00:08:32.084 [INFO][4009] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 00:08:32.221760 containerd[1590]: 2026-01-14 00:08:32.084 [INFO][4009] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-fb1a601aa4' Jan 14 00:08:32.221760 containerd[1590]: 2026-01-14 00:08:32.099 [INFO][4009] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:32.221760 containerd[1590]: 2026-01-14 00:08:32.106 [INFO][4009] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:32.221760 containerd[1590]: 2026-01-14 00:08:32.115 [INFO][4009] ipam/ipam.go 511: Trying affinity for 192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:32.221760 containerd[1590]: 2026-01-14 00:08:32.121 [INFO][4009] ipam/ipam.go 158: Attempting to load block cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:32.221760 containerd[1590]: 2026-01-14 00:08:32.128 [INFO][4009] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:32.221950 containerd[1590]: 2026-01-14 00:08:32.128 [INFO][4009] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.73.128/26 handle="k8s-pod-network.fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:32.221950 containerd[1590]: 2026-01-14 00:08:32.131 [INFO][4009] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba Jan 14 00:08:32.221950 containerd[1590]: 2026-01-14 00:08:32.140 [INFO][4009] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.73.128/26 handle="k8s-pod-network.fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:32.221950 containerd[1590]: 2026-01-14 00:08:32.149 [INFO][4009] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.73.129/26] block=192.168.73.128/26 handle="k8s-pod-network.fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:32.221950 containerd[1590]: 2026-01-14 00:08:32.149 [INFO][4009] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.73.129/26] handle="k8s-pod-network.fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:32.221950 containerd[1590]: 2026-01-14 00:08:32.149 [INFO][4009] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 00:08:32.221950 containerd[1590]: 2026-01-14 00:08:32.150 [INFO][4009] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.73.129/26] IPv6=[] ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" HandleID="k8s-pod-network.fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" Jan 14 00:08:32.222082 containerd[1590]: 2026-01-14 00:08:32.159 [INFO][3989] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Namespace="calico-system" Pod="whisker-684bfd8c46-zxdr6" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0", GenerateName:"whisker-684bfd8c46-", Namespace:"calico-system", SelfLink:"", UID:"5ea780f2-7146-4be4-95de-faccba85fdbd", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"684bfd8c46", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"", Pod:"whisker-684bfd8c46-zxdr6", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.73.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3e281ddd5ba", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:32.222082 containerd[1590]: 2026-01-14 00:08:32.159 [INFO][3989] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.73.129/32] ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Namespace="calico-system" Pod="whisker-684bfd8c46-zxdr6" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" Jan 14 00:08:32.222150 containerd[1590]: 2026-01-14 00:08:32.160 [INFO][3989] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3e281ddd5ba ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Namespace="calico-system" Pod="whisker-684bfd8c46-zxdr6" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" Jan 14 00:08:32.222150 containerd[1590]: 2026-01-14 00:08:32.176 [INFO][3989] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Namespace="calico-system" Pod="whisker-684bfd8c46-zxdr6" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" Jan 14 00:08:32.222194 containerd[1590]: 2026-01-14 00:08:32.183 [INFO][3989] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" 
Namespace="calico-system" Pod="whisker-684bfd8c46-zxdr6" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0", GenerateName:"whisker-684bfd8c46-", Namespace:"calico-system", SelfLink:"", UID:"5ea780f2-7146-4be4-95de-faccba85fdbd", ResourceVersion:"926", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"684bfd8c46", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba", Pod:"whisker-684bfd8c46-zxdr6", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.73.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali3e281ddd5ba", MAC:"6e:cd:cd:71:52:b2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:32.222238 containerd[1590]: 2026-01-14 00:08:32.209 [INFO][3989] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" Namespace="calico-system" Pod="whisker-684bfd8c46-zxdr6" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-whisker--684bfd8c46--zxdr6-eth0" Jan 14 00:08:32.273069 containerd[1590]: time="2026-01-14T00:08:32.272943154Z" level=info msg="connecting to shim fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba" address="unix:///run/containerd/s/faab76f7c4de7ccb7057e862c45f0e22b1150bcbc831c8d124e02de47a0c8401" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:32.319776 systemd[1]: Started cri-containerd-fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba.scope - libcontainer container fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba. 
Jan 14 00:08:32.337760 kubelet[2832]: I0114 00:08:32.337694 2832 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd5a4796-a0a6-48e1-9d05-262e15688b90" path="/var/lib/kubelet/pods/dd5a4796-a0a6-48e1-9d05-262e15688b90/volumes" Jan 14 00:08:32.340000 audit: BPF prog-id=173 op=LOAD Jan 14 00:08:32.341000 audit: BPF prog-id=174 op=LOAD Jan 14 00:08:32.341000 audit[4118]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186180 a2=98 a3=0 items=0 ppid=4107 pid=4118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:32.341000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662643733666231633435363231343239656362323634626265303738 Jan 14 00:08:32.342000 audit: BPF prog-id=174 op=UNLOAD Jan 14 00:08:32.342000 audit[4118]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4107 pid=4118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:32.342000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662643733666231633435363231343239656362323634626265303738 Jan 14 00:08:32.342000 audit: BPF prog-id=175 op=LOAD Jan 14 00:08:32.342000 audit[4118]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001863e8 a2=98 a3=0 items=0 ppid=4107 pid=4118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:32.342000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662643733666231633435363231343239656362323634626265303738 Jan 14 00:08:32.343000 audit: BPF prog-id=176 op=LOAD Jan 14 00:08:32.343000 audit[4118]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000186168 a2=98 a3=0 items=0 ppid=4107 pid=4118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:32.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662643733666231633435363231343239656362323634626265303738 Jan 14 00:08:32.343000 audit: BPF prog-id=176 op=UNLOAD Jan 14 00:08:32.343000 audit[4118]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4107 pid=4118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:32.343000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662643733666231633435363231343239656362323634626265303738 Jan 14 00:08:32.343000 audit: BPF prog-id=175 op=UNLOAD Jan 14 00:08:32.343000 audit[4118]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4107 pid=4118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:32.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662643733666231633435363231343239656362323634626265303738 Jan 14 00:08:32.343000 audit: BPF prog-id=177 op=LOAD Jan 14 00:08:32.343000 audit[4118]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186648 a2=98 a3=0 items=0 ppid=4107 pid=4118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:32.343000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6662643733666231633435363231343239656362323634626265303738 Jan 14 00:08:32.395806 containerd[1590]: time="2026-01-14T00:08:32.395763172Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-684bfd8c46-zxdr6,Uid:5ea780f2-7146-4be4-95de-faccba85fdbd,Namespace:calico-system,Attempt:0,} returns sandbox id \"fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba\"" Jan 14 00:08:32.398537 containerd[1590]: time="2026-01-14T00:08:32.398125509Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 00:08:32.731300 containerd[1590]: time="2026-01-14T00:08:32.731212469Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:32.732849 containerd[1590]: time="2026-01-14T00:08:32.732730114Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 00:08:32.733014 containerd[1590]: time="2026-01-14T00:08:32.732884131Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:32.733446 kubelet[2832]: E0114 00:08:32.733254 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:08:32.733446 kubelet[2832]: E0114 00:08:32.733352 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:08:32.738158 kubelet[2832]: E0114 00:08:32.738084 
2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c290d008f4c4d48b25c8570357599ee,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:32.741725 containerd[1590]: time="2026-01-14T00:08:32.741690370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 00:08:33.095490 containerd[1590]: time="2026-01-14T00:08:33.095377124Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:33.097095 containerd[1590]: time="2026-01-14T00:08:33.097011179Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 00:08:33.097223 containerd[1590]: time="2026-01-14T00:08:33.097154754Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:33.097539 kubelet[2832]: E0114 00:08:33.097461 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:08:33.097823 kubelet[2832]: E0114 00:08:33.097569 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:08:33.097980 kubelet[2832]: E0114 00:08:33.097800 2832 
kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:33.099707 kubelet[2832]: E0114 00:08:33.099002 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:08:33.335280 containerd[1590]: time="2026-01-14T00:08:33.334932504Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hjxff,Uid:56a52ed7-5f85-4305-9063-0c678613578e,Namespace:kube-system,Attempt:0,}" Jan 14 00:08:33.336069 containerd[1590]: time="2026-01-14T00:08:33.336030421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f67969d8d-7bt2c,Uid:3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4,Namespace:calico-apiserver,Attempt:0,}" Jan 14 00:08:33.535755 systemd-networkd[1476]: cali2ad8a1c2d38: Link UP Jan 14 00:08:33.541101 
systemd-networkd[1476]: cali2ad8a1c2d38: Gained carrier Jan 14 00:08:33.553802 kubelet[2832]: E0114 00:08:33.552845 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:08:33.578733 containerd[1590]: 2026-01-14 00:08:33.393 [INFO][4186] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 00:08:33.578733 containerd[1590]: 2026-01-14 00:08:33.416 [INFO][4186] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0 calico-apiserver-6f67969d8d- calico-apiserver 3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4 855 0 2026-01-14 00:08:03 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f67969d8d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-n-fb1a601aa4 calico-apiserver-6f67969d8d-7bt2c eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2ad8a1c2d38 [] [] }} ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-7bt2c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-" Jan 14 00:08:33.578733 containerd[1590]: 2026-01-14 00:08:33.416 [INFO][4186] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-7bt2c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" Jan 14 00:08:33.578733 containerd[1590]: 2026-01-14 00:08:33.450 [INFO][4205] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" HandleID="k8s-pod-network.5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" Jan 14 00:08:33.578980 containerd[1590]: 2026-01-14 00:08:33.451 [INFO][4205] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" HandleID="k8s-pod-network.5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c15a0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-n-fb1a601aa4", 
"pod":"calico-apiserver-6f67969d8d-7bt2c", "timestamp":"2026-01-14 00:08:33.450894765 +0000 UTC"}, Hostname:"ci-4547-0-0-n-fb1a601aa4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 00:08:33.578980 containerd[1590]: 2026-01-14 00:08:33.451 [INFO][4205] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 00:08:33.578980 containerd[1590]: 2026-01-14 00:08:33.451 [INFO][4205] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 00:08:33.578980 containerd[1590]: 2026-01-14 00:08:33.451 [INFO][4205] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-fb1a601aa4' Jan 14 00:08:33.578980 containerd[1590]: 2026-01-14 00:08:33.464 [INFO][4205] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.578980 containerd[1590]: 2026-01-14 00:08:33.473 [INFO][4205] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.578980 containerd[1590]: 2026-01-14 00:08:33.481 [INFO][4205] ipam/ipam.go 511: Trying affinity for 192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.578980 containerd[1590]: 2026-01-14 00:08:33.485 [INFO][4205] ipam/ipam.go 158: Attempting to load block cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.578980 containerd[1590]: 2026-01-14 00:08:33.490 [INFO][4205] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.579226 containerd[1590]: 2026-01-14 00:08:33.490 [INFO][4205] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.73.128/26 handle="k8s-pod-network.5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.579226 containerd[1590]: 2026-01-14 00:08:33.494 [INFO][4205] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3 Jan 14 00:08:33.579226 containerd[1590]: 2026-01-14 00:08:33.501 [INFO][4205] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.73.128/26 handle="k8s-pod-network.5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.579226 containerd[1590]: 2026-01-14 00:08:33.511 [INFO][4205] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.73.130/26] block=192.168.73.128/26 handle="k8s-pod-network.5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.579226 containerd[1590]: 2026-01-14 00:08:33.511 [INFO][4205] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.73.130/26] handle="k8s-pod-network.5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.579226 containerd[1590]: 2026-01-14 00:08:33.511 [INFO][4205] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 00:08:33.579226 containerd[1590]: 2026-01-14 00:08:33.511 [INFO][4205] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.73.130/26] IPv6=[] ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" HandleID="k8s-pod-network.5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" Jan 14 00:08:33.579361 containerd[1590]: 2026-01-14 00:08:33.517 [INFO][4186] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-7bt2c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0", GenerateName:"calico-apiserver-6f67969d8d-", Namespace:"calico-apiserver", SelfLink:"", UID:"3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f67969d8d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"", Pod:"calico-apiserver-6f67969d8d-7bt2c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.73.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2ad8a1c2d38", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:33.579427 containerd[1590]: 2026-01-14 00:08:33.517 [INFO][4186] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.73.130/32] ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-7bt2c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" Jan 14 00:08:33.579427 containerd[1590]: 2026-01-14 00:08:33.517 [INFO][4186] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2ad8a1c2d38 ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-7bt2c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" Jan 14 00:08:33.579427 containerd[1590]: 2026-01-14 00:08:33.543 [INFO][4186] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-7bt2c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" Jan 14 00:08:33.579489 containerd[1590]: 2026-01-14 
00:08:33.550 [INFO][4186] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-7bt2c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0", GenerateName:"calico-apiserver-6f67969d8d-", Namespace:"calico-apiserver", SelfLink:"", UID:"3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f67969d8d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3", Pod:"calico-apiserver-6f67969d8d-7bt2c", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.73.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2ad8a1c2d38", MAC:"de:af:99:20:3d:c4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:33.579560 containerd[1590]: 2026-01-14 00:08:33.572 [INFO][4186] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-7bt2c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--7bt2c-eth0" Jan 14 00:08:33.617000 audit[4241]: NETFILTER_CFG table=filter:119 family=2 entries=22 op=nft_register_rule pid=4241 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:33.617000 audit[4241]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffe297c320 a2=0 a3=1 items=0 ppid=2942 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.617000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:33.625000 audit[4241]: NETFILTER_CFG table=nat:120 family=2 entries=12 op=nft_register_rule pid=4241 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:33.630868 containerd[1590]: time="2026-01-14T00:08:33.630815517Z" level=info msg="connecting to shim 5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3" address="unix:///run/containerd/s/0081ae074fe66658c531db857751941ae33e7e4fd435e88518c0359eaf572640" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:33.625000 audit[4241]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe297c320 a2=0 a3=1 items=0 ppid=2942 pid=4241 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.625000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:33.659183 systemd-networkd[1476]: calib6cde816088: Link UP Jan 14 00:08:33.662236 systemd-networkd[1476]: calib6cde816088: Gained carrier Jan 14 00:08:33.693539 containerd[1590]: 2026-01-14 00:08:33.390 [INFO][4184] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 00:08:33.693539 containerd[1590]: 2026-01-14 00:08:33.412 [INFO][4184] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0 coredns-674b8bbfcf- kube-system 56a52ed7-5f85-4305-9063-0c678613578e 858 0 2026-01-14 00:07:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-n-fb1a601aa4 coredns-674b8bbfcf-hjxff eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calib6cde816088 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjxff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-" Jan 14 00:08:33.693539 containerd[1590]: 2026-01-14 00:08:33.412 [INFO][4184] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjxff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" Jan 14 00:08:33.693539 containerd[1590]: 2026-01-14 00:08:33.454 [INFO][4204] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" HandleID="k8s-pod-network.53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" Jan 14 00:08:33.693818 containerd[1590]: 2026-01-14 00:08:33.454 [INFO][4204] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" HandleID="k8s-pod-network.53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3260), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-n-fb1a601aa4", "pod":"coredns-674b8bbfcf-hjxff", "timestamp":"2026-01-14 00:08:33.454064384 +0000 UTC"}, Hostname:"ci-4547-0-0-n-fb1a601aa4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 00:08:33.693818 containerd[1590]: 2026-01-14 00:08:33.454 [INFO][4204] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 14 00:08:33.693818 containerd[1590]: 2026-01-14 00:08:33.511 [INFO][4204] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 00:08:33.693818 containerd[1590]: 2026-01-14 00:08:33.511 [INFO][4204] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-fb1a601aa4' Jan 14 00:08:33.693818 containerd[1590]: 2026-01-14 00:08:33.566 [INFO][4204] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.693818 containerd[1590]: 2026-01-14 00:08:33.588 [INFO][4204] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.693818 containerd[1590]: 2026-01-14 00:08:33.599 [INFO][4204] ipam/ipam.go 511: Trying affinity for 192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.693818 containerd[1590]: 2026-01-14 00:08:33.602 [INFO][4204] ipam/ipam.go 158: Attempting to load block cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.693818 containerd[1590]: 2026-01-14 00:08:33.612 [INFO][4204] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.694059 containerd[1590]: 2026-01-14 00:08:33.612 [INFO][4204] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.73.128/26 handle="k8s-pod-network.53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.694059 containerd[1590]: 2026-01-14 00:08:33.616 [INFO][4204] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9 Jan 14 00:08:33.694059 containerd[1590]: 2026-01-14 00:08:33.628 [INFO][4204] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.73.128/26 handle="k8s-pod-network.53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.694059 containerd[1590]: 2026-01-14 00:08:33.644 [INFO][4204] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.73.131/26] block=192.168.73.128/26 handle="k8s-pod-network.53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.694059 containerd[1590]: 2026-01-14 00:08:33.644 [INFO][4204] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.73.131/26] handle="k8s-pod-network.53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:33.694059 containerd[1590]: 2026-01-14 00:08:33.644 [INFO][4204] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 00:08:33.694059 containerd[1590]: 2026-01-14 00:08:33.644 [INFO][4204] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.73.131/26] IPv6=[] ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" HandleID="k8s-pod-network.53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" Jan 14 00:08:33.694225 containerd[1590]: 2026-01-14 00:08:33.652 [INFO][4184] cni-plugin/k8s.go 418: Populated endpoint ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjxff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"56a52ed7-5f85-4305-9063-0c678613578e", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 7, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"", Pod:"coredns-674b8bbfcf-hjxff", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.73.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib6cde816088", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:33.694225 containerd[1590]: 2026-01-14 00:08:33.653 [INFO][4184] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.73.131/32] ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjxff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" Jan 14 00:08:33.694225 containerd[1590]: 2026-01-14 00:08:33.653 [INFO][4184] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calib6cde816088 ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjxff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" Jan 14 00:08:33.694225 containerd[1590]: 2026-01-14 00:08:33.662 [INFO][4184] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-hjxff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" Jan 14 00:08:33.694225 containerd[1590]: 2026-01-14 00:08:33.664 [INFO][4184] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjxff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"56a52ed7-5f85-4305-9063-0c678613578e", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 7, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9", Pod:"coredns-674b8bbfcf-hjxff", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.73.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calib6cde816088", MAC:"ba:42:56:af:21:f7", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:33.694225 containerd[1590]: 2026-01-14 00:08:33.687 [INFO][4184] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" Namespace="kube-system" Pod="coredns-674b8bbfcf-hjxff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--hjxff-eth0" Jan 14 00:08:33.702016 systemd[1]: Started cri-containerd-5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3.scope - libcontainer container 5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3. 
Jan 14 00:08:33.730876 containerd[1590]: time="2026-01-14T00:08:33.730714698Z" level=info msg="connecting to shim 53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9" address="unix:///run/containerd/s/4443f42f9dfe66b46af02c401f1bc56f9f188183bc346ae009c5d6141ee8edc6" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:33.761000 audit: BPF prog-id=178 op=LOAD Jan 14 00:08:33.761000 audit: BPF prog-id=179 op=LOAD Jan 14 00:08:33.761000 audit[4261]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4250 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.761000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561353139363331383137356662333431303031653163346232383933 Jan 14 00:08:33.762000 audit: BPF prog-id=179 op=UNLOAD Jan 14 00:08:33.762000 audit[4261]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4250 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.762000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561353139363331383137356662333431303031653163346232383933 Jan 14 00:08:33.762000 audit: BPF prog-id=180 op=LOAD Jan 14 00:08:33.762000 audit[4261]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4250 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.762000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561353139363331383137356662333431303031653163346232383933 Jan 14 00:08:33.763000 audit: BPF prog-id=181 op=LOAD Jan 14 00:08:33.763000 audit[4261]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4250 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.763000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561353139363331383137356662333431303031653163346232383933 Jan 14 00:08:33.765000 audit: BPF prog-id=181 op=UNLOAD Jan 14 00:08:33.765000 audit[4261]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4250 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.765000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561353139363331383137356662333431303031653163346232383933 Jan 14 00:08:33.765000 audit: BPF prog-id=180 op=UNLOAD Jan 14 00:08:33.765000 audit[4261]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4250 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.765000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561353139363331383137356662333431303031653163346232383933 Jan 14 00:08:33.766000 audit: BPF prog-id=182 op=LOAD Jan 14 00:08:33.766000 audit[4261]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4250 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.766000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3561353139363331383137356662333431303031653163346232383933 Jan 14 00:08:33.775832 systemd[1]: Started cri-containerd-53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9.scope - libcontainer container 53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9. 
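The "connecting to shim ... namespace=k8s.io protocol=ttrpc" lines and the systemd cri-containerd-*.scope units above show containerd running these pod sandboxes in its k8s.io namespace. A hedged sketch of how that namespace could be inspected from Go, assuming the github.com/containerd/containerd client (v1 import path; newer releases moved it under .../v2/client) and the conventional /run/containerd/containerd.sock socket, neither of which appears verbatim in this log:

```go
package main

import (
	"context"
	"fmt"
	"log"

	containerd "github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	// Assumed default containerd socket path; the log only shows per-shim sockets under /run/containerd/s/.
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		log.Fatal(err)
	}
	defer client.Close()

	// Kubernetes-managed images and containers live in the "k8s.io" namespace,
	// as seen in the "connecting to shim ... namespace=k8s.io" log lines.
	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")

	images, err := client.ListImages(ctx)
	if err != nil {
		log.Fatal(err)
	}
	for _, img := range images {
		fmt.Println(img.Name())
	}
}
```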
Jan 14 00:08:33.797000 audit: BPF prog-id=183 op=LOAD Jan 14 00:08:33.799000 audit: BPF prog-id=184 op=LOAD Jan 14 00:08:33.799000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4299 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533343735653565303335386364666539666232343235383161666166 Jan 14 00:08:33.799000 audit: BPF prog-id=184 op=UNLOAD Jan 14 00:08:33.799000 audit[4311]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4299 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533343735653565303335386364666539666232343235383161666166 Jan 14 00:08:33.799000 audit: BPF prog-id=185 op=LOAD Jan 14 00:08:33.799000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4299 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533343735653565303335386364666539666232343235383161666166 Jan 14 00:08:33.799000 audit: BPF prog-id=186 op=LOAD Jan 14 00:08:33.799000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4299 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533343735653565303335386364666539666232343235383161666166 Jan 14 00:08:33.799000 audit: BPF prog-id=186 op=UNLOAD Jan 14 00:08:33.799000 audit[4311]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4299 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533343735653565303335386364666539666232343235383161666166 Jan 14 00:08:33.799000 audit: BPF prog-id=185 op=UNLOAD Jan 14 00:08:33.799000 audit[4311]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4299 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533343735653565303335386364666539666232343235383161666166 Jan 14 00:08:33.800000 audit: BPF prog-id=187 op=LOAD Jan 14 00:08:33.800000 audit[4311]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4299 pid=4311 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.800000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3533343735653565303335386364666539666232343235383161666166 Jan 14 00:08:33.854077 containerd[1590]: time="2026-01-14T00:08:33.854019105Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-hjxff,Uid:56a52ed7-5f85-4305-9063-0c678613578e,Namespace:kube-system,Attempt:0,} returns sandbox id \"53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9\"" Jan 14 00:08:33.865266 containerd[1590]: time="2026-01-14T00:08:33.864481786Z" level=info msg="CreateContainer within sandbox \"53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 00:08:33.877592 containerd[1590]: time="2026-01-14T00:08:33.877463416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f67969d8d-7bt2c,Uid:3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3\"" Jan 14 00:08:33.880607 containerd[1590]: time="2026-01-14T00:08:33.880566109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:08:33.882047 containerd[1590]: time="2026-01-14T00:08:33.882014144Z" level=info msg="Container 9f723df0987aae60bee80b4ae19a7681f7f058b4b499acfbce9a4883ad198319: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:08:33.889359 containerd[1590]: time="2026-01-14T00:08:33.889306205Z" level=info msg="CreateContainer within sandbox \"53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9f723df0987aae60bee80b4ae19a7681f7f058b4b499acfbce9a4883ad198319\"" Jan 14 00:08:33.890781 containerd[1590]: time="2026-01-14T00:08:33.890740639Z" level=info msg="StartContainer for \"9f723df0987aae60bee80b4ae19a7681f7f058b4b499acfbce9a4883ad198319\"" Jan 14 00:08:33.898197 containerd[1590]: time="2026-01-14T00:08:33.898128670Z" level=info msg="connecting to shim 9f723df0987aae60bee80b4ae19a7681f7f058b4b499acfbce9a4883ad198319" address="unix:///run/containerd/s/4443f42f9dfe66b46af02c401f1bc56f9f188183bc346ae009c5d6141ee8edc6" protocol=ttrpc version=3 Jan 14 00:08:33.921928 systemd[1]: Started cri-containerd-9f723df0987aae60bee80b4ae19a7681f7f058b4b499acfbce9a4883ad198319.scope - libcontainer container 
9f723df0987aae60bee80b4ae19a7681f7f058b4b499acfbce9a4883ad198319. Jan 14 00:08:33.935000 audit: BPF prog-id=188 op=LOAD Jan 14 00:08:33.936000 audit: BPF prog-id=189 op=LOAD Jan 14 00:08:33.936000 audit[4344]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186180 a2=98 a3=0 items=0 ppid=4299 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966373233646630393837616165363062656538306234616531396137 Jan 14 00:08:33.936000 audit: BPF prog-id=189 op=UNLOAD Jan 14 00:08:33.936000 audit[4344]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4299 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.936000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966373233646630393837616165363062656538306234616531396137 Jan 14 00:08:33.937000 audit: BPF prog-id=190 op=LOAD Jan 14 00:08:33.937000 audit[4344]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001863e8 a2=98 a3=0 items=0 ppid=4299 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966373233646630393837616165363062656538306234616531396137 Jan 14 00:08:33.937000 audit: BPF prog-id=191 op=LOAD Jan 14 00:08:33.937000 audit[4344]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000186168 a2=98 a3=0 items=0 ppid=4299 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966373233646630393837616165363062656538306234616531396137 Jan 14 00:08:33.937000 audit: BPF prog-id=191 op=UNLOAD Jan 14 00:08:33.937000 audit[4344]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4299 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966373233646630393837616165363062656538306234616531396137 Jan 14 00:08:33.937000 audit: 
BPF prog-id=190 op=UNLOAD Jan 14 00:08:33.937000 audit[4344]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4299 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966373233646630393837616165363062656538306234616531396137 Jan 14 00:08:33.937000 audit: BPF prog-id=192 op=LOAD Jan 14 00:08:33.937000 audit[4344]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000186648 a2=98 a3=0 items=0 ppid=4299 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:33.937000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3966373233646630393837616165363062656538306234616531396137 Jan 14 00:08:33.959647 containerd[1590]: time="2026-01-14T00:08:33.959604775Z" level=info msg="StartContainer for \"9f723df0987aae60bee80b4ae19a7681f7f058b4b499acfbce9a4883ad198319\" returns successfully" Jan 14 00:08:34.218123 containerd[1590]: time="2026-01-14T00:08:34.218066852Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:34.219876 containerd[1590]: time="2026-01-14T00:08:34.219740228Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:08:34.219876 containerd[1590]: time="2026-01-14T00:08:34.219811556Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:34.220078 kubelet[2832]: E0114 00:08:34.220031 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:08:34.220469 kubelet[2832]: E0114 00:08:34.220093 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:08:34.222116 kubelet[2832]: E0114 00:08:34.222021 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cr65d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-7bt2c_calico-apiserver(3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:34.223321 kubelet[2832]: E0114 00:08:34.223246 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:08:34.234947 systemd-networkd[1476]: cali3e281ddd5ba: Gained IPv6LL Jan 14 00:08:34.333870 containerd[1590]: time="2026-01-14T00:08:34.333769769Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-hrn72,Uid:1e53bd66-4746-482e-bb2b-bfd29a1ef20e,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:34.335097 containerd[1590]: time="2026-01-14T00:08:34.335066706Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8jmff,Uid:6c288445-910a-4d1d-9b62-12f5155b11be,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:34.337048 containerd[1590]: time="2026-01-14T00:08:34.336469734Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b44cc6f4-gxl6c,Uid:5ee70bb0-55b7-4a80-b5cb-3133091615ae,Namespace:calico-system,Attempt:0,}" Jan 14 00:08:34.337360 containerd[1590]: 
time="2026-01-14T00:08:34.337331585Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f67969d8d-vdxqm,Uid:1e4bec8e-a684-46cb-852e-ae05ed7b56d7,Namespace:calico-apiserver,Attempt:0,}" Jan 14 00:08:34.539759 kubelet[2832]: E0114 00:08:34.539639 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:08:34.595000 audit[4458]: NETFILTER_CFG table=filter:121 family=2 entries=22 op=nft_register_rule pid=4458 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:34.595000 audit[4458]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffff4435d20 a2=0 a3=1 items=0 ppid=2942 pid=4458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.595000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:34.601000 audit[4458]: NETFILTER_CFG table=nat:122 family=2 entries=12 op=nft_register_rule pid=4458 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:34.601000 audit[4458]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff4435d20 a2=0 a3=1 items=0 ppid=2942 pid=4458 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.601000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:34.622602 systemd-networkd[1476]: cali1b6b8bc8d9c: Link UP Jan 14 00:08:34.624180 systemd-networkd[1476]: cali1b6b8bc8d9c: Gained carrier Jan 14 00:08:34.631000 audit[4462]: NETFILTER_CFG table=filter:123 family=2 entries=19 op=nft_register_rule pid=4462 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:34.631000 audit[4462]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffc5e885f0 a2=0 a3=1 items=0 ppid=2942 pid=4462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.631000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:34.640000 audit[4462]: NETFILTER_CFG table=nat:124 family=2 entries=33 op=nft_register_chain pid=4462 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:34.640000 audit[4462]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=13428 a0=3 a1=ffffc5e885f0 a2=0 a3=1 items=0 ppid=2942 pid=4462 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.640000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:34.651441 kubelet[2832]: I0114 00:08:34.651343 2832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-hjxff" podStartSLOduration=42.651324806 podStartE2EDuration="42.651324806s" podCreationTimestamp="2026-01-14 00:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 00:08:34.58904016 +0000 UTC m=+48.391451467" watchObservedRunningTime="2026-01-14 00:08:34.651324806 +0000 UTC m=+48.453736113" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.443 [INFO][4378] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.465 [INFO][4378] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0 goldmane-666569f655- calico-system 1e53bd66-4746-482e-bb2b-bfd29a1ef20e 857 0 2026-01-14 00:08:09 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-0-0-n-fb1a601aa4 goldmane-666569f655-hrn72 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali1b6b8bc8d9c [] [] }} ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Namespace="calico-system" Pod="goldmane-666569f655-hrn72" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.466 [INFO][4378] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Namespace="calico-system" Pod="goldmane-666569f655-hrn72" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.520 [INFO][4432] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" HandleID="k8s-pod-network.aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.520 [INFO][4432] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" HandleID="k8s-pod-network.aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d5660), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-fb1a601aa4", "pod":"goldmane-666569f655-hrn72", "timestamp":"2026-01-14 00:08:34.520039166 +0000 UTC"}, Hostname:"ci-4547-0-0-n-fb1a601aa4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.521 [INFO][4432] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.521 [INFO][4432] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.521 [INFO][4432] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-fb1a601aa4' Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.540 [INFO][4432] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.556 [INFO][4432] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.571 [INFO][4432] ipam/ipam.go 511: Trying affinity for 192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.579 [INFO][4432] ipam/ipam.go 158: Attempting to load block cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.583 [INFO][4432] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.583 [INFO][4432] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.73.128/26 handle="k8s-pod-network.aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.587 [INFO][4432] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5 Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.598 [INFO][4432] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.73.128/26 handle="k8s-pod-network.aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.613 [INFO][4432] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.73.132/26] block=192.168.73.128/26 handle="k8s-pod-network.aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.614 [INFO][4432] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.73.132/26] handle="k8s-pod-network.aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.614 [INFO][4432] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 00:08:34.658585 containerd[1590]: 2026-01-14 00:08:34.614 [INFO][4432] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.73.132/26] IPv6=[] ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" HandleID="k8s-pod-network.aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" Jan 14 00:08:34.659218 containerd[1590]: 2026-01-14 00:08:34.617 [INFO][4378] cni-plugin/k8s.go 418: Populated endpoint ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Namespace="calico-system" Pod="goldmane-666569f655-hrn72" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"1e53bd66-4746-482e-bb2b-bfd29a1ef20e", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"", Pod:"goldmane-666569f655-hrn72", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.73.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali1b6b8bc8d9c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:34.659218 containerd[1590]: 2026-01-14 00:08:34.617 [INFO][4378] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.73.132/32] ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Namespace="calico-system" Pod="goldmane-666569f655-hrn72" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" Jan 14 00:08:34.659218 containerd[1590]: 2026-01-14 00:08:34.617 [INFO][4378] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1b6b8bc8d9c ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Namespace="calico-system" Pod="goldmane-666569f655-hrn72" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" Jan 14 00:08:34.659218 containerd[1590]: 2026-01-14 00:08:34.625 [INFO][4378] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Namespace="calico-system" Pod="goldmane-666569f655-hrn72" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" Jan 14 00:08:34.659218 containerd[1590]: 2026-01-14 00:08:34.625 [INFO][4378] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" 
Namespace="calico-system" Pod="goldmane-666569f655-hrn72" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"1e53bd66-4746-482e-bb2b-bfd29a1ef20e", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 9, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5", Pod:"goldmane-666569f655-hrn72", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.73.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali1b6b8bc8d9c", MAC:"5a:ad:e5:be:ef:20", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:34.659218 containerd[1590]: 2026-01-14 00:08:34.651 [INFO][4378] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" Namespace="calico-system" Pod="goldmane-666569f655-hrn72" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-goldmane--666569f655--hrn72-eth0" Jan 14 00:08:34.699924 containerd[1590]: time="2026-01-14T00:08:34.699857283Z" level=info msg="connecting to shim aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5" address="unix:///run/containerd/s/939bcc8580ae334f0b012ccf1a69592a657ff0763feb614873e5a1e4050ad594" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:34.713149 systemd-networkd[1476]: calidd1ccfc2026: Link UP Jan 14 00:08:34.714573 systemd-networkd[1476]: calidd1ccfc2026: Gained carrier Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.432 [INFO][4380] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.451 [INFO][4380] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0 calico-kube-controllers-b44cc6f4- calico-system 5ee70bb0-55b7-4a80-b5cb-3133091615ae 859 0 2026-01-14 00:08:12 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:b44cc6f4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-n-fb1a601aa4 calico-kube-controllers-b44cc6f4-gxl6c eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] calidd1ccfc2026 [] [] }} ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" 
Namespace="calico-system" Pod="calico-kube-controllers-b44cc6f4-gxl6c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.451 [INFO][4380] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Namespace="calico-system" Pod="calico-kube-controllers-b44cc6f4-gxl6c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.545 [INFO][4425] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" HandleID="k8s-pod-network.c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.548 [INFO][4425] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" HandleID="k8s-pod-network.c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40003d6360), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-fb1a601aa4", "pod":"calico-kube-controllers-b44cc6f4-gxl6c", "timestamp":"2026-01-14 00:08:34.545562377 +0000 UTC"}, Hostname:"ci-4547-0-0-n-fb1a601aa4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.550 [INFO][4425] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.614 [INFO][4425] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.614 [INFO][4425] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-fb1a601aa4' Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.643 [INFO][4425] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.658 [INFO][4425] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.669 [INFO][4425] ipam/ipam.go 511: Trying affinity for 192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.672 [INFO][4425] ipam/ipam.go 158: Attempting to load block cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.676 [INFO][4425] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.676 [INFO][4425] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.73.128/26 handle="k8s-pod-network.c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.679 [INFO][4425] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905 Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.694 [INFO][4425] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.73.128/26 handle="k8s-pod-network.c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.706 [INFO][4425] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.73.133/26] block=192.168.73.128/26 handle="k8s-pod-network.c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.706 [INFO][4425] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.73.133/26] handle="k8s-pod-network.c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.706 [INFO][4425] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 00:08:34.748310 containerd[1590]: 2026-01-14 00:08:34.706 [INFO][4425] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.73.133/26] IPv6=[] ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" HandleID="k8s-pod-network.c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" Jan 14 00:08:34.749738 containerd[1590]: 2026-01-14 00:08:34.708 [INFO][4380] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Namespace="calico-system" Pod="calico-kube-controllers-b44cc6f4-gxl6c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0", GenerateName:"calico-kube-controllers-b44cc6f4-", Namespace:"calico-system", SelfLink:"", UID:"5ee70bb0-55b7-4a80-b5cb-3133091615ae", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"b44cc6f4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"", Pod:"calico-kube-controllers-b44cc6f4-gxl6c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.73.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidd1ccfc2026", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:34.749738 containerd[1590]: 2026-01-14 00:08:34.709 [INFO][4380] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.73.133/32] ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Namespace="calico-system" Pod="calico-kube-controllers-b44cc6f4-gxl6c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" Jan 14 00:08:34.749738 containerd[1590]: 2026-01-14 00:08:34.709 [INFO][4380] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calidd1ccfc2026 ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Namespace="calico-system" Pod="calico-kube-controllers-b44cc6f4-gxl6c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" Jan 14 00:08:34.749738 containerd[1590]: 2026-01-14 00:08:34.717 [INFO][4380] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Namespace="calico-system" Pod="calico-kube-controllers-b44cc6f4-gxl6c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" Jan 14 
00:08:34.749738 containerd[1590]: 2026-01-14 00:08:34.719 [INFO][4380] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Namespace="calico-system" Pod="calico-kube-controllers-b44cc6f4-gxl6c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0", GenerateName:"calico-kube-controllers-b44cc6f4-", Namespace:"calico-system", SelfLink:"", UID:"5ee70bb0-55b7-4a80-b5cb-3133091615ae", ResourceVersion:"859", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"b44cc6f4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905", Pod:"calico-kube-controllers-b44cc6f4-gxl6c", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.73.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"calidd1ccfc2026", MAC:"22:22:08:52:29:81", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:34.749738 containerd[1590]: 2026-01-14 00:08:34.741 [INFO][4380] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" Namespace="calico-system" Pod="calico-kube-controllers-b44cc6f4-gxl6c" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--kube--controllers--b44cc6f4--gxl6c-eth0" Jan 14 00:08:34.763842 systemd[1]: Started cri-containerd-aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5.scope - libcontainer container aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5. 
Jan 14 00:08:34.784000 audit: BPF prog-id=193 op=LOAD Jan 14 00:08:34.785000 audit: BPF prog-id=194 op=LOAD Jan 14 00:08:34.785000 audit[4490]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4478 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.785000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161623461303633623166383739663630623665383130386561383338 Jan 14 00:08:34.786000 audit: BPF prog-id=194 op=UNLOAD Jan 14 00:08:34.786000 audit[4490]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4478 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.786000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161623461303633623166383739663630623665383130386561383338 Jan 14 00:08:34.787000 audit: BPF prog-id=195 op=LOAD Jan 14 00:08:34.787000 audit[4490]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4478 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.787000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161623461303633623166383739663630623665383130386561383338 Jan 14 00:08:34.787000 audit: BPF prog-id=196 op=LOAD Jan 14 00:08:34.787000 audit[4490]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4478 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.787000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161623461303633623166383739663630623665383130386561383338 Jan 14 00:08:34.787000 audit: BPF prog-id=196 op=UNLOAD Jan 14 00:08:34.787000 audit[4490]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4478 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.787000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161623461303633623166383739663630623665383130386561383338 Jan 14 00:08:34.787000 audit: BPF prog-id=195 op=UNLOAD Jan 14 00:08:34.787000 audit[4490]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4478 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.787000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161623461303633623166383739663630623665383130386561383338 Jan 14 00:08:34.788000 audit: BPF prog-id=197 op=LOAD Jan 14 00:08:34.788000 audit[4490]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4478 pid=4490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:34.788000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6161623461303633623166383739663630623665383130386561383338 Jan 14 00:08:34.797165 containerd[1590]: time="2026-01-14T00:08:34.797039408Z" level=info msg="connecting to shim c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905" address="unix:///run/containerd/s/53ceaac9c4c20f401b26a99617081f633cfd2fb1872d4264f09551528b074f8f" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:34.833596 systemd-networkd[1476]: cali8561e96ce8d: Link UP Jan 14 00:08:34.842346 systemd-networkd[1476]: cali8561e96ce8d: Gained carrier Jan 14 00:08:34.863869 systemd[1]: Started cri-containerd-c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905.scope - libcontainer container c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905. 
Jan 14 00:08:34.875283 systemd-networkd[1476]: cali2ad8a1c2d38: Gained IPv6LL Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.454 [INFO][4399] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.485 [INFO][4399] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0 csi-node-driver- calico-system 6c288445-910a-4d1d-9b62-12f5155b11be 759 0 2026-01-14 00:08:12 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-n-fb1a601aa4 csi-node-driver-8jmff eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali8561e96ce8d [] [] }} ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Namespace="calico-system" Pod="csi-node-driver-8jmff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.485 [INFO][4399] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Namespace="calico-system" Pod="csi-node-driver-8jmff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.554 [INFO][4439] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" HandleID="k8s-pod-network.0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.555 [INFO][4439] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" HandleID="k8s-pod-network.0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cba00), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-n-fb1a601aa4", "pod":"csi-node-driver-8jmff", "timestamp":"2026-01-14 00:08:34.554010867 +0000 UTC"}, Hostname:"ci-4547-0-0-n-fb1a601aa4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.555 [INFO][4439] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.706 [INFO][4439] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.706 [INFO][4439] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-fb1a601aa4' Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.746 [INFO][4439] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.759 [INFO][4439] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.773 [INFO][4439] ipam/ipam.go 511: Trying affinity for 192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.776 [INFO][4439] ipam/ipam.go 158: Attempting to load block cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.783 [INFO][4439] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.783 [INFO][4439] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.73.128/26 handle="k8s-pod-network.0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.787 [INFO][4439] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.800 [INFO][4439] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.73.128/26 handle="k8s-pod-network.0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.814 [INFO][4439] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.73.134/26] block=192.168.73.128/26 handle="k8s-pod-network.0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.814 [INFO][4439] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.73.134/26] handle="k8s-pod-network.0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.817 [INFO][4439] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 00:08:34.884747 containerd[1590]: 2026-01-14 00:08:34.818 [INFO][4439] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.73.134/26] IPv6=[] ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" HandleID="k8s-pod-network.0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" Jan 14 00:08:34.885347 containerd[1590]: 2026-01-14 00:08:34.822 [INFO][4399] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Namespace="calico-system" Pod="csi-node-driver-8jmff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6c288445-910a-4d1d-9b62-12f5155b11be", ResourceVersion:"759", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"", Pod:"csi-node-driver-8jmff", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.73.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8561e96ce8d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:34.885347 containerd[1590]: 2026-01-14 00:08:34.822 [INFO][4399] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.73.134/32] ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Namespace="calico-system" Pod="csi-node-driver-8jmff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" Jan 14 00:08:34.885347 containerd[1590]: 2026-01-14 00:08:34.822 [INFO][4399] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8561e96ce8d ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Namespace="calico-system" Pod="csi-node-driver-8jmff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" Jan 14 00:08:34.885347 containerd[1590]: 2026-01-14 00:08:34.841 [INFO][4399] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Namespace="calico-system" Pod="csi-node-driver-8jmff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" Jan 14 00:08:34.885347 containerd[1590]: 2026-01-14 00:08:34.853 [INFO][4399] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Namespace="calico-system" Pod="csi-node-driver-8jmff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"6c288445-910a-4d1d-9b62-12f5155b11be", ResourceVersion:"759", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d", Pod:"csi-node-driver-8jmff", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.73.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali8561e96ce8d", MAC:"6e:5e:ea:d8:1d:72", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:34.885347 containerd[1590]: 2026-01-14 00:08:34.881 [INFO][4399] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" Namespace="calico-system" Pod="csi-node-driver-8jmff" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-csi--node--driver--8jmff-eth0" Jan 14 00:08:34.960570 containerd[1590]: time="2026-01-14T00:08:34.960160684Z" level=info msg="connecting to shim 0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d" address="unix:///run/containerd/s/3822613a580d838c3aad73a91b00a18b2dce5e619a5d660219209896f46abc80" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:35.003291 systemd-networkd[1476]: calib6cde816088: Gained IPv6LL Jan 14 00:08:35.012669 systemd-networkd[1476]: cali47580ce692b: Link UP Jan 14 00:08:35.013937 systemd-networkd[1476]: cali47580ce692b: Gained carrier Jan 14 00:08:35.062817 containerd[1590]: time="2026-01-14T00:08:35.062620587Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-hrn72,Uid:1e53bd66-4746-482e-bb2b-bfd29a1ef20e,Namespace:calico-system,Attempt:0,} returns sandbox id \"aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5\"" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.476 [INFO][4410] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.519 [INFO][4410] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0 calico-apiserver-6f67969d8d- calico-apiserver 1e4bec8e-a684-46cb-852e-ae05ed7b56d7 851 0 2026-01-14 
00:08:03 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6f67969d8d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-n-fb1a601aa4 calico-apiserver-6f67969d8d-vdxqm eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali47580ce692b [] [] }} ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-vdxqm" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.519 [INFO][4410] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-vdxqm" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.591 [INFO][4449] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" HandleID="k8s-pod-network.b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.592 [INFO][4449] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" HandleID="k8s-pod-network.b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d810), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-n-fb1a601aa4", "pod":"calico-apiserver-6f67969d8d-vdxqm", "timestamp":"2026-01-14 00:08:34.591536903 +0000 UTC"}, Hostname:"ci-4547-0-0-n-fb1a601aa4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.592 [INFO][4449] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.815 [INFO][4449] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.815 [INFO][4449] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-fb1a601aa4' Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.860 [INFO][4449] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.871 [INFO][4449] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.892 [INFO][4449] ipam/ipam.go 511: Trying affinity for 192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.913 [INFO][4449] ipam/ipam.go 158: Attempting to load block cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.955 [INFO][4449] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.955 [INFO][4449] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.73.128/26 handle="k8s-pod-network.b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.964 [INFO][4449] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.972 [INFO][4449] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.73.128/26 handle="k8s-pod-network.b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.981 [INFO][4449] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.73.135/26] block=192.168.73.128/26 handle="k8s-pod-network.b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.981 [INFO][4449] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.73.135/26] handle="k8s-pod-network.b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.981 [INFO][4449] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 14 00:08:35.066899 containerd[1590]: 2026-01-14 00:08:34.981 [INFO][4449] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.73.135/26] IPv6=[] ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" HandleID="k8s-pod-network.b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" Jan 14 00:08:35.067426 containerd[1590]: 2026-01-14 00:08:34.986 [INFO][4410] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-vdxqm" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0", GenerateName:"calico-apiserver-6f67969d8d-", Namespace:"calico-apiserver", SelfLink:"", UID:"1e4bec8e-a684-46cb-852e-ae05ed7b56d7", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f67969d8d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"", Pod:"calico-apiserver-6f67969d8d-vdxqm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.73.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali47580ce692b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:35.067426 containerd[1590]: 2026-01-14 00:08:34.989 [INFO][4410] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.73.135/32] ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-vdxqm" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" Jan 14 00:08:35.067426 containerd[1590]: 2026-01-14 00:08:34.989 [INFO][4410] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali47580ce692b ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-vdxqm" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" Jan 14 00:08:35.067426 containerd[1590]: 2026-01-14 00:08:35.013 [INFO][4410] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-vdxqm" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" Jan 14 00:08:35.067426 containerd[1590]: 2026-01-14 
00:08:35.022 [INFO][4410] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-vdxqm" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0", GenerateName:"calico-apiserver-6f67969d8d-", Namespace:"calico-apiserver", SelfLink:"", UID:"1e4bec8e-a684-46cb-852e-ae05ed7b56d7", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 8, 3, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6f67969d8d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf", Pod:"calico-apiserver-6f67969d8d-vdxqm", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.73.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali47580ce692b", MAC:"b6:94:5f:c4:aa:aa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:35.067426 containerd[1590]: 2026-01-14 00:08:35.055 [INFO][4410] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" Namespace="calico-apiserver" Pod="calico-apiserver-6f67969d8d-vdxqm" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-calico--apiserver--6f67969d8d--vdxqm-eth0" Jan 14 00:08:35.071772 containerd[1590]: time="2026-01-14T00:08:35.071722132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 00:08:35.074897 systemd[1]: Started cri-containerd-0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d.scope - libcontainer container 0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d. 
Jan 14 00:08:35.107000 audit: BPF prog-id=198 op=LOAD Jan 14 00:08:35.111000 audit: BPF prog-id=199 op=LOAD Jan 14 00:08:35.112886 containerd[1590]: time="2026-01-14T00:08:35.112691026Z" level=info msg="connecting to shim b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf" address="unix:///run/containerd/s/c6b4994e853e0b80815942f79839b490f4a92a95b11ae9bec9045bed8cd513e6" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:35.111000 audit[4537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4525 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.111000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335646464636333663763623531326566316165366337366633323735 Jan 14 00:08:35.112000 audit: BPF prog-id=199 op=UNLOAD Jan 14 00:08:35.112000 audit[4537]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4525 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.112000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335646464636333663763623531326566316165366337366633323735 Jan 14 00:08:35.113000 audit: BPF prog-id=200 op=LOAD Jan 14 00:08:35.113000 audit[4537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4525 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.113000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335646464636333663763623531326566316165366337366633323735 Jan 14 00:08:35.114000 audit: BPF prog-id=201 op=LOAD Jan 14 00:08:35.114000 audit[4537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4525 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.114000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335646464636333663763623531326566316165366337366633323735 Jan 14 00:08:35.115000 audit: BPF prog-id=201 op=UNLOAD Jan 14 00:08:35.115000 audit[4537]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4525 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.115000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335646464636333663763623531326566316165366337366633323735 Jan 14 00:08:35.115000 audit: BPF prog-id=200 op=UNLOAD Jan 14 00:08:35.115000 audit[4537]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4525 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335646464636333663763623531326566316165366337366633323735 Jan 14 00:08:35.116000 audit: BPF prog-id=202 op=LOAD Jan 14 00:08:35.116000 audit[4537]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4525 pid=4537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.116000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335646464636333663763623531326566316165366337366633323735 Jan 14 00:08:35.145000 audit: BPF prog-id=203 op=LOAD Jan 14 00:08:35.151000 audit: BPF prog-id=204 op=LOAD Jan 14 00:08:35.151000 audit[4579]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4567 pid=4579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064386637306266366332613361663637613732623265653734616462 Jan 14 00:08:35.151000 audit: BPF prog-id=204 op=UNLOAD Jan 14 00:08:35.151000 audit[4579]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4567 pid=4579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064386637306266366332613361663637613732623265653734616462 Jan 14 00:08:35.151000 audit: BPF prog-id=205 op=LOAD Jan 14 00:08:35.151000 audit[4579]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4567 pid=4579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.151000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064386637306266366332613361663637613732623265653734616462 Jan 14 00:08:35.151000 audit: BPF prog-id=206 op=LOAD Jan 14 00:08:35.151000 audit[4579]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4567 pid=4579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064386637306266366332613361663637613732623265653734616462 Jan 14 00:08:35.151000 audit: BPF prog-id=206 op=UNLOAD Jan 14 00:08:35.151000 audit[4579]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4567 pid=4579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064386637306266366332613361663637613732623265653734616462 Jan 14 00:08:35.152000 audit: BPF prog-id=205 op=UNLOAD Jan 14 00:08:35.152000 audit[4579]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4567 pid=4579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.152000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064386637306266366332613361663637613732623265653734616462 Jan 14 00:08:35.152000 audit: BPF prog-id=207 op=LOAD Jan 14 00:08:35.152000 audit[4579]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4567 pid=4579 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.152000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064386637306266366332613361663637613732623265653734616462 Jan 14 00:08:35.167087 systemd[1]: Started cri-containerd-b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf.scope - libcontainer container b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf. 
Jan 14 00:08:35.216245 containerd[1590]: time="2026-01-14T00:08:35.216188373Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-8jmff,Uid:6c288445-910a-4d1d-9b62-12f5155b11be,Namespace:calico-system,Attempt:0,} returns sandbox id \"0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d\"" Jan 14 00:08:35.256991 containerd[1590]: time="2026-01-14T00:08:35.256941244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-b44cc6f4-gxl6c,Uid:5ee70bb0-55b7-4a80-b5cb-3133091615ae,Namespace:calico-system,Attempt:0,} returns sandbox id \"c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905\"" Jan 14 00:08:35.281000 audit: BPF prog-id=208 op=LOAD Jan 14 00:08:35.282000 audit: BPF prog-id=209 op=LOAD Jan 14 00:08:35.282000 audit[4644]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=4625 pid=4644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.282000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353533643461376163396334633963386332646634346632336562 Jan 14 00:08:35.283000 audit: BPF prog-id=209 op=UNLOAD Jan 14 00:08:35.283000 audit[4644]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4625 pid=4644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.283000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353533643461376163396334633963386332646634346632336562 Jan 14 00:08:35.283000 audit: BPF prog-id=210 op=LOAD Jan 14 00:08:35.283000 audit[4644]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=4625 pid=4644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.283000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353533643461376163396334633963386332646634346632336562 Jan 14 00:08:35.283000 audit: BPF prog-id=211 op=LOAD Jan 14 00:08:35.283000 audit[4644]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=4625 pid=4644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.283000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353533643461376163396334633963386332646634346632336562 Jan 14 00:08:35.283000 audit: BPF prog-id=211 op=UNLOAD Jan 14 00:08:35.283000 audit[4644]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4625 pid=4644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.283000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353533643461376163396334633963386332646634346632336562 Jan 14 00:08:35.283000 audit: BPF prog-id=210 op=UNLOAD Jan 14 00:08:35.283000 audit[4644]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4625 pid=4644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.283000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353533643461376163396334633963386332646634346632336562 Jan 14 00:08:35.283000 audit: BPF prog-id=212 op=LOAD Jan 14 00:08:35.283000 audit[4644]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=4625 pid=4644 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.283000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6236353533643461376163396334633963386332646634346632336562 Jan 14 00:08:35.334078 containerd[1590]: time="2026-01-14T00:08:35.332992221Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dj86q,Uid:bf272ec4-219f-4fe3-8789-4c7d13ac9aa1,Namespace:kube-system,Attempt:0,}" Jan 14 00:08:35.406113 containerd[1590]: time="2026-01-14T00:08:35.405968278Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6f67969d8d-vdxqm,Uid:1e4bec8e-a684-46cb-852e-ae05ed7b56d7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf\"" Jan 14 00:08:35.417934 containerd[1590]: time="2026-01-14T00:08:35.417590765Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:35.421124 containerd[1590]: time="2026-01-14T00:08:35.419969132Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 00:08:35.421124 containerd[1590]: time="2026-01-14T00:08:35.420085304Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:35.421841 kubelet[2832]: E0114 00:08:35.421406 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:08:35.424655 kubelet[2832]: E0114 00:08:35.422278 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:08:35.424655 kubelet[2832]: E0114 00:08:35.424014 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p28dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-hrn72_calico-system(1e53bd66-4746-482e-bb2b-bfd29a1ef20e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:35.424844 containerd[1590]: time="2026-01-14T00:08:35.424127644Z" level=info 
msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 00:08:35.426420 kubelet[2832]: E0114 00:08:35.426388 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:08:35.561717 kubelet[2832]: E0114 00:08:35.561660 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:08:35.562025 kubelet[2832]: E0114 00:08:35.561986 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:08:35.587139 systemd-networkd[1476]: calic2670223610: Link UP Jan 14 00:08:35.587434 systemd-networkd[1476]: calic2670223610: Gained carrier Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.431 [INFO][4689] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.454 [INFO][4689] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0 coredns-674b8bbfcf- kube-system bf272ec4-219f-4fe3-8789-4c7d13ac9aa1 847 0 2026-01-14 00:07:52 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-n-fb1a601aa4 coredns-674b8bbfcf-dj86q eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calic2670223610 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dj86q" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.455 [INFO][4689] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dj86q" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.504 [INFO][4705] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" 
HandleID="k8s-pod-network.779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.505 [INFO][4705] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" HandleID="k8s-pod-network.779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb020), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-n-fb1a601aa4", "pod":"coredns-674b8bbfcf-dj86q", "timestamp":"2026-01-14 00:08:35.504356534 +0000 UTC"}, Hostname:"ci-4547-0-0-n-fb1a601aa4", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.505 [INFO][4705] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.505 [INFO][4705] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.505 [INFO][4705] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-n-fb1a601aa4' Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.518 [INFO][4705] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.526 [INFO][4705] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.533 [INFO][4705] ipam/ipam.go 511: Trying affinity for 192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.536 [INFO][4705] ipam/ipam.go 158: Attempting to load block cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.542 [INFO][4705] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.73.128/26 host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.542 [INFO][4705] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.73.128/26 handle="k8s-pod-network.779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.544 [INFO][4705] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.553 [INFO][4705] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.73.128/26 handle="k8s-pod-network.779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.568 [INFO][4705] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.73.136/26] block=192.168.73.128/26 handle="k8s-pod-network.779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 
00:08:35.568 [INFO][4705] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.73.136/26] handle="k8s-pod-network.779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" host="ci-4547-0-0-n-fb1a601aa4" Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.568 [INFO][4705] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 14 00:08:35.627507 containerd[1590]: 2026-01-14 00:08:35.568 [INFO][4705] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.73.136/26] IPv6=[] ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" HandleID="k8s-pod-network.779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Workload="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" Jan 14 00:08:35.629143 containerd[1590]: 2026-01-14 00:08:35.573 [INFO][4689] cni-plugin/k8s.go 418: Populated endpoint ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dj86q" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"bf272ec4-219f-4fe3-8789-4c7d13ac9aa1", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 7, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"", Pod:"coredns-674b8bbfcf-dj86q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.73.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic2670223610", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:35.629143 containerd[1590]: 2026-01-14 00:08:35.573 [INFO][4689] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.73.136/32] ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dj86q" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" Jan 14 00:08:35.629143 containerd[1590]: 2026-01-14 00:08:35.574 [INFO][4689] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic2670223610 ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-dj86q" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" Jan 14 00:08:35.629143 containerd[1590]: 2026-01-14 00:08:35.592 [INFO][4689] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dj86q" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" Jan 14 00:08:35.629143 containerd[1590]: 2026-01-14 00:08:35.599 [INFO][4689] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dj86q" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"bf272ec4-219f-4fe3-8789-4c7d13ac9aa1", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2026, time.January, 14, 0, 7, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-n-fb1a601aa4", ContainerID:"779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe", Pod:"coredns-674b8bbfcf-dj86q", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.73.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calic2670223610", MAC:"82:a6:25:de:38:a8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 14 00:08:35.629143 containerd[1590]: 2026-01-14 00:08:35.623 [INFO][4689] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" Namespace="kube-system" Pod="coredns-674b8bbfcf-dj86q" WorkloadEndpoint="ci--4547--0--0--n--fb1a601aa4-k8s-coredns--674b8bbfcf--dj86q-eth0" Jan 14 00:08:35.643000 audit[4718]: NETFILTER_CFG table=filter:125 family=2 entries=16 op=nft_register_rule pid=4718 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:35.645405 kernel: kauditd_printk_skb: 199 callbacks suppressed Jan 14 00:08:35.645813 kernel: audit: type=1325 audit(1768349315.643:644): table=filter:125 family=2 entries=16 op=nft_register_rule pid=4718 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 
00:08:35.645881 kernel: audit: type=1300 audit(1768349315.643:644): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffcec75550 a2=0 a3=1 items=0 ppid=2942 pid=4718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.643000 audit[4718]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffcec75550 a2=0 a3=1 items=0 ppid=2942 pid=4718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.643000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:35.652368 kernel: audit: type=1327 audit(1768349315.643:644): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:35.652000 audit[4718]: NETFILTER_CFG table=nat:126 family=2 entries=18 op=nft_register_rule pid=4718 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:35.652000 audit[4718]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5004 a0=3 a1=ffffcec75550 a2=0 a3=1 items=0 ppid=2942 pid=4718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.657102 kernel: audit: type=1325 audit(1768349315.652:645): table=nat:126 family=2 entries=18 op=nft_register_rule pid=4718 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:35.657169 kernel: audit: type=1300 audit(1768349315.652:645): arch=c00000b7 syscall=211 success=yes exit=5004 a0=3 a1=ffffcec75550 a2=0 a3=1 items=0 ppid=2942 pid=4718 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.658940 kernel: audit: type=1327 audit(1768349315.652:645): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:35.652000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:35.673811 containerd[1590]: time="2026-01-14T00:08:35.673736521Z" level=info msg="connecting to shim 779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe" address="unix:///run/containerd/s/5809793e7894f9f05b963caa46c001a56aa60a0b5839fd03477e1077991f0123" namespace=k8s.io protocol=ttrpc version=3 Jan 14 00:08:35.714888 systemd[1]: Started cri-containerd-779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe.scope - libcontainer container 779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe. 
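
For reference (not part of the journal output): the audit PROCTITLE fields in the records above and below are the audited process's argv, hex-encoded with NUL bytes separating the arguments. Below is a minimal Python sketch to turn one back into a readable command line; the helper name is illustrative only, and the hex string is copied from the iptables-restore records above.

    def decode_proctitle(hex_argv: str) -> str:
        # hypothetical helper: auditd records proctitle as hex(argv joined by NUL bytes)
        return " ".join(bytes.fromhex(hex_argv).decode("utf-8", errors="replace").split("\x00"))

    print(decode_proctitle(
        "69707461626C65732D726573746F7265002D770035002D5700313030303030"
        "002D2D6E6F666C757368002D2D636F756E74657273"
    ))
    # -> iptables-restore -w 5 -W 100000 --noflush --counters

Decoded the same way, the runc entries read "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/…" (tail truncated in the audit record), and the later bpftool entries read "bpftool map create /sys/fs/bpf/tc/globals/cali_ctlb_progs type prog_array key 4 value 4 entries 3 name cali_ctlb_progs flags 0".
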
Jan 14 00:08:35.731000 audit: BPF prog-id=213 op=LOAD Jan 14 00:08:35.733569 kernel: audit: type=1334 audit(1768349315.731:646): prog-id=213 op=LOAD Jan 14 00:08:35.732000 audit: BPF prog-id=214 op=LOAD Jan 14 00:08:35.737547 kernel: audit: type=1334 audit(1768349315.732:647): prog-id=214 op=LOAD Jan 14 00:08:35.737637 kernel: audit: type=1300 audit(1768349315.732:647): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4728 pid=4740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.732000 audit[4740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4728 pid=4740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.732000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737396537306136343533313662306230313161633933396636633232 Jan 14 00:08:35.739718 kernel: audit: type=1327 audit(1768349315.732:647): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737396537306136343533313662306230313161633933396636633232 Jan 14 00:08:35.732000 audit: BPF prog-id=214 op=UNLOAD Jan 14 00:08:35.732000 audit[4740]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4728 pid=4740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.732000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737396537306136343533313662306230313161633933396636633232 Jan 14 00:08:35.732000 audit: BPF prog-id=215 op=LOAD Jan 14 00:08:35.732000 audit[4740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4728 pid=4740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.732000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737396537306136343533313662306230313161633933396636633232 Jan 14 00:08:35.734000 audit: BPF prog-id=216 op=LOAD Jan 14 00:08:35.734000 audit[4740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4728 pid=4740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.734000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737396537306136343533313662306230313161633933396636633232 Jan 14 00:08:35.734000 audit: BPF prog-id=216 op=UNLOAD Jan 14 00:08:35.734000 audit[4740]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4728 pid=4740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.734000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737396537306136343533313662306230313161633933396636633232 Jan 14 00:08:35.734000 audit: BPF prog-id=215 op=UNLOAD Jan 14 00:08:35.734000 audit[4740]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4728 pid=4740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.734000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737396537306136343533313662306230313161633933396636633232 Jan 14 00:08:35.734000 audit: BPF prog-id=217 op=LOAD Jan 14 00:08:35.734000 audit[4740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4728 pid=4740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.734000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737396537306136343533313662306230313161633933396636633232 Jan 14 00:08:35.770282 containerd[1590]: time="2026-01-14T00:08:35.770217299Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:35.771994 systemd-networkd[1476]: cali1b6b8bc8d9c: Gained IPv6LL Jan 14 00:08:35.774513 containerd[1590]: time="2026-01-14T00:08:35.774450819Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:35.775311 containerd[1590]: time="2026-01-14T00:08:35.774563751Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 00:08:35.775378 kubelet[2832]: E0114 00:08:35.775091 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:08:35.775378 kubelet[2832]: E0114 00:08:35.775142 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:08:35.775975 containerd[1590]: time="2026-01-14T00:08:35.775750794Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 00:08:35.779716 kubelet[2832]: E0114 00:08:35.779634 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:35.783232 containerd[1590]: time="2026-01-14T00:08:35.783193647Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-dj86q,Uid:bf272ec4-219f-4fe3-8789-4c7d13ac9aa1,Namespace:kube-system,Attempt:0,} returns sandbox id \"779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe\"" Jan 14 00:08:35.789740 containerd[1590]: time="2026-01-14T00:08:35.789700042Z" level=info msg="CreateContainer within sandbox \"779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 14 00:08:35.805550 containerd[1590]: time="2026-01-14T00:08:35.805503203Z" level=info msg="Container b59d546f2c0d817fe3aa9c120261662c60ee0beed232120aecb63cea4e1b7422: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:08:35.816571 containerd[1590]: time="2026-01-14T00:08:35.816315086Z" level=info msg="CreateContainer within sandbox 
\"779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b59d546f2c0d817fe3aa9c120261662c60ee0beed232120aecb63cea4e1b7422\"" Jan 14 00:08:35.817860 containerd[1590]: time="2026-01-14T00:08:35.817822002Z" level=info msg="StartContainer for \"b59d546f2c0d817fe3aa9c120261662c60ee0beed232120aecb63cea4e1b7422\"" Jan 14 00:08:35.818854 containerd[1590]: time="2026-01-14T00:08:35.818817946Z" level=info msg="connecting to shim b59d546f2c0d817fe3aa9c120261662c60ee0beed232120aecb63cea4e1b7422" address="unix:///run/containerd/s/5809793e7894f9f05b963caa46c001a56aa60a0b5839fd03477e1077991f0123" protocol=ttrpc version=3 Jan 14 00:08:35.841247 systemd[1]: Started cri-containerd-b59d546f2c0d817fe3aa9c120261662c60ee0beed232120aecb63cea4e1b7422.scope - libcontainer container b59d546f2c0d817fe3aa9c120261662c60ee0beed232120aecb63cea4e1b7422. Jan 14 00:08:35.859000 audit: BPF prog-id=218 op=LOAD Jan 14 00:08:35.861000 audit: BPF prog-id=219 op=LOAD Jan 14 00:08:35.861000 audit[4766]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4728 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235396435343666326330643831376665336161396331323032363136 Jan 14 00:08:35.861000 audit: BPF prog-id=219 op=UNLOAD Jan 14 00:08:35.861000 audit[4766]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4728 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.861000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235396435343666326330643831376665336161396331323032363136 Jan 14 00:08:35.862000 audit: BPF prog-id=220 op=LOAD Jan 14 00:08:35.862000 audit[4766]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4728 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235396435343666326330643831376665336161396331323032363136 Jan 14 00:08:35.862000 audit: BPF prog-id=221 op=LOAD Jan 14 00:08:35.862000 audit[4766]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4728 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.862000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235396435343666326330643831376665336161396331323032363136 Jan 14 00:08:35.862000 audit: BPF prog-id=221 op=UNLOAD Jan 14 00:08:35.862000 audit[4766]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4728 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235396435343666326330643831376665336161396331323032363136 Jan 14 00:08:35.862000 audit: BPF prog-id=220 op=UNLOAD Jan 14 00:08:35.862000 audit[4766]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4728 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235396435343666326330643831376665336161396331323032363136 Jan 14 00:08:35.862000 audit: BPF prog-id=222 op=LOAD Jan 14 00:08:35.862000 audit[4766]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4728 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:35.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235396435343666326330643831376665336161396331323032363136 Jan 14 00:08:35.895623 containerd[1590]: time="2026-01-14T00:08:35.895576716Z" level=info msg="StartContainer for \"b59d546f2c0d817fe3aa9c120261662c60ee0beed232120aecb63cea4e1b7422\" returns successfully" Jan 14 00:08:36.026825 systemd-networkd[1476]: cali8561e96ce8d: Gained IPv6LL Jan 14 00:08:36.115965 containerd[1590]: time="2026-01-14T00:08:36.115792292Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:36.117124 containerd[1590]: time="2026-01-14T00:08:36.117059661Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 00:08:36.117345 containerd[1590]: time="2026-01-14T00:08:36.117096345Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:36.118355 kubelet[2832]: E0114 00:08:36.118264 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:08:36.118355 kubelet[2832]: E0114 00:08:36.118354 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:08:36.118937 containerd[1590]: time="2026-01-14T00:08:36.118901890Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:08:36.119173 kubelet[2832]: E0114 00:08:36.119117 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r74wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b44cc6f4-gxl6c_calico-system(5ee70bb0-55b7-4a80-b5cb-3133091615ae): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:36.120616 kubelet[2832]: E0114 00:08:36.120496 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:08:36.343699 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount633727986.mount: Deactivated successfully. Jan 14 00:08:36.459030 containerd[1590]: time="2026-01-14T00:08:36.458836801Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:36.460498 containerd[1590]: time="2026-01-14T00:08:36.460436085Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:08:36.460675 containerd[1590]: time="2026-01-14T00:08:36.460555617Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:36.460968 kubelet[2832]: E0114 00:08:36.460929 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:08:36.461614 kubelet[2832]: E0114 00:08:36.460995 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:08:36.461614 kubelet[2832]: E0114 00:08:36.461214 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md5mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-vdxqm_calico-apiserver(1e4bec8e-a684-46cb-852e-ae05ed7b56d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:36.462680 containerd[1590]: time="2026-01-14T00:08:36.461736138Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 00:08:36.463360 kubelet[2832]: E0114 00:08:36.463313 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:08:36.474864 systemd-networkd[1476]: calidd1ccfc2026: Gained IPv6LL Jan 14 00:08:36.539376 systemd-networkd[1476]: cali47580ce692b: Gained IPv6LL Jan 14 00:08:36.569955 kubelet[2832]: E0114 00:08:36.569919 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:08:36.570624 kubelet[2832]: E0114 00:08:36.570583 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:08:36.571284 kubelet[2832]: E0114 00:08:36.571254 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:08:36.611913 kubelet[2832]: I0114 00:08:36.611843 2832 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-dj86q" podStartSLOduration=44.611825659 podStartE2EDuration="44.611825659s" podCreationTimestamp="2026-01-14 00:07:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-14 00:08:36.59288656 +0000 UTC m=+50.395297867" watchObservedRunningTime="2026-01-14 00:08:36.611825659 +0000 UTC m=+50.414236966" Jan 14 00:08:36.634000 audit[4822]: NETFILTER_CFG table=filter:127 family=2 entries=16 op=nft_register_rule pid=4822 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:36.634000 audit[4822]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffee3d1170 a2=0 a3=1 items=0 ppid=2942 pid=4822 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:36.634000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:36.640000 audit[4822]: NETFILTER_CFG table=nat:128 family=2 entries=42 op=nft_register_rule pid=4822 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:36.640000 audit[4822]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=13428 a0=3 a1=ffffee3d1170 a2=0 a3=1 items=0 ppid=2942 pid=4822 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:36.640000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:36.798217 containerd[1590]: time="2026-01-14T00:08:36.797334005Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:36.800289 containerd[1590]: time="2026-01-14T00:08:36.800084846Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 00:08:36.800289 containerd[1590]: time="2026-01-14T00:08:36.800218220Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:36.800840 kubelet[2832]: E0114 00:08:36.800694 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:08:36.800840 kubelet[2832]: E0114 00:08:36.800786 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:08:36.801136 kubelet[2832]: E0114 00:08:36.800910 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:36.802151 kubelet[2832]: E0114 00:08:36.802102 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:08:37.307702 systemd-networkd[1476]: calic2670223610: Gained IPv6LL Jan 14 00:08:37.573853 kubelet[2832]: E0114 00:08:37.573702 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:08:37.671000 audit[4834]: NETFILTER_CFG table=filter:129 family=2 entries=16 op=nft_register_rule pid=4834 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:37.671000 audit[4834]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffc7a47880 a2=0 a3=1 items=0 ppid=2942 pid=4834 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:37.671000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:37.690000 audit[4834]: NETFILTER_CFG table=nat:130 family=2 entries=54 op=nft_register_chain pid=4834 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:37.690000 audit[4834]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19092 a0=3 a1=ffffc7a47880 a2=0 a3=1 items=0 ppid=2942 pid=4834 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:37.690000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:40.382192 kubelet[2832]: I0114 00:08:40.382094 2832 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 14 00:08:40.419000 audit[4892]: NETFILTER_CFG table=filter:131 family=2 entries=15 op=nft_register_rule pid=4892 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:40.419000 audit[4892]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc4d4bd60 a2=0 a3=1 items=0 ppid=2942 pid=4892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:40.419000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:40.425000 audit[4892]: NETFILTER_CFG table=nat:132 family=2 entries=25 op=nft_register_chain pid=4892 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:08:40.425000 audit[4892]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8580 a0=3 a1=ffffc4d4bd60 a2=0 a3=1 items=0 ppid=2942 pid=4892 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:40.425000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:08:41.107000 audit: BPF prog-id=223 op=LOAD Jan 14 00:08:41.108822 kernel: kauditd_printk_skb: 58 callbacks suppressed Jan 14 00:08:41.108867 kernel: audit: type=1334 audit(1768349321.107:668): prog-id=223 op=LOAD Jan 14 00:08:41.107000 audit[4914]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0d5c738 a2=98 a3=ffffd0d5c728 items=0 ppid=4898 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.107000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 00:08:41.114595 kernel: audit: type=1300 audit(1768349321.107:668): arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0d5c738 a2=98 a3=ffffd0d5c728 items=0 ppid=4898 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.114694 kernel: audit: type=1327 audit(1768349321.107:668): proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 00:08:41.108000 audit: BPF prog-id=223 op=UNLOAD Jan 14 00:08:41.116609 kernel: audit: type=1334 audit(1768349321.108:669): prog-id=223 op=UNLOAD Jan 14 00:08:41.116681 kernel: audit: type=1300 audit(1768349321.108:669): arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd0d5c708 a3=0 items=0 ppid=4898 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.108000 audit[4914]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd0d5c708 a3=0 items=0 ppid=4898 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.108000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 00:08:41.121577 kernel: audit: type=1327 audit(1768349321.108:669): proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 00:08:41.121661 kernel: audit: type=1334 audit(1768349321.108:670): prog-id=224 op=LOAD Jan 14 00:08:41.108000 audit: BPF prog-id=224 op=LOAD Jan 14 00:08:41.108000 audit[4914]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0d5c5e8 a2=74 a3=95 items=0 ppid=4898 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.124368 kernel: audit: type=1300 audit(1768349321.108:670): arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0d5c5e8 a2=74 a3=95 items=0 ppid=4898 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.108000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 00:08:41.127245 kernel: audit: type=1327 audit(1768349321.108:670): proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 00:08:41.108000 audit: BPF prog-id=224 op=UNLOAD Jan 14 00:08:41.108000 audit[4914]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4898 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.108000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 00:08:41.108000 audit: BPF prog-id=225 op=LOAD Jan 14 00:08:41.108000 audit[4914]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0d5c618 a2=40 a3=ffffd0d5c648 items=0 ppid=4898 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.108000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 00:08:41.108000 audit: BPF prog-id=225 op=UNLOAD Jan 14 00:08:41.108000 audit[4914]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffd0d5c648 items=0 ppid=4898 pid=4914 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.108000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 14 00:08:41.118000 audit: BPF prog-id=226 op=LOAD Jan 14 00:08:41.118000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff4e8d8a8 a2=98 a3=fffff4e8d898 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.118000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.118000 audit: BPF prog-id=226 op=UNLOAD Jan 14 00:08:41.118000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff4e8d878 a3=0 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.118000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.118000 audit: BPF prog-id=227 op=LOAD Jan 14 00:08:41.118000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff4e8d538 a2=74 a3=95 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.129584 kernel: audit: type=1334 audit(1768349321.108:671): prog-id=224 op=UNLOAD Jan 14 00:08:41.118000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.118000 audit: BPF prog-id=227 op=UNLOAD Jan 14 00:08:41.118000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.118000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.118000 audit: BPF prog-id=228 op=LOAD Jan 14 00:08:41.118000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff4e8d598 a2=94 a3=2 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.118000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.118000 audit: BPF prog-id=228 op=UNLOAD Jan 14 00:08:41.118000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.118000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.346000 audit: BPF prog-id=229 op=LOAD Jan 14 00:08:41.346000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff4e8d558 a2=40 a3=fffff4e8d588 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.346000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.347000 audit: BPF prog-id=229 op=UNLOAD Jan 14 00:08:41.347000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=fffff4e8d588 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.347000 audit: 
PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.369000 audit: BPF prog-id=230 op=LOAD Jan 14 00:08:41.369000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff4e8d568 a2=94 a3=4 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.369000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.369000 audit: BPF prog-id=230 op=UNLOAD Jan 14 00:08:41.369000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.369000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.369000 audit: BPF prog-id=231 op=LOAD Jan 14 00:08:41.369000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff4e8d3a8 a2=94 a3=5 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.369000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.369000 audit: BPF prog-id=231 op=UNLOAD Jan 14 00:08:41.369000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.369000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.369000 audit: BPF prog-id=232 op=LOAD Jan 14 00:08:41.369000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff4e8d5d8 a2=94 a3=6 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.369000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.370000 audit: BPF prog-id=232 op=UNLOAD Jan 14 00:08:41.370000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.370000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.370000 audit: BPF prog-id=233 op=LOAD Jan 14 00:08:41.370000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff4e8cda8 a2=94 a3=83 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.370000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.370000 audit: BPF prog-id=234 op=LOAD Jan 14 00:08:41.370000 audit[4917]: SYSCALL arch=c00000b7 syscall=280 
success=yes exit=7 a0=5 a1=fffff4e8cb68 a2=94 a3=2 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.370000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.371000 audit: BPF prog-id=234 op=UNLOAD Jan 14 00:08:41.371000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.371000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.371000 audit: BPF prog-id=233 op=UNLOAD Jan 14 00:08:41.371000 audit[4917]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=3853c620 a3=3852fb00 items=0 ppid=4898 pid=4917 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.371000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 14 00:08:41.383000 audit: BPF prog-id=235 op=LOAD Jan 14 00:08:41.383000 audit[4953]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdac7ae58 a2=98 a3=ffffdac7ae48 items=0 ppid=4898 pid=4953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.383000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 00:08:41.383000 audit: BPF prog-id=235 op=UNLOAD Jan 14 00:08:41.383000 audit[4953]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffdac7ae28 a3=0 items=0 ppid=4898 pid=4953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.383000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 00:08:41.383000 audit: BPF prog-id=236 op=LOAD Jan 14 00:08:41.383000 audit[4953]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdac7ad08 a2=74 a3=95 items=0 ppid=4898 pid=4953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.383000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 00:08:41.383000 audit: BPF prog-id=236 op=UNLOAD Jan 14 00:08:41.383000 audit[4953]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 
a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4898 pid=4953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.383000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 00:08:41.383000 audit: BPF prog-id=237 op=LOAD Jan 14 00:08:41.383000 audit[4953]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdac7ad38 a2=40 a3=ffffdac7ad68 items=0 ppid=4898 pid=4953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.383000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 00:08:41.383000 audit: BPF prog-id=237 op=UNLOAD Jan 14 00:08:41.383000 audit[4953]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffdac7ad68 items=0 ppid=4898 pid=4953 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.383000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 14 00:08:41.464710 systemd-networkd[1476]: vxlan.calico: Link UP Jan 14 00:08:41.464717 systemd-networkd[1476]: vxlan.calico: Gained carrier Jan 14 00:08:41.501000 audit: BPF prog-id=238 op=LOAD Jan 14 00:08:41.501000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff36f19c8 a2=98 a3=fffff36f19b8 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.501000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.501000 audit: BPF prog-id=238 op=UNLOAD Jan 14 00:08:41.501000 audit[4979]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff36f1998 a3=0 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.501000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.501000 audit: BPF prog-id=239 op=LOAD Jan 14 00:08:41.501000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff36f16a8 
a2=74 a3=95 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.501000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.501000 audit: BPF prog-id=239 op=UNLOAD Jan 14 00:08:41.501000 audit[4979]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.501000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.501000 audit: BPF prog-id=240 op=LOAD Jan 14 00:08:41.501000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff36f1708 a2=94 a3=2 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.501000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.501000 audit: BPF prog-id=240 op=UNLOAD Jan 14 00:08:41.501000 audit[4979]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.501000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.501000 audit: BPF prog-id=241 op=LOAD Jan 14 00:08:41.501000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff36f1588 a2=40 a3=fffff36f15b8 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.501000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.504000 audit: BPF prog-id=241 op=UNLOAD Jan 14 00:08:41.504000 audit[4979]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=fffff36f15b8 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.504000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.504000 audit: BPF prog-id=242 op=LOAD Jan 14 00:08:41.504000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff36f16d8 a2=94 a3=b7 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.504000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.506000 audit: BPF prog-id=242 op=UNLOAD Jan 14 00:08:41.506000 audit[4979]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.506000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.506000 audit: BPF prog-id=243 op=LOAD Jan 14 00:08:41.506000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff36f0d88 a2=94 a3=2 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.506000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.506000 audit: BPF prog-id=243 op=UNLOAD Jan 14 00:08:41.506000 audit[4979]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.506000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.506000 audit: BPF prog-id=244 op=LOAD Jan 14 00:08:41.506000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff36f0f18 a2=94 a3=30 items=0 ppid=4898 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.506000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 14 00:08:41.527000 audit: BPF prog-id=245 op=LOAD Jan 14 00:08:41.527000 audit[4986]: SYSCALL arch=c00000b7 syscall=280 
success=yes exit=3 a0=5 a1=ffffd10e2e58 a2=98 a3=ffffd10e2e48 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.527000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.527000 audit: BPF prog-id=245 op=UNLOAD Jan 14 00:08:41.527000 audit[4986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd10e2e28 a3=0 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.527000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.528000 audit: BPF prog-id=246 op=LOAD Jan 14 00:08:41.528000 audit[4986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd10e2ae8 a2=74 a3=95 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.528000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.529000 audit: BPF prog-id=246 op=UNLOAD Jan 14 00:08:41.529000 audit[4986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.529000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.529000 audit: BPF prog-id=247 op=LOAD Jan 14 00:08:41.529000 audit[4986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd10e2b48 a2=94 a3=2 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.529000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.529000 audit: BPF prog-id=247 op=UNLOAD Jan 14 00:08:41.529000 audit[4986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.529000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.648000 audit: BPF prog-id=248 op=LOAD Jan 14 00:08:41.648000 audit[4986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffd10e2b08 a2=40 a3=ffffd10e2b38 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.648000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.648000 audit: BPF prog-id=248 op=UNLOAD Jan 14 00:08:41.648000 audit[4986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffd10e2b38 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.648000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.660000 audit: BPF prog-id=249 op=LOAD Jan 14 00:08:41.660000 audit[4986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd10e2b18 a2=94 a3=4 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.660000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.660000 audit: BPF prog-id=249 op=UNLOAD Jan 14 00:08:41.660000 audit[4986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.660000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.660000 audit: BPF prog-id=250 op=LOAD Jan 14 00:08:41.660000 audit[4986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd10e2958 a2=94 a3=5 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.660000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.660000 audit: BPF prog-id=250 op=UNLOAD Jan 14 00:08:41.660000 audit[4986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.660000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.660000 audit: BPF prog-id=251 op=LOAD Jan 14 00:08:41.660000 audit[4986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd10e2b88 a2=94 a3=6 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.660000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.660000 audit: BPF prog-id=251 op=UNLOAD Jan 14 00:08:41.660000 audit[4986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.660000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.661000 audit: BPF prog-id=252 op=LOAD Jan 14 00:08:41.661000 audit[4986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffd10e2358 a2=94 a3=83 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.661000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.661000 audit: BPF prog-id=253 op=LOAD Jan 14 00:08:41.661000 audit[4986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffd10e2118 a2=94 a3=2 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.661000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.661000 audit: BPF prog-id=253 op=UNLOAD Jan 14 00:08:41.661000 audit[4986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.661000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.662000 audit: BPF prog-id=252 op=UNLOAD Jan 14 00:08:41.662000 audit[4986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1c333620 
a3=1c326b00 items=0 ppid=4898 pid=4986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.662000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 14 00:08:41.667000 audit: BPF prog-id=244 op=UNLOAD Jan 14 00:08:41.667000 audit[4898]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=40006bf640 a2=0 a3=0 items=0 ppid=4005 pid=4898 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.667000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 14 00:08:41.761000 audit[5016]: NETFILTER_CFG table=nat:133 family=2 entries=15 op=nft_register_chain pid=5016 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 00:08:41.761000 audit[5016]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffd7cd2ce0 a2=0 a3=ffff8d70bfa8 items=0 ppid=4898 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.761000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 00:08:41.765000 audit[5015]: NETFILTER_CFG table=raw:134 family=2 entries=21 op=nft_register_chain pid=5015 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 00:08:41.765000 audit[5015]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffd5b6ded0 a2=0 a3=ffffa3c99fa8 items=0 ppid=4898 pid=5015 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.765000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 00:08:41.768000 audit[5019]: NETFILTER_CFG table=mangle:135 family=2 entries=16 op=nft_register_chain pid=5019 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 00:08:41.768000 audit[5019]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffec3af5f0 a2=0 a3=ffff8fd26fa8 items=0 ppid=4898 pid=5019 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.768000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 00:08:41.781000 audit[5021]: NETFILTER_CFG table=filter:136 family=2 entries=315 op=nft_register_chain pid=5021 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 14 00:08:41.781000 audit[5021]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=187764 a0=3 a1=fffffbb3d070 a2=0 a3=ffffbce73fa8 items=0 ppid=4898 pid=5021 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:08:41.781000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 14 00:08:43.194849 systemd-networkd[1476]: vxlan.calico: Gained IPv6LL Jan 14 00:08:46.336392 containerd[1590]: time="2026-01-14T00:08:46.336215328Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:08:46.680939 containerd[1590]: time="2026-01-14T00:08:46.680733651Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:46.682061 containerd[1590]: time="2026-01-14T00:08:46.682015448Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:08:46.682344 containerd[1590]: time="2026-01-14T00:08:46.682104216Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:46.682423 kubelet[2832]: E0114 00:08:46.682306 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:08:46.682423 kubelet[2832]: E0114 00:08:46.682356 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:08:46.683456 kubelet[2832]: E0114 00:08:46.682497 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cr65d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-7bt2c_calico-apiserver(3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:46.684094 kubelet[2832]: E0114 00:08:46.684052 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:08:49.336287 containerd[1590]: time="2026-01-14T00:08:49.335737762Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 00:08:49.674456 containerd[1590]: time="2026-01-14T00:08:49.674267390Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:49.675901 containerd[1590]: time="2026-01-14T00:08:49.675841223Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 00:08:49.676023 containerd[1590]: time="2026-01-14T00:08:49.675972415Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:49.676373 kubelet[2832]: E0114 00:08:49.676301 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:08:49.677308 kubelet[2832]: E0114 00:08:49.676392 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:08:49.677506 containerd[1590]: time="2026-01-14T00:08:49.677008038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 00:08:49.677777 kubelet[2832]: E0114 00:08:49.677686 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c290d008f4c4d48b25c8570357599ee,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:50.006215 containerd[1590]: time="2026-01-14T00:08:50.005646834Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:50.007400 containerd[1590]: time="2026-01-14T00:08:50.007323907Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 00:08:50.007498 containerd[1590]: time="2026-01-14T00:08:50.007411903Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:50.007852 kubelet[2832]: E0114 00:08:50.007758 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:08:50.007852 kubelet[2832]: E0114 00:08:50.007842 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:08:50.008321 kubelet[2832]: E0114 00:08:50.008207 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p28dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-hrn72_calico-system(1e53bd66-4746-482e-bb2b-bfd29a1ef20e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:50.009319 containerd[1590]: time="2026-01-14T00:08:50.009266127Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 00:08:50.010087 kubelet[2832]: E0114 00:08:50.009951 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" 
podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:08:50.344916 containerd[1590]: time="2026-01-14T00:08:50.344868622Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:50.348090 containerd[1590]: time="2026-01-14T00:08:50.347998820Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 00:08:50.348368 containerd[1590]: time="2026-01-14T00:08:50.348266646Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:50.352939 kubelet[2832]: E0114 00:08:50.352392 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:08:50.352939 kubelet[2832]: E0114 00:08:50.352439 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:08:50.352939 kubelet[2832]: E0114 00:08:50.352726 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:50.353794 containerd[1590]: time="2026-01-14T00:08:50.353759041Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 00:08:50.355560 kubelet[2832]: E0114 00:08:50.354922 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:08:50.696692 containerd[1590]: time="2026-01-14T00:08:50.696504726Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:50.698542 containerd[1590]: time="2026-01-14T00:08:50.698451346Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 00:08:50.698744 containerd[1590]: time="2026-01-14T00:08:50.698468865Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:50.698891 kubelet[2832]: E0114 00:08:50.698829 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:08:50.700564 kubelet[2832]: E0114 00:08:50.698905 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:08:50.700564 kubelet[2832]: E0114 00:08:50.699150 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:50.702761 containerd[1590]: time="2026-01-14T00:08:50.702724444Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 00:08:51.044385 containerd[1590]: time="2026-01-14T00:08:51.044169114Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:51.045902 containerd[1590]: time="2026-01-14T00:08:51.045743478Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:51.045902 containerd[1590]: time="2026-01-14T00:08:51.045788716Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 00:08:51.046467 kubelet[2832]: E0114 00:08:51.046410 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:08:51.046802 kubelet[2832]: E0114 00:08:51.046683 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:08:51.047262 kubelet[2832]: E0114 00:08:51.047151 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:51.048609 kubelet[2832]: E0114 00:08:51.048509 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:08:51.336579 containerd[1590]: time="2026-01-14T00:08:51.335737234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:08:51.677174 containerd[1590]: time="2026-01-14T00:08:51.676794611Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 
00:08:51.678906 containerd[1590]: time="2026-01-14T00:08:51.678480050Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:08:51.678906 containerd[1590]: time="2026-01-14T00:08:51.678629203Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:51.679077 kubelet[2832]: E0114 00:08:51.678877 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:08:51.679077 kubelet[2832]: E0114 00:08:51.678942 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:08:51.679271 kubelet[2832]: E0114 00:08:51.679152 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md5mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-vdxqm_calico-apiserver(1e4bec8e-a684-46cb-852e-ae05ed7b56d7): ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:51.680824 kubelet[2832]: E0114 00:08:51.680748 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:08:52.335595 containerd[1590]: time="2026-01-14T00:08:52.335373681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 00:08:52.673662 containerd[1590]: time="2026-01-14T00:08:52.673431117Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:08:52.675498 containerd[1590]: time="2026-01-14T00:08:52.675354552Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 00:08:52.675715 containerd[1590]: time="2026-01-14T00:08:52.675469026Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 00:08:52.675784 kubelet[2832]: E0114 00:08:52.675703 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:08:52.675784 kubelet[2832]: E0114 00:08:52.675768 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:08:52.676240 kubelet[2832]: E0114 00:08:52.675955 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r74wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b44cc6f4-gxl6c_calico-system(5ee70bb0-55b7-4a80-b5cb-3133091615ae): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 00:08:52.678098 kubelet[2832]: E0114 00:08:52.678026 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:08:59.335345 kubelet[2832]: E0114 00:08:59.335039 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:09:03.334135 kubelet[2832]: E0114 00:09:03.334077 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:09:03.335007 kubelet[2832]: E0114 00:09:03.334959 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:09:04.337783 kubelet[2832]: E0114 00:09:04.337664 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:09:05.336403 kubelet[2832]: E0114 00:09:05.336319 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:09:06.338265 kubelet[2832]: E0114 00:09:06.338120 2832 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:09:14.336967 containerd[1590]: time="2026-01-14T00:09:14.335289276Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 00:09:14.689364 containerd[1590]: time="2026-01-14T00:09:14.688991938Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:09:14.692459 containerd[1590]: time="2026-01-14T00:09:14.692387853Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 00:09:14.692768 containerd[1590]: time="2026-01-14T00:09:14.692488534Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 00:09:14.692849 kubelet[2832]: E0114 00:09:14.692635 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:09:14.692849 kubelet[2832]: E0114 00:09:14.692677 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:09:14.693392 kubelet[2832]: E0114 00:09:14.693020 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p28dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-hrn72_calico-system(1e53bd66-4746-482e-bb2b-bfd29a1ef20e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 00:09:14.695262 kubelet[2832]: E0114 00:09:14.695031 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:09:14.695330 containerd[1590]: time="2026-01-14T00:09:14.694717078Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 
14 00:09:15.011950 containerd[1590]: time="2026-01-14T00:09:15.011677693Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:09:15.013711 containerd[1590]: time="2026-01-14T00:09:15.013446235Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:09:15.013711 containerd[1590]: time="2026-01-14T00:09:15.013515556Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:09:15.013854 kubelet[2832]: E0114 00:09:15.013792 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:09:15.013854 kubelet[2832]: E0114 00:09:15.013836 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:09:15.015123 kubelet[2832]: E0114 00:09:15.013972 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cr65d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start 
failed in pod calico-apiserver-6f67969d8d-7bt2c_calico-apiserver(3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:09:15.015292 kubelet[2832]: E0114 00:09:15.015129 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:09:18.339536 containerd[1590]: time="2026-01-14T00:09:18.339458027Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 00:09:18.674386 containerd[1590]: time="2026-01-14T00:09:18.674214346Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:09:18.675992 containerd[1590]: time="2026-01-14T00:09:18.675927175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 00:09:18.676245 containerd[1590]: time="2026-01-14T00:09:18.675931655Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 00:09:18.676621 kubelet[2832]: E0114 00:09:18.676413 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:09:18.676621 kubelet[2832]: E0114 00:09:18.676463 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:09:18.677018 containerd[1590]: time="2026-01-14T00:09:18.676768109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 00:09:18.677575 kubelet[2832]: E0114 00:09:18.677080 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r74wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b44cc6f4-gxl6c_calico-system(5ee70bb0-55b7-4a80-b5cb-3133091615ae): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 00:09:18.680095 kubelet[2832]: E0114 00:09:18.678917 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:09:19.010129 containerd[1590]: time="2026-01-14T00:09:19.009987016Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:09:19.011670 containerd[1590]: time="2026-01-14T00:09:19.011544645Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 00:09:19.011670 containerd[1590]: time="2026-01-14T00:09:19.011552725Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 00:09:19.012854 kubelet[2832]: E0114 00:09:19.012597 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:09:19.012854 kubelet[2832]: E0114 00:09:19.012649 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:09:19.012854 kubelet[2832]: E0114 00:09:19.012775 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c290d008f4c4d48b25c8570357599ee,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 00:09:19.017153 containerd[1590]: time="2026-01-14T00:09:19.015697682Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 00:09:19.352033 containerd[1590]: time="2026-01-14T00:09:19.351961850Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:09:19.353612 containerd[1590]: time="2026-01-14T00:09:19.353548600Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 00:09:19.353758 containerd[1590]: time="2026-01-14T00:09:19.353643201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 00:09:19.353870 kubelet[2832]: E0114 00:09:19.353785 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:09:19.353870 kubelet[2832]: E0114 00:09:19.353849 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:09:19.355572 kubelet[2832]: E0114 00:09:19.354016 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 00:09:19.356141 containerd[1590]: time="2026-01-14T00:09:19.355506836Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" 
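The 404 responses recorded above all point at tags under ghcr.io/flatcar/calico that the registry does not serve. A minimal Go sketch of how one might confirm this from another machine, assuming ghcr.io's standard anonymous registry-v2 token endpoint; the repository and tag names are taken from the log entries, everything else (file name, helper structure) is illustrative:

// probe_tag.go - ask ghcr.io whether a given repository tag resolves.
// Hypothetical diagnostic sketch; the anonymous token flow is assumed to
// follow the usual Docker registry v2 scheme that ghcr.io implements.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	repo := "flatcar/calico/goldmane" // repository name from the log above
	tag := "v3.30.4"                  // tag from the log above

	// Request an anonymous pull token for the repository.
	tokURL := fmt.Sprintf("https://ghcr.io/token?scope=repository:%s:pull", repo)
	resp, err := http.Get(tokURL)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	var tok struct {
		Token string `json:"token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
		panic(err)
	}

	// HEAD the manifest for the tag; a 404 here corresponds to the
	// "failed to resolve image ... not found" errors containerd reports.
	req, err := http.NewRequest(http.MethodHead,
		fmt.Sprintf("https://ghcr.io/v2/%s/manifests/%s", repo, tag), nil)
	if err != nil {
		panic(err)
	}
	req.Header.Set("Authorization", "Bearer "+tok.Token)
	req.Header.Add("Accept", "application/vnd.oci.image.index.v1+json")
	req.Header.Add("Accept", "application/vnd.docker.distribution.manifest.list.v2+json")
	req.Header.Add("Accept", "application/vnd.oci.image.manifest.v1+json")
	req.Header.Add("Accept", "application/vnd.docker.distribution.manifest.v2+json")

	res, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	defer res.Body.Close()
	fmt.Println(repo+":"+tag, "->", res.Status)
}

Running the sketch prints the HTTP status for the manifest request: a 404 matches the containerd output above and would indicate the tag simply is not published, while a 200 would suggest the tag exists and the pull failures originate elsewhere (for example a mirror or proxy in front of the node).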
Jan 14 00:09:19.356214 kubelet[2832]: E0114 00:09:19.355757 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:09:19.888744 containerd[1590]: time="2026-01-14T00:09:19.888686880Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:09:19.890186 containerd[1590]: time="2026-01-14T00:09:19.890135827Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:09:19.890318 containerd[1590]: time="2026-01-14T00:09:19.890222909Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:09:19.891548 kubelet[2832]: E0114 00:09:19.891479 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:09:19.893716 kubelet[2832]: E0114 00:09:19.891559 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:09:19.893716 kubelet[2832]: E0114 00:09:19.891754 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md5mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-vdxqm_calico-apiserver(1e4bec8e-a684-46cb-852e-ae05ed7b56d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:09:19.893716 kubelet[2832]: E0114 00:09:19.893262 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:09:20.338488 containerd[1590]: time="2026-01-14T00:09:20.338435155Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 00:09:20.689817 containerd[1590]: time="2026-01-14T00:09:20.689276463Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:09:20.692316 containerd[1590]: time="2026-01-14T00:09:20.692171641Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 00:09:20.693038 containerd[1590]: time="2026-01-14T00:09:20.692602129Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 00:09:20.693226 kubelet[2832]: E0114 00:09:20.692852 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:09:20.693226 kubelet[2832]: E0114 00:09:20.692918 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:09:20.693912 kubelet[2832]: E0114 00:09:20.693679 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 00:09:20.696353 containerd[1590]: time="2026-01-14T00:09:20.696316003Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 00:09:21.033230 containerd[1590]: time="2026-01-14T00:09:21.033069516Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:09:21.034689 containerd[1590]: time="2026-01-14T00:09:21.034552668Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 00:09:21.035074 containerd[1590]: time="2026-01-14T00:09:21.034604909Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 00:09:21.035369 kubelet[2832]: E0114 00:09:21.035089 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:09:21.035369 kubelet[2832]: E0114 00:09:21.035141 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:09:21.035369 kubelet[2832]: E0114 00:09:21.035263 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 00:09:21.037413 kubelet[2832]: E0114 00:09:21.036566 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:09:26.335229 kubelet[2832]: E0114 00:09:26.334778 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc 
= failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:09:30.334738 kubelet[2832]: E0114 00:09:30.334115 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:09:32.336927 kubelet[2832]: E0114 00:09:32.336879 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:09:33.334348 kubelet[2832]: E0114 00:09:33.334280 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:09:33.336568 kubelet[2832]: E0114 00:09:33.336484 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:09:34.335551 kubelet[2832]: E0114 00:09:34.335469 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:09:39.335391 kubelet[2832]: E0114 00:09:39.335137 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:09:42.337411 kubelet[2832]: E0114 00:09:42.337301 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:09:47.335551 kubelet[2832]: E0114 00:09:47.335141 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:09:47.335551 kubelet[2832]: E0114 00:09:47.335479 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:09:48.340089 kubelet[2832]: E0114 00:09:48.340038 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:09:49.335875 kubelet[2832]: E0114 00:09:49.335763 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:09:52.335978 kubelet[2832]: E0114 00:09:52.335510 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:09:53.333634 kubelet[2832]: E0114 00:09:53.333566 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:10:00.336858 containerd[1590]: time="2026-01-14T00:10:00.336581796Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 00:10:00.671367 containerd[1590]: time="2026-01-14T00:10:00.670664911Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:10:00.673339 containerd[1590]: time="2026-01-14T00:10:00.673178283Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 00:10:00.673339 containerd[1590]: time="2026-01-14T00:10:00.673226445Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 00:10:00.673776 kubelet[2832]: E0114 00:10:00.673725 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:10:00.674240 kubelet[2832]: E0114 
00:10:00.673792 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:10:00.674240 kubelet[2832]: E0114 00:10:00.674165 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r74wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b44cc6f4-gxl6c_calico-system(5ee70bb0-55b7-4a80-b5cb-3133091615ae): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 00:10:00.674830 containerd[1590]: time="2026-01-14T00:10:00.674789247Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 00:10:00.675495 kubelet[2832]: E0114 00:10:00.675357 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:10:01.015965 containerd[1590]: time="2026-01-14T00:10:01.015712286Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:10:01.017422 containerd[1590]: time="2026-01-14T00:10:01.017310170Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 00:10:01.017422 containerd[1590]: time="2026-01-14T00:10:01.017378734Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 00:10:01.019385 kubelet[2832]: E0114 00:10:01.018696 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:10:01.019385 kubelet[2832]: E0114 00:10:01.018745 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:10:01.019385 kubelet[2832]: E0114 00:10:01.018860 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c290d008f4c4d48b25c8570357599ee,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 00:10:01.021416 containerd[1590]: time="2026-01-14T00:10:01.021391865Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 00:10:01.538782 containerd[1590]: time="2026-01-14T00:10:01.538606605Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:10:01.540236 containerd[1590]: time="2026-01-14T00:10:01.540176528Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 00:10:01.540567 containerd[1590]: time="2026-01-14T00:10:01.540281173Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 00:10:01.540982 kubelet[2832]: E0114 00:10:01.540847 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:10:01.541294 kubelet[2832]: E0114 00:10:01.541095 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:10:01.542828 kubelet[2832]: E0114 00:10:01.542762 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 00:10:01.544074 kubelet[2832]: E0114 00:10:01.544008 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:10:02.335947 containerd[1590]: time="2026-01-14T00:10:02.335721425Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:10:02.789972 containerd[1590]: time="2026-01-14T00:10:02.788696752Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:10:02.791679 containerd[1590]: time="2026-01-14T00:10:02.791489100Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:10:02.792061 containerd[1590]: time="2026-01-14T00:10:02.791894281Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc 
error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:10:02.792722 kubelet[2832]: E0114 00:10:02.792681 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:10:02.793127 kubelet[2832]: E0114 00:10:02.792766 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:10:02.793127 kubelet[2832]: E0114 00:10:02.792900 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md5mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-vdxqm_calico-apiserver(1e4bec8e-a684-46cb-852e-ae05ed7b56d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:10:02.794405 kubelet[2832]: E0114 00:10:02.794347 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" 
with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:10:03.335558 containerd[1590]: time="2026-01-14T00:10:03.335451942Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 00:10:03.657249 containerd[1590]: time="2026-01-14T00:10:03.655403206Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:10:03.659158 containerd[1590]: time="2026-01-14T00:10:03.658819148Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 00:10:03.659368 containerd[1590]: time="2026-01-14T00:10:03.659143126Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 00:10:03.660534 kubelet[2832]: E0114 00:10:03.659676 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:10:03.660534 kubelet[2832]: E0114 00:10:03.659728 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:10:03.660983 containerd[1590]: time="2026-01-14T00:10:03.660754452Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 00:10:03.661487 kubelet[2832]: E0114 00:10:03.661396 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 00:10:04.056584 containerd[1590]: time="2026-01-14T00:10:04.055552696Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:10:04.057686 containerd[1590]: time="2026-01-14T00:10:04.057617088Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 00:10:04.057851 containerd[1590]: time="2026-01-14T00:10:04.057735414Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 00:10:04.059845 kubelet[2832]: E0114 00:10:04.059571 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:10:04.059845 kubelet[2832]: E0114 00:10:04.059631 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:10:04.060178 kubelet[2832]: E0114 00:10:04.059948 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p28dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-hrn72_calico-system(1e53bd66-4746-482e-bb2b-bfd29a1ef20e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 00:10:04.061534 kubelet[2832]: E0114 00:10:04.061386 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:10:04.061599 containerd[1590]: time="2026-01-14T00:10:04.061539539Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 00:10:04.393553 containerd[1590]: time="2026-01-14T00:10:04.392951894Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:10:04.395390 containerd[1590]: time="2026-01-14T00:10:04.395280380Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 00:10:04.395493 containerd[1590]: time="2026-01-14T00:10:04.395359304Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 00:10:04.395853 kubelet[2832]: E0114 00:10:04.395746 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:10:04.395853 kubelet[2832]: E0114 00:10:04.395798 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:10:04.396175 kubelet[2832]: E0114 00:10:04.396110 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 00:10:04.397611 kubelet[2832]: E0114 00:10:04.397562 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:10:07.335954 containerd[1590]: time="2026-01-14T00:10:07.335458305Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:10:07.668303 containerd[1590]: time="2026-01-14T00:10:07.668136586Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:10:07.669759 containerd[1590]: time="2026-01-14T00:10:07.669678911Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:10:07.669891 containerd[1590]: time="2026-01-14T00:10:07.669818078Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:10:07.670228 kubelet[2832]: E0114 00:10:07.670168 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:10:07.670685 kubelet[2832]: E0114 00:10:07.670263 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:10:07.670685 kubelet[2832]: E0114 00:10:07.670561 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cr65d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-7bt2c_calico-apiserver(3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:10:07.670685 kubelet[2832]: E0114 00:10:07.671709 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:10:13.333742 kubelet[2832]: E0114 00:10:13.333659 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:10:14.338631 kubelet[2832]: E0114 00:10:14.338575 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:10:15.336033 kubelet[2832]: E0114 00:10:15.335946 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:10:16.153000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-46.224.77.139:22-4.153.228.146:51454 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:16.154649 systemd[1]: Started sshd@7-46.224.77.139:22-4.153.228.146:51454.service - OpenSSH per-connection server daemon (4.153.228.146:51454). Jan 14 00:10:16.157688 kernel: kauditd_printk_skb: 188 callbacks suppressed Jan 14 00:10:16.157866 kernel: audit: type=1130 audit(1768349416.153:734): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-46.224.77.139:22-4.153.228.146:51454 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:10:16.341381 kubelet[2832]: E0114 00:10:16.341089 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:10:16.344177 kubelet[2832]: E0114 00:10:16.342994 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:10:16.737000 audit[5191]: USER_ACCT pid=5191 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:16.740502 sshd[5191]: Accepted publickey for core from 4.153.228.146 port 51454 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:10:16.740000 audit[5191]: CRED_ACQ pid=5191 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:16.744250 kernel: audit: type=1101 audit(1768349416.737:735): pid=5191 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:16.744337 kernel: audit: type=1103 audit(1768349416.740:736): pid=5191 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:16.743429 sshd-session[5191]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:10:16.745861 kernel: audit: type=1006 audit(1768349416.741:737): pid=5191 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Jan 14 00:10:16.741000 audit[5191]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd0ad3e80 a2=3 a3=0 items=0 ppid=1 pid=5191 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:16.748151 kernel: audit: type=1300 audit(1768349416.741:737): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd0ad3e80 a2=3 a3=0 items=0 ppid=1 pid=5191 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:16.741000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:16.750830 kernel: audit: type=1327 audit(1768349416.741:737): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:16.756878 systemd-logind[1545]: New session 9 of user core. Jan 14 00:10:16.760770 systemd[1]: Started session-9.scope - Session 9 of User core. Jan 14 00:10:16.764000 audit[5191]: USER_START pid=5191 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:16.766000 audit[5195]: CRED_ACQ pid=5195 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:16.770653 kernel: audit: type=1105 audit(1768349416.764:738): pid=5191 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:16.770723 kernel: audit: type=1103 audit(1768349416.766:739): pid=5195 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:17.182550 sshd[5195]: Connection closed by 4.153.228.146 port 51454 Jan 14 00:10:17.183355 sshd-session[5191]: pam_unix(sshd:session): session closed for user core Jan 14 00:10:17.183000 audit[5191]: USER_END pid=5191 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:17.190300 systemd[1]: sshd@7-46.224.77.139:22-4.153.228.146:51454.service: Deactivated successfully. 
Jan 14 00:10:17.184000 audit[5191]: CRED_DISP pid=5191 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:17.192559 kernel: audit: type=1106 audit(1768349417.183:740): pid=5191 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:17.192644 kernel: audit: type=1104 audit(1768349417.184:741): pid=5191 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:17.193458 systemd[1]: session-9.scope: Deactivated successfully. Jan 14 00:10:17.189000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-46.224.77.139:22-4.153.228.146:51454 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:17.197408 systemd-logind[1545]: Session 9 logged out. Waiting for processes to exit. Jan 14 00:10:17.199834 systemd-logind[1545]: Removed session 9. Jan 14 00:10:22.289000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-46.224.77.139:22-4.153.228.146:51462 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:22.289846 systemd[1]: Started sshd@8-46.224.77.139:22-4.153.228.146:51462.service - OpenSSH per-connection server daemon (4.153.228.146:51462). Jan 14 00:10:22.292633 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:10:22.293015 kernel: audit: type=1130 audit(1768349422.289:743): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-46.224.77.139:22-4.153.228.146:51462 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:10:22.334984 kubelet[2832]: E0114 00:10:22.334605 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:10:22.825000 audit[5208]: USER_ACCT pid=5208 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:22.826686 sshd[5208]: Accepted publickey for core from 4.153.228.146 port 51462 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:10:22.828921 sshd-session[5208]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:10:22.827000 audit[5208]: CRED_ACQ pid=5208 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:22.832131 kernel: audit: type=1101 audit(1768349422.825:744): pid=5208 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:22.832198 kernel: audit: type=1103 audit(1768349422.827:745): pid=5208 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:22.834809 kernel: audit: type=1006 audit(1768349422.827:746): pid=5208 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 14 00:10:22.835049 kernel: audit: type=1300 audit(1768349422.827:746): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff189afa0 a2=3 a3=0 items=0 ppid=1 pid=5208 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:22.827000 audit[5208]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff189afa0 a2=3 a3=0 items=0 ppid=1 pid=5208 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:22.827000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:22.839540 kernel: audit: type=1327 audit(1768349422.827:746): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:22.841183 systemd-logind[1545]: New session 10 of user core. Jan 14 00:10:22.854427 systemd[1]: Started session-10.scope - Session 10 of User core. 
Jan 14 00:10:22.859000 audit[5208]: USER_START pid=5208 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:22.863000 audit[5212]: CRED_ACQ pid=5212 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:22.867055 kernel: audit: type=1105 audit(1768349422.859:747): pid=5208 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:22.867224 kernel: audit: type=1103 audit(1768349422.863:748): pid=5212 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:23.204764 sshd[5212]: Connection closed by 4.153.228.146 port 51462 Jan 14 00:10:23.205645 sshd-session[5208]: pam_unix(sshd:session): session closed for user core Jan 14 00:10:23.209000 audit[5208]: USER_END pid=5208 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:23.214080 systemd[1]: sshd@8-46.224.77.139:22-4.153.228.146:51462.service: Deactivated successfully. Jan 14 00:10:23.216684 kernel: audit: type=1106 audit(1768349423.209:749): pid=5208 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:23.216777 kernel: audit: type=1104 audit(1768349423.209:750): pid=5208 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:23.209000 audit[5208]: CRED_DISP pid=5208 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:23.216000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-46.224.77.139:22-4.153.228.146:51462 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:23.219737 systemd[1]: session-10.scope: Deactivated successfully. Jan 14 00:10:23.220985 systemd-logind[1545]: Session 10 logged out. Waiting for processes to exit. Jan 14 00:10:23.222471 systemd-logind[1545]: Removed session 10. 
Jan 14 00:10:24.693861 systemd[1]: Started sshd@9-46.224.77.139:22-199.45.155.73:37968.service - OpenSSH per-connection server daemon (199.45.155.73:37968). Jan 14 00:10:24.693000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-46.224.77.139:22-199.45.155.73:37968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:26.336175 kubelet[2832]: E0114 00:10:26.335499 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:10:27.335152 kubelet[2832]: E0114 00:10:27.335073 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:10:27.335152 kubelet[2832]: E0114 00:10:27.335291 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:10:27.338338 kubelet[2832]: E0114 00:10:27.338276 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:10:27.339096 kubelet[2832]: E0114 00:10:27.339049 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:10:28.328769 kernel: kauditd_printk_skb: 2 callbacks suppressed Jan 14 00:10:28.328871 kernel: audit: type=1130 audit(1768349428.324:753): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-46.224.77.139:22-4.153.228.146:42270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:28.324000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-46.224.77.139:22-4.153.228.146:42270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:28.325821 systemd[1]: Started sshd@10-46.224.77.139:22-4.153.228.146:42270.service - OpenSSH per-connection server daemon (4.153.228.146:42270). Jan 14 00:10:28.884000 audit[5232]: USER_ACCT pid=5232 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:28.886586 sshd[5232]: Accepted publickey for core from 4.153.228.146 port 42270 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:10:28.887000 audit[5232]: CRED_ACQ pid=5232 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:28.889847 sshd-session[5232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:10:28.891318 kernel: audit: type=1101 audit(1768349428.884:754): pid=5232 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:28.891381 kernel: audit: type=1103 audit(1768349428.887:755): pid=5232 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:28.894487 kernel: audit: type=1006 audit(1768349428.887:756): pid=5232 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 14 00:10:28.894562 kernel: audit: type=1300 audit(1768349428.887:756): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc44ea9e0 a2=3 a3=0 items=0 ppid=1 pid=5232 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:28.887000 audit[5232]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc44ea9e0 a2=3 a3=0 
items=0 ppid=1 pid=5232 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:28.887000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:28.899291 kernel: audit: type=1327 audit(1768349428.887:756): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:28.902574 systemd-logind[1545]: New session 11 of user core. Jan 14 00:10:28.911774 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 14 00:10:28.914000 audit[5232]: USER_START pid=5232 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:28.919000 audit[5236]: CRED_ACQ pid=5236 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:28.926754 kernel: audit: type=1105 audit(1768349428.914:757): pid=5232 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:28.926868 kernel: audit: type=1103 audit(1768349428.919:758): pid=5236 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:29.287628 sshd[5236]: Connection closed by 4.153.228.146 port 42270 Jan 14 00:10:29.288934 sshd-session[5232]: pam_unix(sshd:session): session closed for user core Jan 14 00:10:29.289000 audit[5232]: USER_END pid=5232 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:29.289000 audit[5232]: CRED_DISP pid=5232 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:29.300454 kernel: audit: type=1106 audit(1768349429.289:759): pid=5232 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:29.300546 kernel: audit: type=1104 audit(1768349429.289:760): pid=5232 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:29.299025 
systemd[1]: sshd@10-46.224.77.139:22-4.153.228.146:42270.service: Deactivated successfully. Jan 14 00:10:29.297000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-46.224.77.139:22-4.153.228.146:42270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:29.302188 systemd[1]: session-11.scope: Deactivated successfully. Jan 14 00:10:29.304940 systemd-logind[1545]: Session 11 logged out. Waiting for processes to exit. Jan 14 00:10:29.307893 systemd-logind[1545]: Removed session 11. Jan 14 00:10:33.334756 kubelet[2832]: E0114 00:10:33.334691 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:10:34.393132 systemd[1]: Started sshd@11-46.224.77.139:22-4.153.228.146:42272.service - OpenSSH per-connection server daemon (4.153.228.146:42272). Jan 14 00:10:34.396048 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:10:34.396091 kernel: audit: type=1130 audit(1768349434.391:762): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-46.224.77.139:22-4.153.228.146:42272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:34.391000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-46.224.77.139:22-4.153.228.146:42272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:10:34.926000 audit[5274]: USER_ACCT pid=5274 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:34.928083 sshd[5274]: Accepted publickey for core from 4.153.228.146 port 42272 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:10:34.931596 kernel: audit: type=1101 audit(1768349434.926:763): pid=5274 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:34.930000 audit[5274]: CRED_ACQ pid=5274 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:34.933673 sshd-session[5274]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:10:34.936929 kernel: audit: type=1103 audit(1768349434.930:764): pid=5274 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:34.936996 kernel: audit: type=1006 audit(1768349434.930:765): pid=5274 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Jan 14 00:10:34.930000 audit[5274]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe5b51820 a2=3 a3=0 items=0 ppid=1 pid=5274 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:34.939927 kernel: audit: type=1300 audit(1768349434.930:765): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe5b51820 a2=3 a3=0 items=0 ppid=1 pid=5274 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:34.930000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:34.943547 kernel: audit: type=1327 audit(1768349434.930:765): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:34.945751 systemd-logind[1545]: New session 12 of user core. Jan 14 00:10:34.953987 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 14 00:10:34.955000 audit[5274]: USER_START pid=5274 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:34.959000 audit[5278]: CRED_ACQ pid=5278 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:34.963937 kernel: audit: type=1105 audit(1768349434.955:766): pid=5274 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:34.963995 kernel: audit: type=1103 audit(1768349434.959:767): pid=5278 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:35.323978 sshd[5278]: Connection closed by 4.153.228.146 port 42272 Jan 14 00:10:35.325723 sshd-session[5274]: pam_unix(sshd:session): session closed for user core Jan 14 00:10:35.326000 audit[5274]: USER_END pid=5274 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:35.335543 kernel: audit: type=1106 audit(1768349435.326:768): pid=5274 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:35.335641 kernel: audit: type=1104 audit(1768349435.328:769): pid=5274 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:35.328000 audit[5274]: CRED_DISP pid=5274 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:35.333916 systemd[1]: sshd@11-46.224.77.139:22-4.153.228.146:42272.service: Deactivated successfully. Jan 14 00:10:35.332000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-46.224.77.139:22-4.153.228.146:42272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:35.337001 systemd[1]: session-12.scope: Deactivated successfully. Jan 14 00:10:35.341497 systemd-logind[1545]: Session 12 logged out. Waiting for processes to exit. Jan 14 00:10:35.343616 systemd-logind[1545]: Removed session 12. 
Jan 14 00:10:38.337959 kubelet[2832]: E0114 00:10:38.337593 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:10:38.339764 kubelet[2832]: E0114 00:10:38.339656 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:10:39.334251 kubelet[2832]: E0114 00:10:39.334143 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:10:39.810567 sshd[5228]: Connection closed by 199.45.155.73 port 37968 [preauth] Jan 14 00:10:39.813330 systemd[1]: sshd@9-46.224.77.139:22-199.45.155.73:37968.service: Deactivated successfully. Jan 14 00:10:39.814000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-46.224.77.139:22-199.45.155.73:37968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:39.817474 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:10:39.817564 kernel: audit: type=1131 audit(1768349439.814:771): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-46.224.77.139:22-199.45.155.73:37968 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:10:40.337196 kubelet[2832]: E0114 00:10:40.337112 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:10:40.434000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-46.224.77.139:22-4.153.228.146:36434 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:40.435381 systemd[1]: Started sshd@12-46.224.77.139:22-4.153.228.146:36434.service - OpenSSH per-connection server daemon (4.153.228.146:36434). Jan 14 00:10:40.441566 kernel: audit: type=1130 audit(1768349440.434:772): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-46.224.77.139:22-4.153.228.146:36434 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:40.992000 audit[5293]: USER_ACCT pid=5293 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:40.993158 sshd[5293]: Accepted publickey for core from 4.153.228.146 port 36434 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:10:40.995000 audit[5293]: CRED_ACQ pid=5293 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:40.998451 kernel: audit: type=1101 audit(1768349440.992:773): pid=5293 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:40.998530 kernel: audit: type=1103 audit(1768349440.995:774): pid=5293 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:40.997310 sshd-session[5293]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:10:41.001426 kernel: audit: type=1006 audit(1768349440.995:775): pid=5293 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Jan 14 00:10:40.995000 audit[5293]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe38ee490 a2=3 a3=0 items=0 ppid=1 pid=5293 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:41.004232 kernel: audit: type=1300 audit(1768349440.995:775): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe38ee490 a2=3 a3=0 items=0 
ppid=1 pid=5293 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:40.995000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:41.006192 kernel: audit: type=1327 audit(1768349440.995:775): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:41.008689 systemd-logind[1545]: New session 13 of user core. Jan 14 00:10:41.013760 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 14 00:10:41.018000 audit[5293]: USER_START pid=5293 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:41.027132 kernel: audit: type=1105 audit(1768349441.018:776): pid=5293 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:41.027266 kernel: audit: type=1103 audit(1768349441.022:777): pid=5297 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:41.022000 audit[5297]: CRED_ACQ pid=5297 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:41.333956 kubelet[2832]: E0114 00:10:41.333810 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:10:41.382429 sshd[5297]: Connection closed by 4.153.228.146 port 36434 Jan 14 00:10:41.384705 sshd-session[5293]: pam_unix(sshd:session): session closed for user core Jan 14 00:10:41.386000 audit[5293]: USER_END pid=5293 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:41.386000 audit[5293]: CRED_DISP pid=5293 uid=0 auid=500 
ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:41.394560 kernel: audit: type=1106 audit(1768349441.386:778): pid=5293 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:41.394736 systemd[1]: sshd@12-46.224.77.139:22-4.153.228.146:36434.service: Deactivated successfully. Jan 14 00:10:41.395000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-46.224.77.139:22-4.153.228.146:36434 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:41.399057 systemd[1]: session-13.scope: Deactivated successfully. Jan 14 00:10:41.401586 systemd-logind[1545]: Session 13 logged out. Waiting for processes to exit. Jan 14 00:10:41.404585 systemd-logind[1545]: Removed session 13. Jan 14 00:10:46.338588 kubelet[2832]: E0114 00:10:46.338288 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:10:46.506739 kernel: kauditd_printk_skb: 2 callbacks suppressed Jan 14 00:10:46.506856 kernel: audit: type=1130 audit(1768349446.502:781): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-46.224.77.139:22-4.153.228.146:37260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:46.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-46.224.77.139:22-4.153.228.146:37260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:46.503876 systemd[1]: Started sshd@13-46.224.77.139:22-4.153.228.146:37260.service - OpenSSH per-connection server daemon (4.153.228.146:37260). 
Jan 14 00:10:47.078000 audit[5313]: USER_ACCT pid=5313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.083006 sshd[5313]: Accepted publickey for core from 4.153.228.146 port 37260 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:10:47.087325 kernel: audit: type=1101 audit(1768349447.078:782): pid=5313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.087416 kernel: audit: type=1103 audit(1768349447.082:783): pid=5313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.082000 audit[5313]: CRED_ACQ pid=5313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.085488 sshd-session[5313]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:10:47.089633 kernel: audit: type=1006 audit(1768349447.082:784): pid=5313 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 14 00:10:47.082000 audit[5313]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffff55ee90 a2=3 a3=0 items=0 ppid=1 pid=5313 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:47.082000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:47.097472 kernel: audit: type=1300 audit(1768349447.082:784): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffff55ee90 a2=3 a3=0 items=0 ppid=1 pid=5313 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:47.098006 kernel: audit: type=1327 audit(1768349447.082:784): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:47.097152 systemd-logind[1545]: New session 14 of user core. Jan 14 00:10:47.103817 systemd[1]: Started session-14.scope - Session 14 of User core. 
Jan 14 00:10:47.110000 audit[5313]: USER_START pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.116692 kernel: audit: type=1105 audit(1768349447.110:785): pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.115000 audit[5317]: CRED_ACQ pid=5317 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.119632 kernel: audit: type=1103 audit(1768349447.115:786): pid=5317 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.525600 sshd[5317]: Connection closed by 4.153.228.146 port 37260 Jan 14 00:10:47.527783 sshd-session[5313]: pam_unix(sshd:session): session closed for user core Jan 14 00:10:47.528000 audit[5313]: USER_END pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.533588 systemd[1]: sshd@13-46.224.77.139:22-4.153.228.146:37260.service: Deactivated successfully. Jan 14 00:10:47.528000 audit[5313]: CRED_DISP pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.539211 kernel: audit: type=1106 audit(1768349447.528:787): pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.539351 kernel: audit: type=1104 audit(1768349447.528:788): pid=5313 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:47.538988 systemd[1]: session-14.scope: Deactivated successfully. Jan 14 00:10:47.532000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-46.224.77.139:22-4.153.228.146:37260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:47.542096 systemd-logind[1545]: Session 14 logged out. Waiting for processes to exit. Jan 14 00:10:47.543845 systemd-logind[1545]: Removed session 14. 
Jan 14 00:10:52.336432 kubelet[2832]: E0114 00:10:52.336382 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:10:52.338159 kubelet[2832]: E0114 00:10:52.337996 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:10:52.341363 kubelet[2832]: E0114 00:10:52.341305 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:10:52.343342 kubelet[2832]: E0114 00:10:52.343300 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:10:52.638863 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:10:52.638962 kernel: audit: type=1130 audit(1768349452.634:790): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-46.224.77.139:22-4.153.228.146:37264 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:10:52.634000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-46.224.77.139:22-4.153.228.146:37264 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:52.635027 systemd[1]: Started sshd@14-46.224.77.139:22-4.153.228.146:37264.service - OpenSSH per-connection server daemon (4.153.228.146:37264). Jan 14 00:10:53.183000 audit[5329]: USER_ACCT pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.187732 sshd[5329]: Accepted publickey for core from 4.153.228.146 port 37264 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:10:53.187996 sshd-session[5329]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:10:53.186000 audit[5329]: CRED_ACQ pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.188643 kernel: audit: type=1101 audit(1768349453.183:791): pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.192793 kernel: audit: type=1103 audit(1768349453.186:792): pid=5329 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.192880 kernel: audit: type=1006 audit(1768349453.186:793): pid=5329 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Jan 14 00:10:53.192900 kernel: audit: type=1300 audit(1768349453.186:793): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffda514de0 a2=3 a3=0 items=0 ppid=1 pid=5329 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:53.186000 audit[5329]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffda514de0 a2=3 a3=0 items=0 ppid=1 pid=5329 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:53.196480 kernel: audit: type=1327 audit(1768349453.186:793): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:53.186000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:53.198327 systemd-logind[1545]: New session 15 of user core. Jan 14 00:10:53.204786 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 14 00:10:53.210000 audit[5329]: USER_START pid=5329 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.212000 audit[5333]: CRED_ACQ pid=5333 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.217403 kernel: audit: type=1105 audit(1768349453.210:794): pid=5329 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.217559 kernel: audit: type=1103 audit(1768349453.212:795): pid=5333 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.333913 kubelet[2832]: E0114 00:10:53.333838 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:10:53.560373 sshd[5333]: Connection closed by 4.153.228.146 port 37264 Jan 14 00:10:53.560245 sshd-session[5329]: pam_unix(sshd:session): session closed for user core Jan 14 00:10:53.565000 audit[5329]: USER_END pid=5329 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.567000 audit[5329]: CRED_DISP pid=5329 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.572551 kernel: audit: type=1106 audit(1768349453.565:796): pid=5329 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.572606 kernel: audit: type=1104 audit(1768349453.567:797): pid=5329 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:53.573216 systemd[1]: 
sshd@14-46.224.77.139:22-4.153.228.146:37264.service: Deactivated successfully. Jan 14 00:10:53.572000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-46.224.77.139:22-4.153.228.146:37264 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:53.576310 systemd[1]: session-15.scope: Deactivated successfully. Jan 14 00:10:53.580679 systemd-logind[1545]: Session 15 logged out. Waiting for processes to exit. Jan 14 00:10:53.583798 systemd-logind[1545]: Removed session 15. Jan 14 00:10:58.670331 systemd[1]: Started sshd@15-46.224.77.139:22-4.153.228.146:54574.service - OpenSSH per-connection server daemon (4.153.228.146:54574). Jan 14 00:10:58.673637 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:10:58.673755 kernel: audit: type=1130 audit(1768349458.669:799): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-46.224.77.139:22-4.153.228.146:54574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:58.669000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-46.224.77.139:22-4.153.228.146:54574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:59.229000 audit[5348]: USER_ACCT pid=5348 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.235541 kernel: audit: type=1101 audit(1768349459.229:800): pid=5348 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.235659 sshd[5348]: Accepted publickey for core from 4.153.228.146 port 54574 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:10:59.236000 audit[5348]: CRED_ACQ pid=5348 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.240929 sshd-session[5348]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:10:59.243960 kernel: audit: type=1103 audit(1768349459.236:801): pid=5348 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.244073 kernel: audit: type=1006 audit(1768349459.239:802): pid=5348 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 14 00:10:59.239000 audit[5348]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe27561b0 a2=3 a3=0 items=0 ppid=1 pid=5348 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:59.247468 kernel: audit: type=1300 audit(1768349459.239:802): arch=c00000b7 
syscall=64 success=yes exit=3 a0=8 a1=ffffe27561b0 a2=3 a3=0 items=0 ppid=1 pid=5348 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:10:59.239000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:59.250951 kernel: audit: type=1327 audit(1768349459.239:802): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:10:59.253726 systemd-logind[1545]: New session 16 of user core. Jan 14 00:10:59.258722 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 14 00:10:59.263000 audit[5348]: USER_START pid=5348 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.268000 audit[5352]: CRED_ACQ pid=5352 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.270694 kernel: audit: type=1105 audit(1768349459.263:803): pid=5348 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.270810 kernel: audit: type=1103 audit(1768349459.268:804): pid=5352 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.334443 kubelet[2832]: E0114 00:10:59.334395 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:10:59.642620 sshd[5352]: Connection closed by 4.153.228.146 port 54574 Jan 14 00:10:59.643601 sshd-session[5348]: pam_unix(sshd:session): session closed for user core Jan 14 00:10:59.646000 audit[5348]: USER_END pid=5348 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.646000 audit[5348]: CRED_DISP pid=5348 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.651379 kernel: audit: type=1106 audit(1768349459.646:805): pid=5348 uid=0 auid=500 
ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.651433 kernel: audit: type=1104 audit(1768349459.646:806): pid=5348 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:10:59.651872 systemd[1]: sshd@15-46.224.77.139:22-4.153.228.146:54574.service: Deactivated successfully. Jan 14 00:10:59.653000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-46.224.77.139:22-4.153.228.146:54574 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:10:59.656381 systemd[1]: session-16.scope: Deactivated successfully. Jan 14 00:10:59.658591 systemd-logind[1545]: Session 16 logged out. Waiting for processes to exit. Jan 14 00:10:59.662601 systemd-logind[1545]: Removed session 16. Jan 14 00:11:03.334700 kubelet[2832]: E0114 00:11:03.334566 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:11:04.335442 kubelet[2832]: E0114 00:11:04.335272 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:11:04.336939 kubelet[2832]: E0114 00:11:04.336895 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:11:04.752086 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:04.752195 kernel: audit: type=1130 audit(1768349464.750:808): pid=1 
uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-46.224.77.139:22-4.153.228.146:39122 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:04.750000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-46.224.77.139:22-4.153.228.146:39122 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:04.750657 systemd[1]: Started sshd@16-46.224.77.139:22-4.153.228.146:39122.service - OpenSSH per-connection server daemon (4.153.228.146:39122). Jan 14 00:11:05.282959 sshd[5390]: Accepted publickey for core from 4.153.228.146 port 39122 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:05.282000 audit[5390]: USER_ACCT pid=5390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.286715 sshd-session[5390]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:05.289585 kernel: audit: type=1101 audit(1768349465.282:809): pid=5390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.289666 kernel: audit: type=1103 audit(1768349465.285:810): pid=5390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.285000 audit[5390]: CRED_ACQ pid=5390 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.292182 kernel: audit: type=1006 audit(1768349465.285:811): pid=5390 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Jan 14 00:11:05.294780 kernel: audit: type=1300 audit(1768349465.285:811): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdf6b8f30 a2=3 a3=0 items=0 ppid=1 pid=5390 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:05.285000 audit[5390]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdf6b8f30 a2=3 a3=0 items=0 ppid=1 pid=5390 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:05.285000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:05.295995 kernel: audit: type=1327 audit(1768349465.285:811): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:05.299557 systemd-logind[1545]: New session 17 of user core. Jan 14 00:11:05.305734 systemd[1]: Started session-17.scope - Session 17 of User core. 
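The PROCTITLE audit records above carry the process title as a hex string (proctitle=737368642D73657373696F6E3A20636F7265205B707269765D). A minimal Python sketch for decoding such a field while reading this log; the sample value is copied verbatim from the records above, and auditd separates argv entries with NUL bytes, which the replace() turns back into spaces:

# Decode an audit PROCTITLE value; argv entries are NUL-separated in the raw record.
proctitle_hex = "737368642D73657373696F6E3A20636F7265205B707269765D"  # from the records above
decoded = bytes.fromhex(proctitle_hex).replace(b"\x00", b" ").decode("utf-8", errors="replace")
print(decoded)  # -> sshd-session: core [priv]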
Jan 14 00:11:05.311000 audit[5390]: USER_START pid=5390 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.315000 audit[5395]: CRED_ACQ pid=5395 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.318453 kernel: audit: type=1105 audit(1768349465.311:812): pid=5390 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.318534 kernel: audit: type=1103 audit(1768349465.315:813): pid=5395 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.335798 kubelet[2832]: E0114 00:11:05.335677 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:11:05.663764 sshd[5395]: Connection closed by 4.153.228.146 port 39122 Jan 14 00:11:05.664320 sshd-session[5390]: pam_unix(sshd:session): session closed for user core Jan 14 00:11:05.668000 audit[5390]: USER_END pid=5390 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.676263 systemd[1]: sshd@16-46.224.77.139:22-4.153.228.146:39122.service: Deactivated successfully. 
Jan 14 00:11:05.668000 audit[5390]: CRED_DISP pid=5390 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.678300 kernel: audit: type=1106 audit(1768349465.668:814): pid=5390 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.678436 kernel: audit: type=1104 audit(1768349465.668:815): pid=5390 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:05.678000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-46.224.77.139:22-4.153.228.146:39122 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:05.683737 systemd[1]: session-17.scope: Deactivated successfully. Jan 14 00:11:05.688876 systemd-logind[1545]: Session 17 logged out. Waiting for processes to exit. Jan 14 00:11:05.690853 systemd-logind[1545]: Removed session 17. Jan 14 00:11:07.335152 kubelet[2832]: E0114 00:11:07.334731 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:11:10.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-46.224.77.139:22-4.153.228.146:39136 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:10.784044 systemd[1]: Started sshd@17-46.224.77.139:22-4.153.228.146:39136.service - OpenSSH per-connection server daemon (4.153.228.146:39136). Jan 14 00:11:10.786804 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:10.786889 kernel: audit: type=1130 audit(1768349470.782:817): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-46.224.77.139:22-4.153.228.146:39136 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:11:11.360000 audit[5409]: USER_ACCT pid=5409 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.367106 sshd[5409]: Accepted publickey for core from 4.153.228.146 port 39136 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:11.365000 audit[5409]: CRED_ACQ pid=5409 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.369778 kernel: audit: type=1101 audit(1768349471.360:818): pid=5409 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.369832 kernel: audit: type=1103 audit(1768349471.365:819): pid=5409 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.368939 sshd-session[5409]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:11.365000 audit[5409]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdfebf450 a2=3 a3=0 items=0 ppid=1 pid=5409 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:11.381078 kernel: audit: type=1006 audit(1768349471.365:820): pid=5409 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Jan 14 00:11:11.381127 kernel: audit: type=1300 audit(1768349471.365:820): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdfebf450 a2=3 a3=0 items=0 ppid=1 pid=5409 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:11.377734 systemd-logind[1545]: New session 18 of user core. Jan 14 00:11:11.365000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:11.382671 kernel: audit: type=1327 audit(1768349471.365:820): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:11.384748 systemd[1]: Started session-18.scope - Session 18 of User core. 
Jan 14 00:11:11.387000 audit[5409]: USER_START pid=5409 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.393000 audit[5413]: CRED_ACQ pid=5413 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.397264 kernel: audit: type=1105 audit(1768349471.387:821): pid=5409 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.397323 kernel: audit: type=1103 audit(1768349471.393:822): pid=5413 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.769968 sshd[5413]: Connection closed by 4.153.228.146 port 39136 Jan 14 00:11:11.771499 sshd-session[5409]: pam_unix(sshd:session): session closed for user core Jan 14 00:11:11.771000 audit[5409]: USER_END pid=5409 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.771000 audit[5409]: CRED_DISP pid=5409 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.778351 kernel: audit: type=1106 audit(1768349471.771:823): pid=5409 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.778410 kernel: audit: type=1104 audit(1768349471.771:824): pid=5409 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:11.777000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-46.224.77.139:22-4.153.228.146:39136 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:11.778859 systemd[1]: sshd@17-46.224.77.139:22-4.153.228.146:39136.service: Deactivated successfully. Jan 14 00:11:11.781788 systemd[1]: session-18.scope: Deactivated successfully. Jan 14 00:11:11.785228 systemd-logind[1545]: Session 18 logged out. Waiting for processes to exit. Jan 14 00:11:11.788019 systemd-logind[1545]: Removed session 18. 
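Each audit record above is keyed by an identifier of the form audit(<epoch-seconds>:<serial>), e.g. audit(1768349471.771:823) in the USER_END record for session 18: the first field is a Unix timestamp, the second an event serial number. A small Python sketch (value copied from the log) that converts one back into the UTC wall-clock time shown at the start of each journal line:

from datetime import datetime, timezone

event_id = "1768349471.771:823"  # from the USER_END record above
ts, serial = event_id.split(":")
when = datetime.fromtimestamp(float(ts), tz=timezone.utc)
print(when.isoformat(), "serial", serial)  # -> 2026-01-14T00:11:11.771000+00:00 serial 823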
Jan 14 00:11:13.334259 kubelet[2832]: E0114 00:11:13.334085 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:11:15.334016 kubelet[2832]: E0114 00:11:15.333707 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:11:16.884792 systemd[1]: Started sshd@18-46.224.77.139:22-4.153.228.146:56144.service - OpenSSH per-connection server daemon (4.153.228.146:56144). Jan 14 00:11:16.887757 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:16.887789 kernel: audit: type=1130 audit(1768349476.883:826): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-46.224.77.139:22-4.153.228.146:56144 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:16.883000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-46.224.77.139:22-4.153.228.146:56144 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:11:17.459000 audit[5426]: USER_ACCT pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.463697 sshd[5426]: Accepted publickey for core from 4.153.228.146 port 56144 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:17.463000 audit[5426]: CRED_ACQ pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.466224 kernel: audit: type=1101 audit(1768349477.459:827): pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.466328 kernel: audit: type=1103 audit(1768349477.463:828): pid=5426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.467155 sshd-session[5426]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:17.468227 kernel: audit: type=1006 audit(1768349477.465:829): pid=5426 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1 Jan 14 00:11:17.465000 audit[5426]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeec9f8d0 a2=3 a3=0 items=0 ppid=1 pid=5426 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:17.470863 kernel: audit: type=1300 audit(1768349477.465:829): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeec9f8d0 a2=3 a3=0 items=0 ppid=1 pid=5426 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:17.465000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:17.472126 kernel: audit: type=1327 audit(1768349477.465:829): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:17.475585 systemd-logind[1545]: New session 19 of user core. Jan 14 00:11:17.482816 systemd[1]: Started session-19.scope - Session 19 of User core. 
Jan 14 00:11:17.485000 audit[5426]: USER_START pid=5426 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.489562 kernel: audit: type=1105 audit(1768349477.485:830): pid=5426 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.489000 audit[5430]: CRED_ACQ pid=5430 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.492581 kernel: audit: type=1103 audit(1768349477.489:831): pid=5430 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.856555 sshd[5430]: Connection closed by 4.153.228.146 port 56144 Jan 14 00:11:17.857096 sshd-session[5426]: pam_unix(sshd:session): session closed for user core Jan 14 00:11:17.859000 audit[5426]: USER_END pid=5426 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.860000 audit[5426]: CRED_DISP pid=5426 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.865530 kernel: audit: type=1106 audit(1768349477.859:832): pid=5426 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.865614 kernel: audit: type=1104 audit(1768349477.860:833): pid=5426 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:17.864013 systemd[1]: sshd@18-46.224.77.139:22-4.153.228.146:56144.service: Deactivated successfully. Jan 14 00:11:17.865000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-46.224.77.139:22-4.153.228.146:56144 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:17.870097 systemd[1]: session-19.scope: Deactivated successfully. Jan 14 00:11:17.875109 systemd-logind[1545]: Session 19 logged out. Waiting for processes to exit. Jan 14 00:11:17.878592 systemd-logind[1545]: Removed session 19. 
Jan 14 00:11:18.336893 kubelet[2832]: E0114 00:11:18.336670 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:11:18.338787 kubelet[2832]: E0114 00:11:18.338739 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:11:20.336787 kubelet[2832]: E0114 00:11:20.336583 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:11:21.334361 kubelet[2832]: E0114 00:11:21.333725 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:11:22.965000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-46.224.77.139:22-4.153.228.146:56154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:22.965613 systemd[1]: Started sshd@19-46.224.77.139:22-4.153.228.146:56154.service - OpenSSH per-connection server daemon (4.153.228.146:56154). 
Jan 14 00:11:22.968616 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:22.968696 kernel: audit: type=1130 audit(1768349482.965:835): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-46.224.77.139:22-4.153.228.146:56154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:23.516000 audit[5450]: USER_ACCT pid=5450 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.518002 sshd[5450]: Accepted publickey for core from 4.153.228.146 port 56154 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:23.519000 audit[5450]: CRED_ACQ pid=5450 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.521727 sshd-session[5450]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:23.522502 kernel: audit: type=1101 audit(1768349483.516:836): pid=5450 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.522604 kernel: audit: type=1103 audit(1768349483.519:837): pid=5450 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.524873 kernel: audit: type=1006 audit(1768349483.519:838): pid=5450 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1 Jan 14 00:11:23.519000 audit[5450]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc5841c90 a2=3 a3=0 items=0 ppid=1 pid=5450 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:23.527540 kernel: audit: type=1300 audit(1768349483.519:838): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc5841c90 a2=3 a3=0 items=0 ppid=1 pid=5450 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:23.519000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:23.528814 kernel: audit: type=1327 audit(1768349483.519:838): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:23.534604 systemd-logind[1545]: New session 20 of user core. Jan 14 00:11:23.539759 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 14 00:11:23.544000 audit[5450]: USER_START pid=5450 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.548555 kernel: audit: type=1105 audit(1768349483.544:839): pid=5450 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.548000 audit[5456]: CRED_ACQ pid=5456 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.551561 kernel: audit: type=1103 audit(1768349483.548:840): pid=5456 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.913169 sshd[5456]: Connection closed by 4.153.228.146 port 56154 Jan 14 00:11:23.914751 sshd-session[5450]: pam_unix(sshd:session): session closed for user core Jan 14 00:11:23.916000 audit[5450]: USER_END pid=5450 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.923411 systemd-logind[1545]: Session 20 logged out. Waiting for processes to exit. Jan 14 00:11:23.926695 kernel: audit: type=1106 audit(1768349483.916:841): pid=5450 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.926785 kernel: audit: type=1104 audit(1768349483.916:842): pid=5450 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.916000 audit[5450]: CRED_DISP pid=5450 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:23.927254 systemd[1]: sshd@19-46.224.77.139:22-4.153.228.146:56154.service: Deactivated successfully. Jan 14 00:11:23.929000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-46.224.77.139:22-4.153.228.146:56154 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:23.934575 systemd[1]: session-20.scope: Deactivated successfully. Jan 14 00:11:23.940502 systemd-logind[1545]: Removed session 20. 
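Each SSH connection in this log is handled by a dedicated per-connection systemd unit whose instance name embeds the listener and the peer, for example sshd@19-46.224.77.139:22-4.153.228.146:56154.service in the records above. A hypothetical helper for pulling those fields out when correlating units with audit records; the pattern only covers the IPv4 form that appears in this log:

import re

# Instance format seen in this log: sshd@<n>-<local-addr>:<local-port>-<peer-addr>:<peer-port>.service
UNIT_RE = re.compile(
    r"sshd@(?P<n>\d+)-(?P<laddr>[^:]+):(?P<lport>\d+)-(?P<raddr>[^:]+):(?P<rport>\d+)\.service"
)

unit = "sshd@19-46.224.77.139:22-4.153.228.146:56154.service"  # from the records above
m = UNIT_RE.match(unit)
print(m.group("raddr"), m.group("rport"))  # -> 4.153.228.146 56154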
Jan 14 00:11:28.338353 containerd[1590]: time="2026-01-14T00:11:28.338258203Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:11:28.675181 containerd[1590]: time="2026-01-14T00:11:28.674759403Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:11:28.675991 containerd[1590]: time="2026-01-14T00:11:28.675925945Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:11:28.676166 containerd[1590]: time="2026-01-14T00:11:28.676011780Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:11:28.676749 kubelet[2832]: E0114 00:11:28.676623 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:11:28.676749 kubelet[2832]: E0114 00:11:28.676681 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:11:28.677844 kubelet[2832]: E0114 00:11:28.676981 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cr65d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-7bt2c_calico-apiserver(3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:11:28.678014 containerd[1590]: time="2026-01-14T00:11:28.677588422Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 00:11:28.678662 kubelet[2832]: E0114 00:11:28.678201 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:11:29.004792 containerd[1590]: time="2026-01-14T00:11:29.004175319Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:11:29.011087 containerd[1590]: time="2026-01-14T00:11:29.010957432Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 00:11:29.011232 containerd[1590]: time="2026-01-14T00:11:29.011040028Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 00:11:29.011920 kubelet[2832]: E0114 00:11:29.011701 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:11:29.011920 kubelet[2832]: E0114 00:11:29.011752 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:11:29.012448 kubelet[2832]: E0114 00:11:29.012278 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r74wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b44cc6f4-gxl6c_calico-system(5ee70bb0-55b7-4a80-b5cb-3133091615ae): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 00:11:29.013828 kubelet[2832]: E0114 00:11:29.013767 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:11:29.029335 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:29.029467 kernel: audit: type=1130 audit(1768349489.025:844): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='unit=sshd@20-46.224.77.139:22-4.153.228.146:53082 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:29.025000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-46.224.77.139:22-4.153.228.146:53082 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:29.026803 systemd[1]: Started sshd@20-46.224.77.139:22-4.153.228.146:53082.service - OpenSSH per-connection server daemon (4.153.228.146:53082). Jan 14 00:11:29.334811 containerd[1590]: time="2026-01-14T00:11:29.334679949Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 00:11:29.575000 audit[5469]: USER_ACCT pid=5469 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.580703 sshd[5469]: Accepted publickey for core from 4.153.228.146 port 53082 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:29.580000 audit[5469]: CRED_ACQ pid=5469 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.583734 sshd-session[5469]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:29.583984 kernel: audit: type=1101 audit(1768349489.575:845): pid=5469 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.584020 kernel: audit: type=1103 audit(1768349489.580:846): pid=5469 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.584038 kernel: audit: type=1006 audit(1768349489.580:847): pid=5469 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 14 00:11:29.580000 audit[5469]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffda90e440 a2=3 a3=0 items=0 ppid=1 pid=5469 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:29.587966 kernel: audit: type=1300 audit(1768349489.580:847): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffda90e440 a2=3 a3=0 items=0 ppid=1 pid=5469 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:29.589485 kernel: audit: type=1327 audit(1768349489.580:847): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:29.580000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:29.595855 systemd-logind[1545]: New session 21 of user core. Jan 14 00:11:29.604706 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 14 00:11:29.608000 audit[5469]: USER_START pid=5469 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.616564 kernel: audit: type=1105 audit(1768349489.608:848): pid=5469 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.616669 kernel: audit: type=1103 audit(1768349489.612:849): pid=5473 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.612000 audit[5473]: CRED_ACQ pid=5473 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.659810 containerd[1590]: time="2026-01-14T00:11:29.659627886Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:11:29.661198 containerd[1590]: time="2026-01-14T00:11:29.661078616Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 00:11:29.661198 containerd[1590]: time="2026-01-14T00:11:29.661138774Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 00:11:29.661468 kubelet[2832]: E0114 00:11:29.661432 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:11:29.661640 kubelet[2832]: E0114 00:11:29.661585 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:11:29.662187 kubelet[2832]: E0114 00:11:29.661802 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 00:11:29.662336 containerd[1590]: time="2026-01-14T00:11:29.661964574Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:11:29.960327 sshd[5473]: Connection closed by 4.153.228.146 port 53082 Jan 14 00:11:29.961248 sshd-session[5469]: pam_unix(sshd:session): session closed for user core Jan 14 00:11:29.963000 audit[5469]: USER_END pid=5469 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.963000 audit[5469]: CRED_DISP pid=5469 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.969420 kernel: audit: type=1106 audit(1768349489.963:850): pid=5469 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.969560 kernel: audit: type=1104 audit(1768349489.963:851): pid=5469 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:29.971511 systemd[1]: sshd@20-46.224.77.139:22-4.153.228.146:53082.service: Deactivated successfully. Jan 14 00:11:29.970000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-46.224.77.139:22-4.153.228.146:53082 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:29.975010 systemd[1]: session-21.scope: Deactivated successfully. Jan 14 00:11:29.977727 containerd[1590]: time="2026-01-14T00:11:29.977562442Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:11:29.979538 containerd[1590]: time="2026-01-14T00:11:29.978833221Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:11:29.979300 systemd-logind[1545]: Session 21 logged out. Waiting for processes to exit. Jan 14 00:11:29.981929 containerd[1590]: time="2026-01-14T00:11:29.978971374Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:11:29.981929 containerd[1590]: time="2026-01-14T00:11:29.981692683Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 00:11:29.982025 kubelet[2832]: E0114 00:11:29.979878 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:11:29.982025 kubelet[2832]: E0114 00:11:29.979922 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:11:29.982025 kubelet[2832]: E0114 00:11:29.980126 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md5mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-vdxqm_calico-apiserver(1e4bec8e-a684-46cb-852e-ae05ed7b56d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:11:29.982025 kubelet[2832]: E0114 00:11:29.981543 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:11:29.982467 systemd-logind[1545]: Removed session 21. 
Jan 14 00:11:30.323188 containerd[1590]: time="2026-01-14T00:11:30.322663599Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:11:30.324213 containerd[1590]: time="2026-01-14T00:11:30.324163489Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 00:11:30.325011 containerd[1590]: time="2026-01-14T00:11:30.324840057Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 00:11:30.325549 kubelet[2832]: E0114 00:11:30.325323 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:11:30.325549 kubelet[2832]: E0114 00:11:30.325398 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:11:30.325827 kubelet[2832]: E0114 00:11:30.325764 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 00:11:30.326998 kubelet[2832]: E0114 00:11:30.326929 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:11:31.333343 containerd[1590]: time="2026-01-14T00:11:31.333067897Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 00:11:31.667593 containerd[1590]: time="2026-01-14T00:11:31.667533108Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:11:31.669045 containerd[1590]: time="2026-01-14T00:11:31.668984802Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 00:11:31.669131 containerd[1590]: time="2026-01-14T00:11:31.669079158Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 00:11:31.669325 kubelet[2832]: E0114 00:11:31.669270 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:11:31.669647 kubelet[2832]: E0114 00:11:31.669341 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:11:31.669647 kubelet[2832]: E0114 00:11:31.669544 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c290d008f4c4d48b25c8570357599ee,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 00:11:31.687172 containerd[1590]: time="2026-01-14T00:11:31.687116861Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 00:11:32.020422 containerd[1590]: time="2026-01-14T00:11:32.020291958Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:11:32.022079 containerd[1590]: time="2026-01-14T00:11:32.022004363Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 00:11:32.022184 containerd[1590]: time="2026-01-14T00:11:32.022091599Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 00:11:32.022922 kubelet[2832]: E0114 00:11:32.022859 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:11:32.022972 kubelet[2832]: E0114 00:11:32.022942 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:11:32.023181 kubelet[2832]: E0114 00:11:32.023112 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 00:11:32.025596 kubelet[2832]: E0114 00:11:32.025043 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:11:34.336593 containerd[1590]: time="2026-01-14T00:11:34.336544945Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 00:11:34.677285 containerd[1590]: time="2026-01-14T00:11:34.677119228Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:11:34.678777 containerd[1590]: time="2026-01-14T00:11:34.678719242Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 
00:11:34.678902 containerd[1590]: time="2026-01-14T00:11:34.678812958Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 00:11:34.679075 kubelet[2832]: E0114 00:11:34.679043 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:11:34.680893 kubelet[2832]: E0114 00:11:34.680358 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:11:34.680893 kubelet[2832]: E0114 00:11:34.680534 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p28dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-hrn72_calico-system(1e53bd66-4746-482e-bb2b-bfd29a1ef20e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 00:11:34.682033 kubelet[2832]: E0114 00:11:34.681999 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:11:35.077464 systemd[1]: Started sshd@21-46.224.77.139:22-4.153.228.146:43420.service - OpenSSH per-connection server daemon (4.153.228.146:43420). Jan 14 00:11:35.076000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-46.224.77.139:22-4.153.228.146:43420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:35.078812 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:35.078842 kernel: audit: type=1130 audit(1768349495.076:853): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-46.224.77.139:22-4.153.228.146:43420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:11:35.641000 audit[5512]: USER_ACCT pid=5512 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:35.642623 sshd[5512]: Accepted publickey for core from 4.153.228.146 port 43420 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:35.644582 kernel: audit: type=1101 audit(1768349495.641:854): pid=5512 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:35.647978 kernel: audit: type=1103 audit(1768349495.644:855): pid=5512 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:35.644000 audit[5512]: CRED_ACQ pid=5512 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:35.645816 sshd-session[5512]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:35.650292 kernel: audit: type=1006 audit(1768349495.644:856): pid=5512 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 14 00:11:35.652540 kernel: audit: type=1300 audit(1768349495.644:856): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd45ac040 a2=3 a3=0 items=0 ppid=1 pid=5512 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:35.644000 audit[5512]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd45ac040 a2=3 a3=0 items=0 ppid=1 pid=5512 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:35.644000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:35.654618 kernel: audit: type=1327 audit(1768349495.644:856): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:35.661859 systemd-logind[1545]: New session 22 of user core. Jan 14 00:11:35.667735 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 14 00:11:35.672000 audit[5512]: USER_START pid=5512 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:35.676000 audit[5516]: CRED_ACQ pid=5516 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:35.681551 kernel: audit: type=1105 audit(1768349495.672:857): pid=5512 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:35.681641 kernel: audit: type=1103 audit(1768349495.676:858): pid=5516 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:36.033350 sshd[5516]: Connection closed by 4.153.228.146 port 43420 Jan 14 00:11:36.033671 sshd-session[5512]: pam_unix(sshd:session): session closed for user core Jan 14 00:11:36.036000 audit[5512]: USER_END pid=5512 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:36.039554 kernel: audit: type=1106 audit(1768349496.036:859): pid=5512 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:36.039000 audit[5512]: CRED_DISP pid=5512 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:36.042564 kernel: audit: type=1104 audit(1768349496.039:860): pid=5512 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:36.044297 systemd[1]: session-22.scope: Deactivated successfully. Jan 14 00:11:36.046410 systemd[1]: sshd@21-46.224.77.139:22-4.153.228.146:43420.service: Deactivated successfully. Jan 14 00:11:36.046000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-46.224.77.139:22-4.153.228.146:43420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:36.055836 systemd-logind[1545]: Session 22 logged out. Waiting for processes to exit. Jan 14 00:11:36.058136 systemd-logind[1545]: Removed session 22. 
Jan 14 00:11:40.335308 kubelet[2832]: E0114 00:11:40.335115 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:11:41.141485 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:41.141639 kernel: audit: type=1130 audit(1768349501.137:862): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-46.224.77.139:22-4.153.228.146:43432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:41.137000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-46.224.77.139:22-4.153.228.146:43432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:41.137951 systemd[1]: Started sshd@22-46.224.77.139:22-4.153.228.146:43432.service - OpenSSH per-connection server daemon (4.153.228.146:43432). Jan 14 00:11:41.690128 sshd[5530]: Accepted publickey for core from 4.153.228.146 port 43432 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:41.689000 audit[5530]: USER_ACCT pid=5530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:41.697566 kernel: audit: type=1101 audit(1768349501.689:863): pid=5530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:41.697688 kernel: audit: type=1103 audit(1768349501.692:864): pid=5530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:41.697713 kernel: audit: type=1006 audit(1768349501.694:865): pid=5530 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Jan 14 00:11:41.692000 audit[5530]: CRED_ACQ pid=5530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:41.695878 sshd-session[5530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:41.694000 audit[5530]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc70d55d0 a2=3 a3=0 items=0 ppid=1 pid=5530 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:41.701544 kernel: audit: type=1300 
audit(1768349501.694:865): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc70d55d0 a2=3 a3=0 items=0 ppid=1 pid=5530 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:41.705941 kernel: audit: type=1327 audit(1768349501.694:865): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:41.694000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:41.727633 systemd-logind[1545]: New session 23 of user core. Jan 14 00:11:41.731997 systemd[1]: Started session-23.scope - Session 23 of User core. Jan 14 00:11:41.741000 audit[5530]: USER_START pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:41.745000 audit[5541]: CRED_ACQ pid=5541 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:41.748697 kernel: audit: type=1105 audit(1768349501.741:866): pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:41.748797 kernel: audit: type=1103 audit(1768349501.745:867): pid=5541 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:42.080158 sshd[5541]: Connection closed by 4.153.228.146 port 43432 Jan 14 00:11:42.081543 sshd-session[5530]: pam_unix(sshd:session): session closed for user core Jan 14 00:11:42.084000 audit[5530]: USER_END pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:42.088000 audit[5530]: CRED_DISP pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:42.093795 kernel: audit: type=1106 audit(1768349502.084:868): pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:42.093877 kernel: audit: type=1104 audit(1768349502.088:869): pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:42.094839 systemd[1]: sshd@22-46.224.77.139:22-4.153.228.146:43432.service: Deactivated successfully. Jan 14 00:11:42.095000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-46.224.77.139:22-4.153.228.146:43432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:42.098647 systemd[1]: session-23.scope: Deactivated successfully. Jan 14 00:11:42.102745 systemd-logind[1545]: Session 23 logged out. Waiting for processes to exit. Jan 14 00:11:42.105234 systemd-logind[1545]: Removed session 23. Jan 14 00:11:42.339573 kubelet[2832]: E0114 00:11:42.339165 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:11:42.341069 kubelet[2832]: E0114 00:11:42.340986 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:11:43.334016 kubelet[2832]: E0114 00:11:43.333893 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:11:43.335764 kubelet[2832]: E0114 00:11:43.335728 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:11:45.335100 kubelet[2832]: E0114 00:11:45.334710 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:11:47.198378 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:47.198498 kernel: audit: type=1130 audit(1768349507.195:871): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-46.224.77.139:22-4.153.228.146:57260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:47.195000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-46.224.77.139:22-4.153.228.146:57260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:47.195813 systemd[1]: Started sshd@23-46.224.77.139:22-4.153.228.146:57260.service - OpenSSH per-connection server daemon (4.153.228.146:57260). Jan 14 00:11:47.755000 audit[5571]: USER_ACCT pid=5571 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:47.759693 sshd[5571]: Accepted publickey for core from 4.153.228.146 port 57260 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:47.760000 audit[5571]: CRED_ACQ pid=5571 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:47.763245 kernel: audit: type=1101 audit(1768349507.755:872): pid=5571 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:47.763318 kernel: audit: type=1103 audit(1768349507.760:873): pid=5571 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:47.764012 sshd-session[5571]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:47.765727 kernel: audit: type=1006 audit(1768349507.762:874): pid=5571 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Jan 14 00:11:47.762000 audit[5571]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcbc84f60 a2=3 a3=0 items=0 ppid=1 pid=5571 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:47.768287 kernel: audit: type=1300 audit(1768349507.762:874): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcbc84f60 a2=3 a3=0 items=0 ppid=1 pid=5571 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:47.771013 kernel: audit: type=1327 audit(1768349507.762:874): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:47.762000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:47.775533 systemd-logind[1545]: New session 24 of user core. Jan 14 00:11:47.779939 systemd[1]: Started session-24.scope - Session 24 of User core. Jan 14 00:11:47.783000 audit[5571]: USER_START pid=5571 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:47.788000 audit[5575]: CRED_ACQ pid=5575 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:47.790943 kernel: audit: type=1105 audit(1768349507.783:875): pid=5571 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:47.791008 kernel: audit: type=1103 audit(1768349507.788:876): pid=5575 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:48.167886 sshd[5575]: Connection closed by 4.153.228.146 port 57260 Jan 14 00:11:48.168575 sshd-session[5571]: pam_unix(sshd:session): session closed for user core Jan 14 00:11:48.169000 audit[5571]: USER_END pid=5571 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:48.169000 audit[5571]: CRED_DISP pid=5571 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:48.174856 systemd[1]: sshd@23-46.224.77.139:22-4.153.228.146:57260.service: Deactivated successfully. 
Jan 14 00:11:48.176293 kernel: audit: type=1106 audit(1768349508.169:877): pid=5571 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:48.176366 kernel: audit: type=1104 audit(1768349508.169:878): pid=5571 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:48.174000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-46.224.77.139:22-4.153.228.146:57260 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:48.178794 systemd[1]: session-24.scope: Deactivated successfully. Jan 14 00:11:48.181599 systemd-logind[1545]: Session 24 logged out. Waiting for processes to exit. Jan 14 00:11:48.183853 systemd-logind[1545]: Removed session 24. Jan 14 00:11:52.334402 kubelet[2832]: E0114 00:11:52.334025 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:11:53.276000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-46.224.77.139:22-4.153.228.146:57274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:53.276857 systemd[1]: Started sshd@24-46.224.77.139:22-4.153.228.146:57274.service - OpenSSH per-connection server daemon (4.153.228.146:57274). Jan 14 00:11:53.277663 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:53.277707 kernel: audit: type=1130 audit(1768349513.276:880): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-46.224.77.139:22-4.153.228.146:57274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:11:53.336606 kubelet[2832]: E0114 00:11:53.336553 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:11:53.824000 audit[5595]: USER_ACCT pid=5595 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:53.826656 sshd[5595]: Accepted publickey for core from 4.153.228.146 port 57274 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:53.828592 kernel: audit: type=1101 audit(1768349513.824:881): pid=5595 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:53.828000 audit[5595]: CRED_ACQ pid=5595 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:53.832863 sshd-session[5595]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:53.835664 kernel: audit: type=1103 audit(1768349513.828:882): pid=5595 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:53.836043 kernel: audit: type=1006 audit(1768349513.828:883): pid=5595 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Jan 14 00:11:53.828000 audit[5595]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed13d110 a2=3 a3=0 items=0 ppid=1 pid=5595 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:53.838663 kernel: audit: type=1300 audit(1768349513.828:883): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed13d110 a2=3 a3=0 items=0 ppid=1 pid=5595 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:53.839722 kernel: audit: type=1327 audit(1768349513.828:883): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:53.828000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:53.843551 systemd-logind[1545]: New session 25 of user core. Jan 14 00:11:53.845936 systemd[1]: Started session-25.scope - Session 25 of User core. 
Jan 14 00:11:53.849000 audit[5595]: USER_START pid=5595 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:53.854789 kernel: audit: type=1105 audit(1768349513.849:884): pid=5595 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:53.854000 audit[5601]: CRED_ACQ pid=5601 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:53.859553 kernel: audit: type=1103 audit(1768349513.854:885): pid=5601 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:54.240841 sshd[5601]: Connection closed by 4.153.228.146 port 57274 Jan 14 00:11:54.241782 sshd-session[5595]: pam_unix(sshd:session): session closed for user core Jan 14 00:11:54.243000 audit[5595]: USER_END pid=5595 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:54.248979 systemd[1]: sshd@24-46.224.77.139:22-4.153.228.146:57274.service: Deactivated successfully. Jan 14 00:11:54.243000 audit[5595]: CRED_DISP pid=5595 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:54.252451 kernel: audit: type=1106 audit(1768349514.243:886): pid=5595 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:54.252546 kernel: audit: type=1104 audit(1768349514.243:887): pid=5595 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:54.253466 systemd[1]: session-25.scope: Deactivated successfully. Jan 14 00:11:54.248000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-46.224.77.139:22-4.153.228.146:57274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:54.256046 systemd-logind[1545]: Session 25 logged out. Waiting for processes to exit. Jan 14 00:11:54.257314 systemd-logind[1545]: Removed session 25. 
Jan 14 00:11:56.338551 kubelet[2832]: E0114 00:11:56.338166 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:11:57.335149 kubelet[2832]: E0114 00:11:57.335048 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:11:58.339192 kubelet[2832]: E0114 00:11:58.339000 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:11:58.340464 kubelet[2832]: E0114 00:11:58.339544 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:11:59.363847 systemd[1]: Started sshd@25-46.224.77.139:22-4.153.228.146:40500.service - OpenSSH per-connection server daemon (4.153.228.146:40500). Jan 14 00:11:59.367056 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:11:59.367106 kernel: audit: type=1130 audit(1768349519.362:889): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-46.224.77.139:22-4.153.228.146:40500 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Jan 14 00:11:59.362000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-46.224.77.139:22-4.153.228.146:40500 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:11:59.914000 audit[5614]: USER_ACCT pid=5614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:59.916809 sshd[5614]: Accepted publickey for core from 4.153.228.146 port 40500 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:11:59.918000 audit[5614]: CRED_ACQ pid=5614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:59.920602 sshd-session[5614]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:11:59.922550 kernel: audit: type=1101 audit(1768349519.914:890): pid=5614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:59.922646 kernel: audit: type=1103 audit(1768349519.918:891): pid=5614 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:59.925557 kernel: audit: type=1006 audit(1768349519.918:892): pid=5614 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Jan 14 00:11:59.925633 kernel: audit: type=1300 audit(1768349519.918:892): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffaf4bb60 a2=3 a3=0 items=0 ppid=1 pid=5614 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:59.918000 audit[5614]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffaf4bb60 a2=3 a3=0 items=0 ppid=1 pid=5614 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:11:59.918000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:59.928993 kernel: audit: type=1327 audit(1768349519.918:892): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:11:59.933506 systemd-logind[1545]: New session 26 of user core. Jan 14 00:11:59.941873 systemd[1]: Started session-26.scope - Session 26 of User core. 
Jan 14 00:11:59.946000 audit[5614]: USER_START pid=5614 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:59.950000 audit[5618]: CRED_ACQ pid=5618 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:59.953916 kernel: audit: type=1105 audit(1768349519.946:893): pid=5614 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:11:59.953999 kernel: audit: type=1103 audit(1768349519.950:894): pid=5618 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:00.374665 sshd[5618]: Connection closed by 4.153.228.146 port 40500 Jan 14 00:12:00.375272 sshd-session[5614]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:00.376000 audit[5614]: USER_END pid=5614 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:00.376000 audit[5614]: CRED_DISP pid=5614 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:00.383592 kernel: audit: type=1106 audit(1768349520.376:895): pid=5614 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:00.383682 kernel: audit: type=1104 audit(1768349520.376:896): pid=5614 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:00.383962 systemd[1]: sshd@25-46.224.77.139:22-4.153.228.146:40500.service: Deactivated successfully. Jan 14 00:12:00.382000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@25-46.224.77.139:22-4.153.228.146:40500 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:00.388297 systemd[1]: session-26.scope: Deactivated successfully. Jan 14 00:12:00.390700 systemd-logind[1545]: Session 26 logged out. Waiting for processes to exit. Jan 14 00:12:00.394874 systemd-logind[1545]: Removed session 26. 
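Each audit event emitted by sshd-session appears twice in this journal: once as the raw record (USER_ACCT, CRED_ACQ, SYSCALL, ...) and once echoed through the kernel ring buffer as "audit: type=NNNN". The PROCTITLE record in those echoes is hex-encoded; the value repeated throughout decodes to the privileged sshd process title. A small decoding sketch (the helper name is illustrative):

# The audit PROCTITLE value is the process title as hex, NUL-separated argv in
# the general case; replacing NULs with spaces makes it readable.
def decode_proctitle(hexstr: str) -> str:
    raw = bytes.fromhex(hexstr)
    return raw.replace(b"\x00", b" ").decode("utf-8", errors="replace")

print(decode_proctitle("737368642D73657373696F6E3A20636F7265205B707269765D"))
# -> sshd-session: core [priv]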
Jan 14 00:12:04.336941 kubelet[2832]: E0114 00:12:04.336889 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:12:05.477000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-46.224.77.139:22-4.153.228.146:43208 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:05.479011 systemd[1]: Started sshd@26-46.224.77.139:22-4.153.228.146:43208.service - OpenSSH per-connection server daemon (4.153.228.146:43208). Jan 14 00:12:05.479588 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:12:05.479632 kernel: audit: type=1130 audit(1768349525.477:898): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-46.224.77.139:22-4.153.228.146:43208 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:06.027000 audit[5655]: USER_ACCT pid=5655 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.031903 sshd[5655]: Accepted publickey for core from 4.153.228.146 port 43208 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:12:06.034114 sshd-session[5655]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:12:06.031000 audit[5655]: CRED_ACQ pid=5655 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.036330 kernel: audit: type=1101 audit(1768349526.027:899): pid=5655 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.036408 kernel: audit: type=1103 audit(1768349526.031:900): pid=5655 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.037735 kernel: audit: type=1006 audit(1768349526.031:901): pid=5655 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=27 res=1 Jan 14 00:12:06.031000 audit[5655]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff8964700 a2=3 a3=0 items=0 ppid=1 pid=5655 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:06.040537 kernel: audit: type=1300 
audit(1768349526.031:901): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff8964700 a2=3 a3=0 items=0 ppid=1 pid=5655 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=27 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:06.031000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:06.042538 kernel: audit: type=1327 audit(1768349526.031:901): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:06.046239 systemd-logind[1545]: New session 27 of user core. Jan 14 00:12:06.052793 systemd[1]: Started session-27.scope - Session 27 of User core. Jan 14 00:12:06.055000 audit[5655]: USER_START pid=5655 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.061652 kernel: audit: type=1105 audit(1768349526.055:902): pid=5655 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.060000 audit[5659]: CRED_ACQ pid=5659 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.065758 kernel: audit: type=1103 audit(1768349526.060:903): pid=5659 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.423900 sshd[5659]: Connection closed by 4.153.228.146 port 43208 Jan 14 00:12:06.424440 sshd-session[5655]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:06.426000 audit[5655]: USER_END pid=5655 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.426000 audit[5655]: CRED_DISP pid=5655 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.432213 kernel: audit: type=1106 audit(1768349526.426:904): pid=5655 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.432294 kernel: audit: type=1104 audit(1768349526.426:905): pid=5655 uid=0 auid=500 ses=27 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:06.433820 systemd[1]: sshd@26-46.224.77.139:22-4.153.228.146:43208.service: Deactivated successfully. Jan 14 00:12:06.432000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@26-46.224.77.139:22-4.153.228.146:43208 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:06.436885 systemd[1]: session-27.scope: Deactivated successfully. Jan 14 00:12:06.441195 systemd-logind[1545]: Session 27 logged out. Waiting for processes to exit. Jan 14 00:12:06.443759 systemd-logind[1545]: Removed session 27. Jan 14 00:12:08.334652 kubelet[2832]: E0114 00:12:08.334603 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:12:08.338746 kubelet[2832]: E0114 00:12:08.338694 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:12:10.338020 kubelet[2832]: E0114 00:12:10.337300 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:12:10.338020 kubelet[2832]: E0114 00:12:10.337621 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:12:11.534997 systemd[1]: Started sshd@27-46.224.77.139:22-4.153.228.146:43216.service - OpenSSH per-connection server daemon 
(4.153.228.146:43216). Jan 14 00:12:11.533000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-46.224.77.139:22-4.153.228.146:43216 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:11.538862 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:12:11.538953 kernel: audit: type=1130 audit(1768349531.533:907): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-46.224.77.139:22-4.153.228.146:43216 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:12.082088 sshd[5672]: Accepted publickey for core from 4.153.228.146 port 43216 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:12:12.079000 audit[5672]: USER_ACCT pid=5672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.087099 sshd-session[5672]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:12:12.083000 audit[5672]: CRED_ACQ pid=5672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.089006 kernel: audit: type=1101 audit(1768349532.079:908): pid=5672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.089085 kernel: audit: type=1103 audit(1768349532.083:909): pid=5672 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.092690 kernel: audit: type=1006 audit(1768349532.083:910): pid=5672 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=28 res=1 Jan 14 00:12:12.092766 kernel: audit: type=1300 audit(1768349532.083:910): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffecf4e710 a2=3 a3=0 items=0 ppid=1 pid=5672 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:12.083000 audit[5672]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffecf4e710 a2=3 a3=0 items=0 ppid=1 pid=5672 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=28 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:12.083000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:12.098611 kernel: audit: type=1327 audit(1768349532.083:910): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:12.102772 systemd-logind[1545]: New session 28 of user core. Jan 14 00:12:12.107764 systemd[1]: Started session-28.scope - Session 28 of User core. 
Jan 14 00:12:12.110000 audit[5672]: USER_START pid=5672 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.116553 kernel: audit: type=1105 audit(1768349532.110:911): pid=5672 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.115000 audit[5676]: CRED_ACQ pid=5676 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.119589 kernel: audit: type=1103 audit(1768349532.115:912): pid=5676 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.338040 kubelet[2832]: E0114 00:12:12.337846 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:12:12.487957 sshd[5676]: Connection closed by 4.153.228.146 port 43216 Jan 14 00:12:12.488787 sshd-session[5672]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:12.490000 audit[5672]: USER_END pid=5672 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.490000 audit[5672]: CRED_DISP pid=5672 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.498284 kernel: audit: type=1106 audit(1768349532.490:913): pid=5672 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh 
res=success' Jan 14 00:12:12.498392 kernel: audit: type=1104 audit(1768349532.490:914): pid=5672 uid=0 auid=500 ses=28 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:12.498759 systemd[1]: sshd@27-46.224.77.139:22-4.153.228.146:43216.service: Deactivated successfully. Jan 14 00:12:12.497000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@27-46.224.77.139:22-4.153.228.146:43216 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:12.501847 systemd[1]: session-28.scope: Deactivated successfully. Jan 14 00:12:12.505947 systemd-logind[1545]: Session 28 logged out. Waiting for processes to exit. Jan 14 00:12:12.507118 systemd-logind[1545]: Removed session 28. Jan 14 00:12:15.333966 kubelet[2832]: E0114 00:12:15.333889 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:12:17.603000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-46.224.77.139:22-4.153.228.146:40864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:17.604305 systemd[1]: Started sshd@28-46.224.77.139:22-4.153.228.146:40864.service - OpenSSH per-connection server daemon (4.153.228.146:40864). Jan 14 00:12:17.607820 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:12:17.607892 kernel: audit: type=1130 audit(1768349537.603:916): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-46.224.77.139:22-4.153.228.146:40864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:12:18.145000 audit[5690]: USER_ACCT pid=5690 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.151021 kernel: audit: type=1101 audit(1768349538.145:917): pid=5690 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.151261 sshd[5690]: Accepted publickey for core from 4.153.228.146 port 40864 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:12:18.150000 audit[5690]: CRED_ACQ pid=5690 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.155294 sshd-session[5690]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:12:18.158663 kernel: audit: type=1103 audit(1768349538.150:918): pid=5690 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.158746 kernel: audit: type=1006 audit(1768349538.153:919): pid=5690 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=29 res=1 Jan 14 00:12:18.153000 audit[5690]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcf9dd600 a2=3 a3=0 items=0 ppid=1 pid=5690 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:18.161186 kernel: audit: type=1300 audit(1768349538.153:919): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcf9dd600 a2=3 a3=0 items=0 ppid=1 pid=5690 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=29 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:18.153000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:18.162311 kernel: audit: type=1327 audit(1768349538.153:919): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:18.166743 systemd-logind[1545]: New session 29 of user core. Jan 14 00:12:18.172560 systemd[1]: Started session-29.scope - Session 29 of User core. 
Jan 14 00:12:18.178000 audit[5690]: USER_START pid=5690 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.183557 kernel: audit: type=1105 audit(1768349538.178:920): pid=5690 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.182000 audit[5694]: CRED_ACQ pid=5694 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.186542 kernel: audit: type=1103 audit(1768349538.182:921): pid=5694 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.565604 sshd[5694]: Connection closed by 4.153.228.146 port 40864 Jan 14 00:12:18.565321 sshd-session[5690]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:18.566000 audit[5690]: USER_END pid=5690 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.575971 systemd-logind[1545]: Session 29 logged out. Waiting for processes to exit. Jan 14 00:12:18.578084 systemd[1]: sshd@28-46.224.77.139:22-4.153.228.146:40864.service: Deactivated successfully. Jan 14 00:12:18.571000 audit[5690]: CRED_DISP pid=5690 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.578638 kernel: audit: type=1106 audit(1768349538.566:922): pid=5690 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.582313 systemd[1]: session-29.scope: Deactivated successfully. Jan 14 00:12:18.578000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@28-46.224.77.139:22-4.153.228.146:40864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:18.584567 kernel: audit: type=1104 audit(1768349538.571:923): pid=5690 uid=0 auid=500 ses=29 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:18.587638 systemd-logind[1545]: Removed session 29. 
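The numeric audit types printed by the kernel line up one-to-one with the record names in the same journal: 1101 appears next to USER_ACCT, 1103 next to CRED_ACQ, 1105/1106 next to USER_START/USER_END, 1104 next to CRED_DISP, 1130 next to SERVICE_START, 1300 next to SYSCALL and 1327 next to PROCTITLE. A sketch that counts records per type using that mapping, against the same hypothetical node.log copy as above (1006 = LOGIN and 1131 = SERVICE_STOP are taken from the standard audit numbering, since those names are not printed next to a number here):

import re
from collections import Counter

AUDIT_TYPES = {
    1006: "LOGIN", 1101: "USER_ACCT", 1103: "CRED_ACQ", 1104: "CRED_DISP",
    1105: "USER_START", 1106: "USER_END", 1130: "SERVICE_START",
    1131: "SERVICE_STOP", 1300: "SYSCALL", 1327: "PROCTITLE",
}
TYPE_RE = re.compile(r"audit: type=(\d+)")

counts = Counter()
with open("node.log", encoding="utf-8") as journal:  # hypothetical saved copy of this journal
    for line in journal:
        for audit_type in TYPE_RE.findall(line):
            counts[int(audit_type)] += 1

for audit_type, seen in sorted(counts.items()):
    print(f"type={audit_type:<5} {AUDIT_TYPES.get(audit_type, 'UNKNOWN'):<14} x{seen}")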
Jan 14 00:12:20.336255 kubelet[2832]: E0114 00:12:20.336029 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:12:23.335205 kubelet[2832]: E0114 00:12:23.334815 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:12:23.337095 kubelet[2832]: E0114 00:12:23.335315 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:12:23.674299 systemd[1]: Started sshd@29-46.224.77.139:22-4.153.228.146:40868.service - OpenSSH per-connection server daemon (4.153.228.146:40868). Jan 14 00:12:23.677803 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:12:23.677876 kernel: audit: type=1130 audit(1768349543.673:925): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-46.224.77.139:22-4.153.228.146:40868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:23.673000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-46.224.77.139:22-4.153.228.146:40868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:12:24.241000 audit[5708]: USER_ACCT pid=5708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.246558 sshd[5708]: Accepted publickey for core from 4.153.228.146 port 40868 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:12:24.249941 kernel: audit: type=1101 audit(1768349544.241:926): pid=5708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.250045 kernel: audit: type=1103 audit(1768349544.245:927): pid=5708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.245000 audit[5708]: CRED_ACQ pid=5708 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.250929 sshd-session[5708]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:12:24.252867 kernel: audit: type=1006 audit(1768349544.245:928): pid=5708 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=30 res=1 Jan 14 00:12:24.252931 kernel: audit: type=1300 audit(1768349544.245:928): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffeb7cfd0 a2=3 a3=0 items=0 ppid=1 pid=5708 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:24.245000 audit[5708]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffeb7cfd0 a2=3 a3=0 items=0 ppid=1 pid=5708 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=30 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:24.245000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:24.256335 kernel: audit: type=1327 audit(1768349544.245:928): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:24.262852 systemd-logind[1545]: New session 30 of user core. Jan 14 00:12:24.271843 systemd[1]: Started session-30.scope - Session 30 of User core. 
Jan 14 00:12:24.276000 audit[5708]: USER_START pid=5708 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.284862 kernel: audit: type=1105 audit(1768349544.276:929): pid=5708 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.284974 kernel: audit: type=1103 audit(1768349544.279:930): pid=5712 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.279000 audit[5712]: CRED_ACQ pid=5712 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.637158 sshd[5712]: Connection closed by 4.153.228.146 port 40868 Jan 14 00:12:24.635576 sshd-session[5708]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:24.638000 audit[5708]: USER_END pid=5708 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.638000 audit[5708]: CRED_DISP pid=5708 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.646721 kernel: audit: type=1106 audit(1768349544.638:931): pid=5708 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.646788 kernel: audit: type=1104 audit(1768349544.638:932): pid=5708 uid=0 auid=500 ses=30 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:24.645048 systemd[1]: sshd@29-46.224.77.139:22-4.153.228.146:40868.service: Deactivated successfully. Jan 14 00:12:24.647917 systemd[1]: session-30.scope: Deactivated successfully. Jan 14 00:12:24.643000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@29-46.224.77.139:22-4.153.228.146:40868 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:24.652170 systemd-logind[1545]: Session 30 logged out. Waiting for processes to exit. Jan 14 00:12:24.654882 systemd-logind[1545]: Removed session 30. 
Jan 14 00:12:25.333947 kubelet[2832]: E0114 00:12:25.333901 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:12:27.335291 kubelet[2832]: E0114 00:12:27.334754 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:12:28.335924 kubelet[2832]: E0114 00:12:28.335487 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:12:29.745449 systemd[1]: Started sshd@30-46.224.77.139:22-4.153.228.146:44092.service - OpenSSH per-connection server daemon (4.153.228.146:44092). Jan 14 00:12:29.748453 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:12:29.748484 kernel: audit: type=1130 audit(1768349549.744:934): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-46.224.77.139:22-4.153.228.146:44092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:29.744000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-46.224.77.139:22-4.153.228.146:44092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:12:30.290000 audit[5725]: USER_ACCT pid=5725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.293885 sshd[5725]: Accepted publickey for core from 4.153.228.146 port 44092 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:12:30.297737 kernel: audit: type=1101 audit(1768349550.290:935): pid=5725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.297852 kernel: audit: type=1103 audit(1768349550.296:936): pid=5725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.296000 audit[5725]: CRED_ACQ pid=5725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.299359 sshd-session[5725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:12:30.300632 kernel: audit: type=1006 audit(1768349550.296:937): pid=5725 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=31 res=1 Jan 14 00:12:30.296000 audit[5725]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe537e1a0 a2=3 a3=0 items=0 ppid=1 pid=5725 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:30.303183 kernel: audit: type=1300 audit(1768349550.296:937): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe537e1a0 a2=3 a3=0 items=0 ppid=1 pid=5725 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=31 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:30.296000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:30.304532 kernel: audit: type=1327 audit(1768349550.296:937): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:30.309634 systemd-logind[1545]: New session 31 of user core. Jan 14 00:12:30.316767 systemd[1]: Started session-31.scope - Session 31 of User core. 
Jan 14 00:12:30.321000 audit[5725]: USER_START pid=5725 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.325000 audit[5729]: CRED_ACQ pid=5729 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.328712 kernel: audit: type=1105 audit(1768349550.321:938): pid=5725 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.328782 kernel: audit: type=1103 audit(1768349550.325:939): pid=5729 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.680926 sshd[5729]: Connection closed by 4.153.228.146 port 44092 Jan 14 00:12:30.681822 sshd-session[5725]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:30.683000 audit[5725]: USER_END pid=5725 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.683000 audit[5725]: CRED_DISP pid=5725 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.694776 kernel: audit: type=1106 audit(1768349550.683:940): pid=5725 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.694887 kernel: audit: type=1104 audit(1768349550.683:941): pid=5725 uid=0 auid=500 ses=31 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:30.692439 systemd[1]: sshd@30-46.224.77.139:22-4.153.228.146:44092.service: Deactivated successfully. Jan 14 00:12:30.693000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@30-46.224.77.139:22-4.153.228.146:44092 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:30.697749 systemd[1]: session-31.scope: Deactivated successfully. Jan 14 00:12:30.699296 systemd-logind[1545]: Session 31 logged out. Waiting for processes to exit. Jan 14 00:12:30.701724 systemd-logind[1545]: Removed session 31. 
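From session 25 onward the pattern is identical: sshd accepts a public key for core from 4.153.228.146, systemd-logind opens a numbered session, and the session is closed again well under a second later. A sketch that pairs the logind open/close lines and prints each session's lifetime, again assuming the journal is saved one record per line as node.log; the year is not logged, so 2026 (from the kernel build banner) is assumed for parsing:

import re
from datetime import datetime

TS_FORMAT = "%Y %b %d %H:%M:%S.%f"  # journal timestamps carry microseconds but no year
NEW_RE = re.compile(r"(\w+ +\d+ [\d:.]+) systemd-logind\[\d+\]: New session (\d+) of user (\S+)\.")
REMOVED_RE = re.compile(r"(\w+ +\d+ [\d:.]+) systemd-logind\[\d+\]: Removed session (\d+)\.")

def parse_ts(stamp: str) -> datetime:
    return datetime.strptime("2026 " + stamp, TS_FORMAT)  # assumed year

opened = {}
with open("node.log", encoding="utf-8") as journal:  # hypothetical saved copy of this journal
    for line in journal:
        if m := NEW_RE.search(line):
            opened[m.group(2)] = (parse_ts(m.group(1)), m.group(3))
        elif (m := REMOVED_RE.search(line)) and m.group(2) in opened:
            start, user = opened.pop(m.group(2))
            seconds = (parse_ts(m.group(1)) - start).total_seconds()
            print(f"session {m.group(2)} ({user}): {seconds:.3f}s")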
Jan 14 00:12:32.333475 kubelet[2832]: E0114 00:12:32.333120 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:12:34.337397 kubelet[2832]: E0114 00:12:34.337335 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:12:35.798547 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:12:35.798691 kernel: audit: type=1130 audit(1768349555.794:943): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-46.224.77.139:22-4.153.228.146:42678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:35.794000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-46.224.77.139:22-4.153.228.146:42678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:35.795767 systemd[1]: Started sshd@31-46.224.77.139:22-4.153.228.146:42678.service - OpenSSH per-connection server daemon (4.153.228.146:42678). 
Jan 14 00:12:36.339018 kubelet[2832]: E0114 00:12:36.338957 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:12:36.353080 sshd[5766]: Accepted publickey for core from 4.153.228.146 port 42678 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:12:36.351000 audit[5766]: USER_ACCT pid=5766 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.357705 sshd-session[5766]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:12:36.360285 kernel: audit: type=1101 audit(1768349556.351:944): pid=5766 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.360410 kernel: audit: type=1103 audit(1768349556.355:945): pid=5766 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.355000 audit[5766]: CRED_ACQ pid=5766 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.364015 kernel: audit: type=1006 audit(1768349556.355:946): pid=5766 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=32 res=1 Jan 14 00:12:36.355000 audit[5766]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdddcd5c0 a2=3 a3=0 items=0 ppid=1 pid=5766 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:36.371896 systemd-logind[1545]: New session 32 of user core. 
Jan 14 00:12:36.374543 kernel: audit: type=1300 audit(1768349556.355:946): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdddcd5c0 a2=3 a3=0 items=0 ppid=1 pid=5766 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=32 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:36.374687 kernel: audit: type=1327 audit(1768349556.355:946): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:36.355000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:36.379113 systemd[1]: Started session-32.scope - Session 32 of User core. Jan 14 00:12:36.382000 audit[5766]: USER_START pid=5766 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.390569 kernel: audit: type=1105 audit(1768349556.382:947): pid=5766 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.389000 audit[5770]: CRED_ACQ pid=5770 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.393563 kernel: audit: type=1103 audit(1768349556.389:948): pid=5770 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.761688 sshd[5770]: Connection closed by 4.153.228.146 port 42678 Jan 14 00:12:36.762504 sshd-session[5766]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:36.763000 audit[5766]: USER_END pid=5766 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.763000 audit[5766]: CRED_DISP pid=5766 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.773772 kernel: audit: type=1106 audit(1768349556.763:949): pid=5766 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.773839 kernel: audit: type=1104 audit(1768349556.763:950): pid=5766 uid=0 auid=500 ses=32 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 
addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:36.774222 systemd[1]: sshd@31-46.224.77.139:22-4.153.228.146:42678.service: Deactivated successfully. Jan 14 00:12:36.772000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@31-46.224.77.139:22-4.153.228.146:42678 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:36.779250 systemd[1]: session-32.scope: Deactivated successfully. Jan 14 00:12:36.781434 systemd-logind[1545]: Session 32 logged out. Waiting for processes to exit. Jan 14 00:12:36.784787 systemd-logind[1545]: Removed session 32. Jan 14 00:12:38.086130 systemd[1770]: Created slice background.slice - User Background Tasks Slice. Jan 14 00:12:38.087687 systemd[1770]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories... Jan 14 00:12:38.115240 systemd[1770]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories. Jan 14 00:12:38.338437 kubelet[2832]: E0114 00:12:38.337818 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:12:39.334764 kubelet[2832]: E0114 00:12:39.334707 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:12:40.835498 containerd[1590]: time="2026-01-14T00:12:40.835355716Z" level=info msg="container event discarded" container=86c7de5fa85603f06dfa46a727e6625a5335e0d1b8360bdeeefacdb5e35823cc type=CONTAINER_CREATED_EVENT Jan 14 00:12:40.835498 containerd[1590]: time="2026-01-14T00:12:40.835473238Z" level=info msg="container event discarded" container=86c7de5fa85603f06dfa46a727e6625a5335e0d1b8360bdeeefacdb5e35823cc type=CONTAINER_STARTED_EVENT Jan 14 00:12:40.895931 containerd[1590]: time="2026-01-14T00:12:40.895797537Z" level=info msg="container event discarded" container=a48336692ce359755af31eb72390eaed09cfa4c74e52f92a54aae43335e2310b type=CONTAINER_CREATED_EVENT Jan 14 00:12:40.895931 containerd[1590]: time="2026-01-14T00:12:40.895876219Z" level=info msg="container event discarded" container=9af0c7d2661630c3775ff45822448f50355062151f77f37b2a32a278aaa3241c type=CONTAINER_CREATED_EVENT Jan 14 00:12:40.895931 containerd[1590]: time="2026-01-14T00:12:40.895889259Z" level=info 
msg="container event discarded" container=9af0c7d2661630c3775ff45822448f50355062151f77f37b2a32a278aaa3241c type=CONTAINER_STARTED_EVENT Jan 14 00:12:40.907120 containerd[1590]: time="2026-01-14T00:12:40.907057785Z" level=info msg="container event discarded" container=7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41 type=CONTAINER_CREATED_EVENT Jan 14 00:12:40.907120 containerd[1590]: time="2026-01-14T00:12:40.907119786Z" level=info msg="container event discarded" container=7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41 type=CONTAINER_STARTED_EVENT Jan 14 00:12:40.933199 containerd[1590]: time="2026-01-14T00:12:40.933120494Z" level=info msg="container event discarded" container=f4e1bb910e9ead011550006449a851c3c10e04a29594f69723d313440c15cd0c type=CONTAINER_CREATED_EVENT Jan 14 00:12:40.933347 containerd[1590]: time="2026-01-14T00:12:40.933239656Z" level=info msg="container event discarded" container=f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7 type=CONTAINER_CREATED_EVENT Jan 14 00:12:41.000489 containerd[1590]: time="2026-01-14T00:12:41.000425978Z" level=info msg="container event discarded" container=a48336692ce359755af31eb72390eaed09cfa4c74e52f92a54aae43335e2310b type=CONTAINER_STARTED_EVENT Jan 14 00:12:41.047736 containerd[1590]: time="2026-01-14T00:12:41.047648145Z" level=info msg="container event discarded" container=f4e1bb910e9ead011550006449a851c3c10e04a29594f69723d313440c15cd0c type=CONTAINER_STARTED_EVENT Jan 14 00:12:41.047736 containerd[1590]: time="2026-01-14T00:12:41.047720546Z" level=info msg="container event discarded" container=f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7 type=CONTAINER_STARTED_EVENT Jan 14 00:12:41.878000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-46.224.77.139:22-4.153.228.146:42692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:41.879790 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:12:41.879846 kernel: audit: type=1130 audit(1768349561.878:952): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-46.224.77.139:22-4.153.228.146:42692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:41.878915 systemd[1]: Started sshd@32-46.224.77.139:22-4.153.228.146:42692.service - OpenSSH per-connection server daemon (4.153.228.146:42692). 
Jan 14 00:12:42.436958 sshd[5785]: Accepted publickey for core from 4.153.228.146 port 42692 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:12:42.436000 audit[5785]: USER_ACCT pid=5785 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.442620 kernel: audit: type=1101 audit(1768349562.436:953): pid=5785 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.442696 kernel: audit: type=1103 audit(1768349562.439:954): pid=5785 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.439000 audit[5785]: CRED_ACQ pid=5785 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.440801 sshd-session[5785]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:12:42.444564 kernel: audit: type=1006 audit(1768349562.439:955): pid=5785 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=33 res=1 Jan 14 00:12:42.439000 audit[5785]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd355900 a2=3 a3=0 items=0 ppid=1 pid=5785 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:42.447103 kernel: audit: type=1300 audit(1768349562.439:955): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd355900 a2=3 a3=0 items=0 ppid=1 pid=5785 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=33 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:42.448339 kernel: audit: type=1327 audit(1768349562.439:955): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:42.439000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:42.452375 systemd-logind[1545]: New session 33 of user core. Jan 14 00:12:42.456866 systemd[1]: Started session-33.scope - Session 33 of User core. 
Jan 14 00:12:42.461000 audit[5785]: USER_START pid=5785 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.465000 audit[5789]: CRED_ACQ pid=5789 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.468541 kernel: audit: type=1105 audit(1768349562.461:956): pid=5785 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.468683 kernel: audit: type=1103 audit(1768349562.465:957): pid=5789 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.837271 sshd[5789]: Connection closed by 4.153.228.146 port 42692 Jan 14 00:12:42.838388 sshd-session[5785]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:42.840000 audit[5785]: USER_END pid=5785 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.840000 audit[5785]: CRED_DISP pid=5785 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.846686 kernel: audit: type=1106 audit(1768349562.840:958): pid=5785 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.846796 kernel: audit: type=1104 audit(1768349562.840:959): pid=5785 uid=0 auid=500 ses=33 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:42.847910 systemd[1]: sshd@32-46.224.77.139:22-4.153.228.146:42692.service: Deactivated successfully. Jan 14 00:12:42.848145 systemd-logind[1545]: Session 33 logged out. Waiting for processes to exit. Jan 14 00:12:42.849000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@32-46.224.77.139:22-4.153.228.146:42692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:42.854127 systemd[1]: session-33.scope: Deactivated successfully. Jan 14 00:12:42.857510 systemd-logind[1545]: Removed session 33. 
Jan 14 00:12:43.334706 kubelet[2832]: E0114 00:12:43.334130 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:12:45.333607 kubelet[2832]: E0114 00:12:45.333561 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:12:46.335545 kubelet[2832]: E0114 00:12:46.334578 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:12:47.959924 systemd[1]: Started sshd@33-46.224.77.139:22-4.153.228.146:55900.service - OpenSSH per-connection server daemon (4.153.228.146:55900). Jan 14 00:12:47.959000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-46.224.77.139:22-4.153.228.146:55900 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:47.960933 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:12:47.960975 kernel: audit: type=1130 audit(1768349567.959:961): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-46.224.77.139:22-4.153.228.146:55900 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:12:48.540000 audit[5805]: USER_ACCT pid=5805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:48.543741 sshd[5805]: Accepted publickey for core from 4.153.228.146 port 55900 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:12:48.544617 kernel: audit: type=1101 audit(1768349568.540:962): pid=5805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:48.544000 audit[5805]: CRED_ACQ pid=5805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:48.548067 sshd-session[5805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:12:48.549550 kernel: audit: type=1103 audit(1768349568.544:963): pid=5805 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:48.549612 kernel: audit: type=1006 audit(1768349568.547:964): pid=5805 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=34 res=1 Jan 14 00:12:48.549640 kernel: audit: type=1300 audit(1768349568.547:964): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd9e34630 a2=3 a3=0 items=0 ppid=1 pid=5805 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:48.547000 audit[5805]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd9e34630 a2=3 a3=0 items=0 ppid=1 pid=5805 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=34 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:48.554565 kernel: audit: type=1327 audit(1768349568.547:964): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:48.547000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:48.559752 systemd-logind[1545]: New session 34 of user core. Jan 14 00:12:48.563734 systemd[1]: Started session-34.scope - Session 34 of User core. 
Jan 14 00:12:48.567000 audit[5805]: USER_START pid=5805 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:48.572000 audit[5809]: CRED_ACQ pid=5809 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:48.574714 kernel: audit: type=1105 audit(1768349568.567:965): pid=5805 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:48.574790 kernel: audit: type=1103 audit(1768349568.572:966): pid=5809 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:49.000503 sshd[5809]: Connection closed by 4.153.228.146 port 55900 Jan 14 00:12:49.001363 sshd-session[5805]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:49.002000 audit[5805]: USER_END pid=5805 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:49.009591 systemd[1]: sshd@33-46.224.77.139:22-4.153.228.146:55900.service: Deactivated successfully. Jan 14 00:12:49.013567 kernel: audit: type=1106 audit(1768349569.002:967): pid=5805 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:49.013659 kernel: audit: type=1104 audit(1768349569.002:968): pid=5805 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:49.002000 audit[5805]: CRED_DISP pid=5805 uid=0 auid=500 ses=34 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:49.012605 systemd[1]: session-34.scope: Deactivated successfully. Jan 14 00:12:49.009000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@33-46.224.77.139:22-4.153.228.146:55900 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:49.015327 systemd-logind[1545]: Session 34 logged out. Waiting for processes to exit. Jan 14 00:12:49.018445 systemd-logind[1545]: Removed session 34. 
Jan 14 00:12:51.336837 kubelet[2832]: E0114 00:12:51.336764 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:12:51.337514 kubelet[2832]: E0114 00:12:51.336841 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:12:52.333843 kubelet[2832]: E0114 00:12:52.333793 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:12:52.991227 containerd[1590]: time="2026-01-14T00:12:52.991146808Z" level=info msg="container event discarded" container=35208f41759db0500ae0aa177be3e8109eda432e46c6515eec5c744db5099999 type=CONTAINER_CREATED_EVENT Jan 14 00:12:52.991227 containerd[1590]: time="2026-01-14T00:12:52.991198969Z" level=info msg="container event discarded" container=35208f41759db0500ae0aa177be3e8109eda432e46c6515eec5c744db5099999 type=CONTAINER_STARTED_EVENT Jan 14 00:12:53.038456 containerd[1590]: time="2026-01-14T00:12:53.038388297Z" level=info msg="container event discarded" container=db97c2a3acc1d308fd1d3b43dbc819c77f659ba68d45242dc0ebe7241dc5a9af type=CONTAINER_CREATED_EVENT Jan 14 00:12:53.148898 containerd[1590]: time="2026-01-14T00:12:53.148832851Z" level=info msg="container event discarded" container=db97c2a3acc1d308fd1d3b43dbc819c77f659ba68d45242dc0ebe7241dc5a9af type=CONTAINER_STARTED_EVENT Jan 14 00:12:53.210292 containerd[1590]: time="2026-01-14T00:12:53.210177034Z" level=info msg="container event discarded" 
container=e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495 type=CONTAINER_CREATED_EVENT Jan 14 00:12:53.210292 containerd[1590]: time="2026-01-14T00:12:53.210249075Z" level=info msg="container event discarded" container=e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495 type=CONTAINER_STARTED_EVENT Jan 14 00:12:54.111439 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:12:54.111578 kernel: audit: type=1130 audit(1768349574.107:970): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-46.224.77.139:22-4.153.228.146:55906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:54.107000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-46.224.77.139:22-4.153.228.146:55906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:54.107841 systemd[1]: Started sshd@34-46.224.77.139:22-4.153.228.146:55906.service - OpenSSH per-connection server daemon (4.153.228.146:55906). Jan 14 00:12:54.653002 sshd[5824]: Accepted publickey for core from 4.153.228.146 port 55906 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:12:54.652000 audit[5824]: USER_ACCT pid=5824 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:54.656635 sshd-session[5824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:12:54.655000 audit[5824]: CRED_ACQ pid=5824 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:54.659333 kernel: audit: type=1101 audit(1768349574.652:971): pid=5824 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:54.659409 kernel: audit: type=1103 audit(1768349574.655:972): pid=5824 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:54.659432 kernel: audit: type=1006 audit(1768349574.655:973): pid=5824 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=35 res=1 Jan 14 00:12:54.655000 audit[5824]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdbb36e60 a2=3 a3=0 items=0 ppid=1 pid=5824 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:12:54.663499 kernel: audit: type=1300 audit(1768349574.655:973): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdbb36e60 a2=3 a3=0 items=0 ppid=1 pid=5824 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=35 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 14 00:12:54.655000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:54.665584 kernel: audit: type=1327 audit(1768349574.655:973): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:12:54.669536 systemd-logind[1545]: New session 35 of user core. Jan 14 00:12:54.675749 systemd[1]: Started session-35.scope - Session 35 of User core. Jan 14 00:12:54.681000 audit[5824]: USER_START pid=5824 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:54.685562 kernel: audit: type=1105 audit(1768349574.681:974): pid=5824 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:54.688000 audit[5828]: CRED_ACQ pid=5828 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:54.692568 kernel: audit: type=1103 audit(1768349574.688:975): pid=5828 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:55.039348 sshd[5828]: Connection closed by 4.153.228.146 port 55906 Jan 14 00:12:55.040948 sshd-session[5824]: pam_unix(sshd:session): session closed for user core Jan 14 00:12:55.041000 audit[5824]: USER_END pid=5824 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:55.042000 audit[5824]: CRED_DISP pid=5824 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:55.050268 systemd[1]: sshd@34-46.224.77.139:22-4.153.228.146:55906.service: Deactivated successfully. 
Jan 14 00:12:55.051728 kernel: audit: type=1106 audit(1768349575.041:976): pid=5824 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:55.051935 kernel: audit: type=1104 audit(1768349575.042:977): pid=5824 uid=0 auid=500 ses=35 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:12:55.051000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@34-46.224.77.139:22-4.153.228.146:55906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:12:55.055472 systemd[1]: session-35.scope: Deactivated successfully. Jan 14 00:12:55.056760 systemd-logind[1545]: Session 35 logged out. Waiting for processes to exit. Jan 14 00:12:55.060851 systemd-logind[1545]: Removed session 35. Jan 14 00:12:55.572623 containerd[1590]: time="2026-01-14T00:12:55.572544102Z" level=info msg="container event discarded" container=400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b type=CONTAINER_CREATED_EVENT Jan 14 00:12:55.644087 containerd[1590]: time="2026-01-14T00:12:55.644012389Z" level=info msg="container event discarded" container=400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b type=CONTAINER_STARTED_EVENT Jan 14 00:12:56.337534 kubelet[2832]: E0114 00:12:56.336157 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:12:57.335268 kubelet[2832]: E0114 00:12:57.333636 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:12:58.341550 kubelet[2832]: E0114 00:12:58.340548 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:13:00.155000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 
subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-46.224.77.139:22-4.153.228.146:54418 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:00.155812 systemd[1]: Started sshd@35-46.224.77.139:22-4.153.228.146:54418.service - OpenSSH per-connection server daemon (4.153.228.146:54418). Jan 14 00:13:00.158722 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:13:00.158806 kernel: audit: type=1130 audit(1768349580.155:979): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-46.224.77.139:22-4.153.228.146:54418 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:00.695978 sshd[5841]: Accepted publickey for core from 4.153.228.146 port 54418 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:00.695000 audit[5841]: USER_ACCT pid=5841 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:00.700000 audit[5841]: CRED_ACQ pid=5841 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:00.703471 kernel: audit: type=1101 audit(1768349580.695:980): pid=5841 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:00.703576 kernel: audit: type=1103 audit(1768349580.700:981): pid=5841 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:00.704422 sshd-session[5841]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:00.706618 kernel: audit: type=1006 audit(1768349580.703:982): pid=5841 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=36 res=1 Jan 14 00:13:00.703000 audit[5841]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffa27ea30 a2=3 a3=0 items=0 ppid=1 pid=5841 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:00.710066 kernel: audit: type=1300 audit(1768349580.703:982): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffa27ea30 a2=3 a3=0 items=0 ppid=1 pid=5841 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=36 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:00.703000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:00.712538 kernel: audit: type=1327 audit(1768349580.703:982): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:00.716047 systemd-logind[1545]: New session 36 of user core. Jan 14 00:13:00.724759 systemd[1]: Started session-36.scope - Session 36 of User core. 
Jan 14 00:13:00.729000 audit[5841]: USER_START pid=5841 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:00.733610 kernel: audit: type=1105 audit(1768349580.729:983): pid=5841 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:00.733000 audit[5845]: CRED_ACQ pid=5845 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:00.736576 kernel: audit: type=1103 audit(1768349580.733:984): pid=5845 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:01.073824 sshd[5845]: Connection closed by 4.153.228.146 port 54418 Jan 14 00:13:01.072720 sshd-session[5841]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:01.075000 audit[5841]: USER_END pid=5841 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:01.075000 audit[5841]: CRED_DISP pid=5841 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:01.084183 kernel: audit: type=1106 audit(1768349581.075:985): pid=5841 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:01.084249 kernel: audit: type=1104 audit(1768349581.075:986): pid=5841 uid=0 auid=500 ses=36 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:01.086907 systemd-logind[1545]: Session 36 logged out. Waiting for processes to exit. Jan 14 00:13:01.086984 systemd[1]: sshd@35-46.224.77.139:22-4.153.228.146:54418.service: Deactivated successfully. Jan 14 00:13:01.088000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@35-46.224.77.139:22-4.153.228.146:54418 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:01.093291 systemd[1]: session-36.scope: Deactivated successfully. Jan 14 00:13:01.096362 systemd-logind[1545]: Removed session 36. 
Jan 14 00:13:06.180892 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:13:06.180996 kernel: audit: type=1130 audit(1768349586.177:988): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-46.224.77.139:22-4.153.228.146:42454 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:06.177000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-46.224.77.139:22-4.153.228.146:42454 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:06.178996 systemd[1]: Started sshd@36-46.224.77.139:22-4.153.228.146:42454.service - OpenSSH per-connection server daemon (4.153.228.146:42454). Jan 14 00:13:06.337352 kubelet[2832]: E0114 00:13:06.337304 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:13:06.341287 kubelet[2832]: E0114 00:13:06.341025 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:13:06.341287 kubelet[2832]: E0114 00:13:06.341185 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:13:06.716000 audit[5882]: USER_ACCT pid=5882 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 
terminal=ssh res=success' Jan 14 00:13:06.721559 kernel: audit: type=1101 audit(1768349586.716:989): pid=5882 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:06.721668 sshd[5882]: Accepted publickey for core from 4.153.228.146 port 42454 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:06.721000 audit[5882]: CRED_ACQ pid=5882 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:06.730231 kernel: audit: type=1103 audit(1768349586.721:990): pid=5882 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:06.730352 kernel: audit: type=1006 audit(1768349586.724:991): pid=5882 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=37 res=1 Jan 14 00:13:06.730541 kernel: audit: type=1300 audit(1768349586.724:991): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcc8ee7e0 a2=3 a3=0 items=0 ppid=1 pid=5882 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:06.724000 audit[5882]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcc8ee7e0 a2=3 a3=0 items=0 ppid=1 pid=5882 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=37 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:06.726097 sshd-session[5882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:06.724000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:06.732537 kernel: audit: type=1327 audit(1768349586.724:991): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:06.741118 systemd-logind[1545]: New session 37 of user core. Jan 14 00:13:06.745870 systemd[1]: Started session-37.scope - Session 37 of User core. 
Jan 14 00:13:06.750000 audit[5882]: USER_START pid=5882 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:06.753000 audit[5886]: CRED_ACQ pid=5886 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:06.758154 kernel: audit: type=1105 audit(1768349586.750:992): pid=5882 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:06.758275 kernel: audit: type=1103 audit(1768349586.753:993): pid=5886 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:07.098786 sshd[5886]: Connection closed by 4.153.228.146 port 42454 Jan 14 00:13:07.099717 sshd-session[5882]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:07.101000 audit[5882]: USER_END pid=5882 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:07.101000 audit[5882]: CRED_DISP pid=5882 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:07.109875 kernel: audit: type=1106 audit(1768349587.101:994): pid=5882 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:07.109937 kernel: audit: type=1104 audit(1768349587.101:995): pid=5882 uid=0 auid=500 ses=37 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:07.107902 systemd[1]: sshd@36-46.224.77.139:22-4.153.228.146:42454.service: Deactivated successfully. Jan 14 00:13:07.106000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@36-46.224.77.139:22-4.153.228.146:42454 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:07.111476 systemd[1]: session-37.scope: Deactivated successfully. Jan 14 00:13:07.118387 systemd-logind[1545]: Session 37 logged out. Waiting for processes to exit. Jan 14 00:13:07.120034 systemd-logind[1545]: Removed session 37. 
Jan 14 00:13:09.333986 kubelet[2832]: E0114 00:13:09.333919 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:13:11.334622 kubelet[2832]: E0114 00:13:11.333818 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:13:11.336150 kubelet[2832]: E0114 00:13:11.336100 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:13:12.209694 systemd[1]: Started sshd@37-46.224.77.139:22-4.153.228.146:42458.service - OpenSSH per-connection server daemon (4.153.228.146:42458). Jan 14 00:13:12.212249 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:13:12.212328 kernel: audit: type=1130 audit(1768349592.208:997): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-46.224.77.139:22-4.153.228.146:42458 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:12.208000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-46.224.77.139:22-4.153.228.146:42458 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:13:12.748000 audit[5899]: USER_ACCT pid=5899 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:12.754100 sshd[5899]: Accepted publickey for core from 4.153.228.146 port 42458 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:12.757524 kernel: audit: type=1101 audit(1768349592.748:998): pid=5899 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:12.758200 kernel: audit: type=1103 audit(1768349592.753:999): pid=5899 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:12.753000 audit[5899]: CRED_ACQ pid=5899 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:12.757973 sshd-session[5899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:12.760808 kernel: audit: type=1006 audit(1768349592.753:1000): pid=5899 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=38 res=1 Jan 14 00:13:12.753000 audit[5899]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3d06940 a2=3 a3=0 items=0 ppid=1 pid=5899 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:12.764572 kernel: audit: type=1300 audit(1768349592.753:1000): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3d06940 a2=3 a3=0 items=0 ppid=1 pid=5899 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=38 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:12.753000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:12.766650 kernel: audit: type=1327 audit(1768349592.753:1000): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:12.766950 systemd-logind[1545]: New session 38 of user core. Jan 14 00:13:12.774784 systemd[1]: Started session-38.scope - Session 38 of User core. 
Jan 14 00:13:12.779000 audit[5899]: USER_START pid=5899 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:12.783000 audit[5903]: CRED_ACQ pid=5903 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:12.786872 kernel: audit: type=1105 audit(1768349592.779:1001): pid=5899 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:12.786947 kernel: audit: type=1103 audit(1768349592.783:1002): pid=5903 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:12.800033 containerd[1590]: time="2026-01-14T00:13:12.799937472Z" level=info msg="container event discarded" container=ede786f4624018a6dcfa25a7228636a76cd57c7e9702092be0e753776a732cd1 type=CONTAINER_CREATED_EVENT Jan 14 00:13:12.800033 containerd[1590]: time="2026-01-14T00:13:12.799997274Z" level=info msg="container event discarded" container=ede786f4624018a6dcfa25a7228636a76cd57c7e9702092be0e753776a732cd1 type=CONTAINER_STARTED_EVENT Jan 14 00:13:12.942913 containerd[1590]: time="2026-01-14T00:13:12.942430846Z" level=info msg="container event discarded" container=9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5 type=CONTAINER_CREATED_EVENT Jan 14 00:13:12.942913 containerd[1590]: time="2026-01-14T00:13:12.942902819Z" level=info msg="container event discarded" container=9698932823e4c3a03f863d265d0bbec9aa48bed76a8e4842ba8f01baf45e14b5 type=CONTAINER_STARTED_EVENT Jan 14 00:13:13.146580 sshd[5903]: Connection closed by 4.153.228.146 port 42458 Jan 14 00:13:13.146743 sshd-session[5899]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:13.147000 audit[5899]: USER_END pid=5899 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:13.155937 kernel: audit: type=1106 audit(1768349593.147:1003): pid=5899 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:13.156042 kernel: audit: type=1104 audit(1768349593.148:1004): pid=5899 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 
14 00:13:13.148000 audit[5899]: CRED_DISP pid=5899 uid=0 auid=500 ses=38 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:13.153953 systemd[1]: sshd@37-46.224.77.139:22-4.153.228.146:42458.service: Deactivated successfully. Jan 14 00:13:13.154000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@37-46.224.77.139:22-4.153.228.146:42458 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:13.159162 systemd[1]: session-38.scope: Deactivated successfully. Jan 14 00:13:13.162126 systemd-logind[1545]: Session 38 logged out. Waiting for processes to exit. Jan 14 00:13:13.164757 systemd-logind[1545]: Removed session 38. Jan 14 00:13:15.201049 containerd[1590]: time="2026-01-14T00:13:15.200920283Z" level=info msg="container event discarded" container=23467040d4d683f0e80e50c33d132cc1af6b30ac939ca3872475c34dadd547da type=CONTAINER_CREATED_EVENT Jan 14 00:13:15.301315 containerd[1590]: time="2026-01-14T00:13:15.301223721Z" level=info msg="container event discarded" container=23467040d4d683f0e80e50c33d132cc1af6b30ac939ca3872475c34dadd547da type=CONTAINER_STARTED_EVENT Jan 14 00:13:16.790852 containerd[1590]: time="2026-01-14T00:13:16.790749018Z" level=info msg="container event discarded" container=a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2 type=CONTAINER_CREATED_EVENT Jan 14 00:13:16.910252 containerd[1590]: time="2026-01-14T00:13:16.910151302Z" level=info msg="container event discarded" container=a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2 type=CONTAINER_STARTED_EVENT Jan 14 00:13:17.075749 containerd[1590]: time="2026-01-14T00:13:17.075587663Z" level=info msg="container event discarded" container=a52f12cb57fefb8245a27a02ac12248032fd29b26a2c1d1d873258bf6c7818c2 type=CONTAINER_STOPPED_EVENT Jan 14 00:13:18.252895 systemd[1]: Started sshd@38-46.224.77.139:22-4.153.228.146:36716.service - OpenSSH per-connection server daemon (4.153.228.146:36716). Jan 14 00:13:18.255966 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:13:18.256054 kernel: audit: type=1130 audit(1768349598.252:1006): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-46.224.77.139:22-4.153.228.146:36716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:18.252000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-46.224.77.139:22-4.153.228.146:36716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:13:18.339474 kubelet[2832]: E0114 00:13:18.339403 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:13:18.788000 audit[5930]: USER_ACCT pid=5930 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:18.792114 sshd[5930]: Accepted publickey for core from 4.153.228.146 port 36716 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:18.796561 kernel: audit: type=1101 audit(1768349598.788:1007): pid=5930 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:18.796702 kernel: audit: type=1103 audit(1768349598.793:1008): pid=5930 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:18.793000 audit[5930]: CRED_ACQ pid=5930 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:18.795424 sshd-session[5930]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:18.798014 kernel: audit: type=1006 audit(1768349598.793:1009): pid=5930 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=39 res=1 Jan 14 00:13:18.799256 kernel: audit: type=1300 audit(1768349598.793:1009): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff157f760 a2=3 a3=0 items=0 ppid=1 pid=5930 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:18.793000 audit[5930]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff157f760 a2=3 a3=0 items=0 ppid=1 pid=5930 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=39 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:18.793000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:18.802587 kernel: audit: type=1327 
audit(1768349598.793:1009): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:18.806585 systemd-logind[1545]: New session 39 of user core. Jan 14 00:13:18.812949 systemd[1]: Started session-39.scope - Session 39 of User core. Jan 14 00:13:18.815000 audit[5930]: USER_START pid=5930 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:18.819000 audit[5934]: CRED_ACQ pid=5934 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:18.822751 kernel: audit: type=1105 audit(1768349598.815:1010): pid=5930 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:18.822840 kernel: audit: type=1103 audit(1768349598.819:1011): pid=5934 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:19.179948 sshd[5934]: Connection closed by 4.153.228.146 port 36716 Jan 14 00:13:19.182828 sshd-session[5930]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:19.184000 audit[5930]: USER_END pid=5930 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:19.190620 systemd[1]: sshd@38-46.224.77.139:22-4.153.228.146:36716.service: Deactivated successfully. Jan 14 00:13:19.184000 audit[5930]: CRED_DISP pid=5930 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:19.192657 kernel: audit: type=1106 audit(1768349599.184:1012): pid=5930 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:19.192744 kernel: audit: type=1104 audit(1768349599.184:1013): pid=5930 uid=0 auid=500 ses=39 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:19.194254 systemd[1]: session-39.scope: Deactivated successfully. 
Jan 14 00:13:19.189000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@38-46.224.77.139:22-4.153.228.146:36716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:19.197064 systemd-logind[1545]: Session 39 logged out. Waiting for processes to exit. Jan 14 00:13:19.198412 systemd-logind[1545]: Removed session 39. Jan 14 00:13:20.336391 kubelet[2832]: E0114 00:13:20.336241 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:13:20.927393 containerd[1590]: time="2026-01-14T00:13:20.927305726Z" level=info msg="container event discarded" container=326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b type=CONTAINER_CREATED_EVENT Jan 14 00:13:21.046505 containerd[1590]: time="2026-01-14T00:13:21.046385419Z" level=info msg="container event discarded" container=326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b type=CONTAINER_STARTED_EVENT Jan 14 00:13:21.334641 kubelet[2832]: E0114 00:13:21.333711 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:13:21.846402 containerd[1590]: time="2026-01-14T00:13:21.846323571Z" level=info msg="container event discarded" container=326d37dfe70a1072858b9b27dd55cd977dbe8daa2c76fcce5871873dc1dcf77b type=CONTAINER_STOPPED_EVENT Jan 14 00:13:22.337751 kubelet[2832]: E0114 00:13:22.337678 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:13:23.333853 kubelet[2832]: E0114 00:13:23.333499 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:13:24.296704 systemd[1]: Started sshd@39-46.224.77.139:22-4.153.228.146:36720.service - OpenSSH per-connection server daemon (4.153.228.146:36720). Jan 14 00:13:24.298800 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:13:24.298862 kernel: audit: type=1130 audit(1768349604.296:1015): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-46.224.77.139:22-4.153.228.146:36720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:24.296000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-46.224.77.139:22-4.153.228.146:36720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:24.868000 audit[5955]: USER_ACCT pid=5955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:24.870343 sshd[5955]: Accepted publickey for core from 4.153.228.146 port 36720 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:24.872677 kernel: audit: type=1101 audit(1768349604.868:1016): pid=5955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:24.873000 audit[5955]: CRED_ACQ pid=5955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:24.877170 kernel: audit: type=1103 audit(1768349604.873:1017): pid=5955 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:24.877263 kernel: audit: type=1006 audit(1768349604.873:1018): pid=5955 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=40 res=1 Jan 14 00:13:24.876062 sshd-session[5955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:24.873000 audit[5955]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd4817750 a2=3 a3=0 items=0 ppid=1 pid=5955 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:24.880044 kernel: audit: type=1300 audit(1768349604.873:1018): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd4817750 a2=3 a3=0 items=0 ppid=1 pid=5955 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=40 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:24.873000 audit: PROCTITLE 
proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:24.880996 kernel: audit: type=1327 audit(1768349604.873:1018): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:24.881238 systemd-logind[1545]: New session 40 of user core. Jan 14 00:13:24.890214 systemd[1]: Started session-40.scope - Session 40 of User core. Jan 14 00:13:24.894000 audit[5955]: USER_START pid=5955 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:24.897000 audit[5959]: CRED_ACQ pid=5959 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:24.900884 kernel: audit: type=1105 audit(1768349604.894:1019): pid=5955 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:24.901029 kernel: audit: type=1103 audit(1768349604.897:1020): pid=5959 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:25.267326 sshd[5959]: Connection closed by 4.153.228.146 port 36720 Jan 14 00:13:25.266726 sshd-session[5955]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:25.269000 audit[5955]: USER_END pid=5955 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:25.272000 audit[5955]: CRED_DISP pid=5955 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:25.274892 kernel: audit: type=1106 audit(1768349605.269:1021): pid=5955 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:25.274949 kernel: audit: type=1104 audit(1768349605.272:1022): pid=5955 uid=0 auid=500 ses=40 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:25.276000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@39-46.224.77.139:22-4.153.228.146:36720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:13:25.277128 systemd[1]: sshd@39-46.224.77.139:22-4.153.228.146:36720.service: Deactivated successfully. Jan 14 00:13:25.280821 systemd[1]: session-40.scope: Deactivated successfully. Jan 14 00:13:25.284685 systemd-logind[1545]: Session 40 logged out. Waiting for processes to exit. Jan 14 00:13:25.287562 systemd-logind[1545]: Removed session 40. Jan 14 00:13:26.334064 kubelet[2832]: E0114 00:13:26.334006 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:13:30.248256 containerd[1590]: time="2026-01-14T00:13:30.248180807Z" level=info msg="container event discarded" container=bd67ebed549388e0f56e5a453e43e855e29a3416e724d84afe08846d49715c39 type=CONTAINER_CREATED_EVENT Jan 14 00:13:30.379740 systemd[1]: Started sshd@40-46.224.77.139:22-4.153.228.146:39008.service - OpenSSH per-connection server daemon (4.153.228.146:39008). Jan 14 00:13:30.384847 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:13:30.384895 kernel: audit: type=1130 audit(1768349610.379:1024): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-46.224.77.139:22-4.153.228.146:39008 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:30.379000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-46.224.77.139:22-4.153.228.146:39008 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:13:30.398588 containerd[1590]: time="2026-01-14T00:13:30.397481648Z" level=info msg="container event discarded" container=bd67ebed549388e0f56e5a453e43e855e29a3416e724d84afe08846d49715c39 type=CONTAINER_STARTED_EVENT Jan 14 00:13:30.945000 audit[5972]: USER_ACCT pid=5972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:30.946095 sshd[5972]: Accepted publickey for core from 4.153.228.146 port 39008 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:30.951152 kernel: audit: type=1101 audit(1768349610.945:1025): pid=5972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:30.951251 kernel: audit: type=1103 audit(1768349610.948:1026): pid=5972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:30.948000 audit[5972]: CRED_ACQ pid=5972 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:30.950064 sshd-session[5972]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:30.952929 kernel: audit: type=1006 audit(1768349610.948:1027): pid=5972 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=41 res=1 Jan 14 00:13:30.948000 audit[5972]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc4e1f010 a2=3 a3=0 items=0 ppid=1 pid=5972 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:30.948000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:30.956606 kernel: audit: type=1300 audit(1768349610.948:1027): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc4e1f010 a2=3 a3=0 items=0 ppid=1 pid=5972 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=41 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:30.958556 kernel: audit: type=1327 audit(1768349610.948:1027): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:30.962248 systemd-logind[1545]: New session 41 of user core. Jan 14 00:13:30.967735 systemd[1]: Started session-41.scope - Session 41 of User core. 
Jan 14 00:13:30.972000 audit[5972]: USER_START pid=5972 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:30.974000 audit[5976]: CRED_ACQ pid=5976 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:30.979617 kernel: audit: type=1105 audit(1768349610.972:1028): pid=5972 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:30.979683 kernel: audit: type=1103 audit(1768349610.974:1029): pid=5976 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:31.335164 kubelet[2832]: E0114 00:13:31.334576 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:13:31.357225 sshd[5976]: Connection closed by 4.153.228.146 port 39008 Jan 14 00:13:31.358701 sshd-session[5972]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:31.360000 audit[5972]: USER_END pid=5972 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:31.367780 systemd-logind[1545]: Session 41 logged out. Waiting for processes to exit. 
Jan 14 00:13:31.360000 audit[5972]: CRED_DISP pid=5972 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:31.370417 kernel: audit: type=1106 audit(1768349611.360:1030): pid=5972 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:31.370581 kernel: audit: type=1104 audit(1768349611.360:1031): pid=5972 uid=0 auid=500 ses=41 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:31.369157 systemd[1]: sshd@40-46.224.77.139:22-4.153.228.146:39008.service: Deactivated successfully. Jan 14 00:13:31.368000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@40-46.224.77.139:22-4.153.228.146:39008 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:31.376601 systemd[1]: session-41.scope: Deactivated successfully. Jan 14 00:13:31.379315 systemd-logind[1545]: Removed session 41. Jan 14 00:13:32.335580 kubelet[2832]: E0114 00:13:32.334495 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:13:32.406730 containerd[1590]: time="2026-01-14T00:13:32.406642763Z" level=info msg="container event discarded" container=fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba type=CONTAINER_CREATED_EVENT Jan 14 00:13:32.406730 containerd[1590]: time="2026-01-14T00:13:32.406697885Z" level=info msg="container event discarded" container=fbd73fb1c45621429ecb264bbe0783604add650cc2f4d2db22f42f6964abc8ba type=CONTAINER_STARTED_EVENT Jan 14 00:13:33.335906 kubelet[2832]: E0114 00:13:33.335752 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:13:33.865436 containerd[1590]: 
time="2026-01-14T00:13:33.864837759Z" level=info msg="container event discarded" container=53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9 type=CONTAINER_CREATED_EVENT Jan 14 00:13:33.865959 containerd[1590]: time="2026-01-14T00:13:33.865903073Z" level=info msg="container event discarded" container=53475e5e0358cdfe9fb242581afafd6f92c2513c3afb63c351b9981a744e6ac9 type=CONTAINER_STARTED_EVENT Jan 14 00:13:33.888206 containerd[1590]: time="2026-01-14T00:13:33.888127350Z" level=info msg="container event discarded" container=5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3 type=CONTAINER_CREATED_EVENT Jan 14 00:13:33.888384 containerd[1590]: time="2026-01-14T00:13:33.888347957Z" level=info msg="container event discarded" container=5a5196318175fb341001e1c4b2893ec31b900133fa1a81dbf5f7e0035301afb3 type=CONTAINER_STARTED_EVENT Jan 14 00:13:33.899628 containerd[1590]: time="2026-01-14T00:13:33.899557399Z" level=info msg="container event discarded" container=9f723df0987aae60bee80b4ae19a7681f7f058b4b499acfbce9a4883ad198319 type=CONTAINER_CREATED_EVENT Jan 14 00:13:33.968398 containerd[1590]: time="2026-01-14T00:13:33.968330297Z" level=info msg="container event discarded" container=9f723df0987aae60bee80b4ae19a7681f7f058b4b499acfbce9a4883ad198319 type=CONTAINER_STARTED_EVENT Jan 14 00:13:34.334731 kubelet[2832]: E0114 00:13:34.334202 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:13:35.073869 containerd[1590]: time="2026-01-14T00:13:35.073762316Z" level=info msg="container event discarded" container=aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5 type=CONTAINER_CREATED_EVENT Jan 14 00:13:35.073869 containerd[1590]: time="2026-01-14T00:13:35.073834438Z" level=info msg="container event discarded" container=aab4a063b1f879f60b6e8108ea838148dedae09421b1ed3b3d3752dad36052f5 type=CONTAINER_STARTED_EVENT Jan 14 00:13:35.226346 containerd[1590]: time="2026-01-14T00:13:35.226264059Z" level=info msg="container event discarded" container=0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d type=CONTAINER_CREATED_EVENT Jan 14 00:13:35.226346 containerd[1590]: time="2026-01-14T00:13:35.226315820Z" level=info msg="container event discarded" container=0d8f70bf6c2a3af67a72b2ee74adb89f318b7cb4ffe813acfa544ab4434e9c6d type=CONTAINER_STARTED_EVENT Jan 14 00:13:35.267639 containerd[1590]: time="2026-01-14T00:13:35.267561208Z" level=info msg="container event discarded" container=c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905 type=CONTAINER_CREATED_EVENT Jan 14 00:13:35.267639 containerd[1590]: time="2026-01-14T00:13:35.267615330Z" level=info msg="container event discarded" container=c5dddcc3f7cb512ef1ae6c76f3275ea9b2240776b892f235884344172a08c905 type=CONTAINER_STARTED_EVENT Jan 14 00:13:35.416622 containerd[1590]: time="2026-01-14T00:13:35.416489954Z" level=info msg="container event discarded" container=b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf type=CONTAINER_CREATED_EVENT Jan 14 00:13:35.416622 containerd[1590]: 
time="2026-01-14T00:13:35.416609438Z" level=info msg="container event discarded" container=b6553d4a7ac9c4c9c8c2df44f23eb201c89943eecaf9a9a4281bd210bbb7cccf type=CONTAINER_STARTED_EVENT Jan 14 00:13:35.794332 containerd[1590]: time="2026-01-14T00:13:35.794152093Z" level=info msg="container event discarded" container=779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe type=CONTAINER_CREATED_EVENT Jan 14 00:13:35.794332 containerd[1590]: time="2026-01-14T00:13:35.794206775Z" level=info msg="container event discarded" container=779e70a645316b0b011ac939f6c22e7eaa5634a8f70464f0be5922e5af5a67fe type=CONTAINER_STARTED_EVENT Jan 14 00:13:35.825633 containerd[1590]: time="2026-01-14T00:13:35.825500717Z" level=info msg="container event discarded" container=b59d546f2c0d817fe3aa9c120261662c60ee0beed232120aecb63cea4e1b7422 type=CONTAINER_CREATED_EVENT Jan 14 00:13:35.903879 containerd[1590]: time="2026-01-14T00:13:35.903804115Z" level=info msg="container event discarded" container=b59d546f2c0d817fe3aa9c120261662c60ee0beed232120aecb63cea4e1b7422 type=CONTAINER_STARTED_EVENT Jan 14 00:13:36.468793 systemd[1]: Started sshd@41-46.224.77.139:22-4.153.228.146:39164.service - OpenSSH per-connection server daemon (4.153.228.146:39164). Jan 14 00:13:36.471345 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:13:36.471407 kernel: audit: type=1130 audit(1768349616.468:1033): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-46.224.77.139:22-4.153.228.146:39164 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:36.468000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-46.224.77.139:22-4.153.228.146:39164 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:13:37.010000 audit[6012]: USER_ACCT pid=6012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.012075 sshd[6012]: Accepted publickey for core from 4.153.228.146 port 39164 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:37.015583 kernel: audit: type=1101 audit(1768349617.010:1034): pid=6012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.014000 audit[6012]: CRED_ACQ pid=6012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.018168 sshd-session[6012]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:37.019580 kernel: audit: type=1103 audit(1768349617.014:1035): pid=6012 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.019663 kernel: audit: type=1006 audit(1768349617.014:1036): pid=6012 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=42 res=1 Jan 14 00:13:37.014000 audit[6012]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffee5bbff0 a2=3 a3=0 items=0 ppid=1 pid=6012 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:37.022091 kernel: audit: type=1300 audit(1768349617.014:1036): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffee5bbff0 a2=3 a3=0 items=0 ppid=1 pid=6012 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=42 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:37.022174 kernel: audit: type=1327 audit(1768349617.014:1036): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:37.014000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:37.030368 systemd-logind[1545]: New session 42 of user core. Jan 14 00:13:37.038802 systemd[1]: Started session-42.scope - Session 42 of User core. 
Jan 14 00:13:37.043000 audit[6012]: USER_START pid=6012 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.046000 audit[6016]: CRED_ACQ pid=6016 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.049479 kernel: audit: type=1105 audit(1768349617.043:1037): pid=6012 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.049800 kernel: audit: type=1103 audit(1768349617.046:1038): pid=6016 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.335133 kubelet[2832]: E0114 00:13:37.334125 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:13:37.411207 sshd[6016]: Connection closed by 4.153.228.146 port 39164 Jan 14 00:13:37.412062 sshd-session[6012]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:37.413000 audit[6012]: USER_END pid=6012 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.419244 systemd[1]: sshd@41-46.224.77.139:22-4.153.228.146:39164.service: Deactivated successfully. Jan 14 00:13:37.419535 kernel: audit: type=1106 audit(1768349617.413:1039): pid=6012 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.414000 audit[6012]: CRED_DISP pid=6012 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.419000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@41-46.224.77.139:22-4.153.228.146:39164 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Jan 14 00:13:37.423275 kernel: audit: type=1104 audit(1768349617.414:1040): pid=6012 uid=0 auid=500 ses=42 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:37.423548 systemd[1]: session-42.scope: Deactivated successfully. Jan 14 00:13:37.425363 systemd-logind[1545]: Session 42 logged out. Waiting for processes to exit. Jan 14 00:13:37.428781 systemd-logind[1545]: Removed session 42. Jan 14 00:13:41.335761 kubelet[2832]: E0114 00:13:41.335690 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:13:42.527460 systemd[1]: Started sshd@42-46.224.77.139:22-4.153.228.146:39180.service - OpenSSH per-connection server daemon (4.153.228.146:39180). Jan 14 00:13:42.532404 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:13:42.532467 kernel: audit: type=1130 audit(1768349622.526:1042): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-46.224.77.139:22-4.153.228.146:39180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:42.526000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-46.224.77.139:22-4.153.228.146:39180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:13:43.063000 audit[6028]: USER_ACCT pid=6028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.065138 sshd[6028]: Accepted publickey for core from 4.153.228.146 port 39180 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:43.068000 audit[6028]: CRED_ACQ pid=6028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.072239 kernel: audit: type=1101 audit(1768349623.063:1043): pid=6028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.072300 kernel: audit: type=1103 audit(1768349623.068:1044): pid=6028 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.070918 sshd-session[6028]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:43.073828 kernel: audit: type=1006 audit(1768349623.068:1045): pid=6028 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=43 res=1 Jan 14 00:13:43.068000 audit[6028]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffecb1b50 a2=3 a3=0 items=0 ppid=1 pid=6028 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:43.076885 kernel: audit: type=1300 audit(1768349623.068:1045): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffecb1b50 a2=3 a3=0 items=0 ppid=1 pid=6028 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=43 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:43.068000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:43.078595 kernel: audit: type=1327 audit(1768349623.068:1045): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:43.084604 systemd-logind[1545]: New session 43 of user core. Jan 14 00:13:43.091802 systemd[1]: Started session-43.scope - Session 43 of User core. 
Jan 14 00:13:43.097000 audit[6028]: USER_START pid=6028 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.101000 audit[6032]: CRED_ACQ pid=6032 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.103782 kernel: audit: type=1105 audit(1768349623.097:1046): pid=6028 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.106569 kernel: audit: type=1103 audit(1768349623.101:1047): pid=6032 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.482323 sshd[6032]: Connection closed by 4.153.228.146 port 39180 Jan 14 00:13:43.485171 sshd-session[6028]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:43.485000 audit[6028]: USER_END pid=6028 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.492288 systemd[1]: sshd@42-46.224.77.139:22-4.153.228.146:39180.service: Deactivated successfully. Jan 14 00:13:43.485000 audit[6028]: CRED_DISP pid=6028 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.496257 systemd[1]: session-43.scope: Deactivated successfully. Jan 14 00:13:43.497217 kernel: audit: type=1106 audit(1768349623.485:1048): pid=6028 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.497302 kernel: audit: type=1104 audit(1768349623.485:1049): pid=6028 uid=0 auid=500 ses=43 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:43.491000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@42-46.224.77.139:22-4.153.228.146:39180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:43.499005 systemd-logind[1545]: Session 43 logged out. Waiting for processes to exit. Jan 14 00:13:43.500881 systemd-logind[1545]: Removed session 43. 
Jan 14 00:13:44.338694 kubelet[2832]: E0114 00:13:44.338420 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:13:45.335860 kubelet[2832]: E0114 00:13:45.335791 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:13:45.337313 kubelet[2832]: E0114 00:13:45.337253 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:13:48.596072 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:13:48.596263 kernel: audit: type=1130 audit(1768349628.592:1051): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-46.224.77.139:22-4.153.228.146:48656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:48.592000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-46.224.77.139:22-4.153.228.146:48656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:48.593701 systemd[1]: Started sshd@43-46.224.77.139:22-4.153.228.146:48656.service - OpenSSH per-connection server daemon (4.153.228.146:48656). 
Jan 14 00:13:49.142952 sshd[6046]: Accepted publickey for core from 4.153.228.146 port 48656 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:49.141000 audit[6046]: USER_ACCT pid=6046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.151161 kernel: audit: type=1101 audit(1768349629.141:1052): pid=6046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.151263 kernel: audit: type=1103 audit(1768349629.146:1053): pid=6046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.146000 audit[6046]: CRED_ACQ pid=6046 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.149121 sshd-session[6046]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:49.155283 kernel: audit: type=1006 audit(1768349629.146:1054): pid=6046 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=44 res=1 Jan 14 00:13:49.158507 kernel: audit: type=1300 audit(1768349629.146:1054): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff7189760 a2=3 a3=0 items=0 ppid=1 pid=6046 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:49.146000 audit[6046]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff7189760 a2=3 a3=0 items=0 ppid=1 pid=6046 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=44 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:49.146000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:49.161620 kernel: audit: type=1327 audit(1768349629.146:1054): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:49.164729 systemd-logind[1545]: New session 44 of user core. Jan 14 00:13:49.171832 systemd[1]: Started session-44.scope - Session 44 of User core. 
Jan 14 00:13:49.174000 audit[6046]: USER_START pid=6046 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.180538 kernel: audit: type=1105 audit(1768349629.174:1055): pid=6046 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.178000 audit[6050]: CRED_ACQ pid=6050 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.184562 kernel: audit: type=1103 audit(1768349629.178:1056): pid=6050 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.334564 kubelet[2832]: E0114 00:13:49.334496 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:13:49.524654 sshd[6050]: Connection closed by 4.153.228.146 port 48656 Jan 14 00:13:49.524920 sshd-session[6046]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:49.527000 audit[6046]: USER_END pid=6046 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.527000 audit[6046]: CRED_DISP pid=6046 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.535827 kernel: audit: type=1106 audit(1768349629.527:1057): pid=6046 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.535912 kernel: audit: type=1104 audit(1768349629.527:1058): pid=6046 uid=0 auid=500 ses=44 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:49.535753 systemd[1]: 
sshd@43-46.224.77.139:22-4.153.228.146:48656.service: Deactivated successfully. Jan 14 00:13:49.534000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@43-46.224.77.139:22-4.153.228.146:48656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:49.538597 systemd[1]: session-44.scope: Deactivated successfully. Jan 14 00:13:49.541106 systemd-logind[1545]: Session 44 logged out. Waiting for processes to exit. Jan 14 00:13:49.545258 systemd-logind[1545]: Removed session 44. Jan 14 00:13:49.634000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-46.224.77.139:22-4.153.228.146:48664 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:49.636049 systemd[1]: Started sshd@44-46.224.77.139:22-4.153.228.146:48664.service - OpenSSH per-connection server daemon (4.153.228.146:48664). Jan 14 00:13:50.212000 audit[6062]: USER_ACCT pid=6062 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:50.214214 sshd[6062]: Accepted publickey for core from 4.153.228.146 port 48664 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:50.214000 audit[6062]: CRED_ACQ pid=6062 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:50.214000 audit[6062]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd3da5080 a2=3 a3=0 items=0 ppid=1 pid=6062 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=45 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:50.214000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:50.218782 sshd-session[6062]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:50.227285 systemd-logind[1545]: New session 45 of user core. Jan 14 00:13:50.231872 systemd[1]: Started session-45.scope - Session 45 of User core. 
Jan 14 00:13:50.233000 audit[6062]: USER_START pid=6062 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:50.235000 audit[6066]: CRED_ACQ pid=6066 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:50.674454 sshd[6066]: Connection closed by 4.153.228.146 port 48664 Jan 14 00:13:50.674362 sshd-session[6062]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:50.677000 audit[6062]: USER_END pid=6062 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:50.677000 audit[6062]: CRED_DISP pid=6062 uid=0 auid=500 ses=45 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:50.684018 systemd[1]: sshd@44-46.224.77.139:22-4.153.228.146:48664.service: Deactivated successfully. Jan 14 00:13:50.682000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@44-46.224.77.139:22-4.153.228.146:48664 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:50.686184 systemd[1]: session-45.scope: Deactivated successfully. Jan 14 00:13:50.688349 systemd-logind[1545]: Session 45 logged out. Waiting for processes to exit. Jan 14 00:13:50.691500 systemd-logind[1545]: Removed session 45. Jan 14 00:13:50.787000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-46.224.77.139:22-4.153.228.146:48674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:50.788851 systemd[1]: Started sshd@45-46.224.77.139:22-4.153.228.146:48674.service - OpenSSH per-connection server daemon (4.153.228.146:48674). 
Jan 14 00:13:51.317000 audit[6077]: USER_ACCT pid=6077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:51.319207 sshd[6077]: Accepted publickey for core from 4.153.228.146 port 48674 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:51.318000 audit[6077]: CRED_ACQ pid=6077 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:51.318000 audit[6077]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3a236c0 a2=3 a3=0 items=0 ppid=1 pid=6077 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=46 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:51.318000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:51.321150 sshd-session[6077]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:51.331423 systemd-logind[1545]: New session 46 of user core. Jan 14 00:13:51.335844 systemd[1]: Started session-46.scope - Session 46 of User core. Jan 14 00:13:51.339000 audit[6077]: USER_START pid=6077 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:51.341000 audit[6081]: CRED_ACQ pid=6081 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:51.716295 sshd[6081]: Connection closed by 4.153.228.146 port 48674 Jan 14 00:13:51.716631 sshd-session[6077]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:51.719000 audit[6077]: USER_END pid=6077 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:51.719000 audit[6077]: CRED_DISP pid=6077 uid=0 auid=500 ses=46 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:51.726375 systemd[1]: sshd@45-46.224.77.139:22-4.153.228.146:48674.service: Deactivated successfully. Jan 14 00:13:51.727000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@45-46.224.77.139:22-4.153.228.146:48674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:51.732182 systemd[1]: session-46.scope: Deactivated successfully. Jan 14 00:13:51.733715 systemd-logind[1545]: Session 46 logged out. Waiting for processes to exit. 
Jan 14 00:13:51.735357 systemd-logind[1545]: Removed session 46. Jan 14 00:13:52.335836 kubelet[2832]: E0114 00:13:52.335784 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:13:56.338326 kubelet[2832]: E0114 00:13:56.337990 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:13:56.829714 systemd[1]: Started sshd@46-46.224.77.139:22-4.153.228.146:39496.service - OpenSSH per-connection server daemon (4.153.228.146:39496). Jan 14 00:13:56.830000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-46.224.77.139:22-4.153.228.146:39496 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:56.833325 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 14 00:13:56.833393 kernel: audit: type=1130 audit(1768349636.830:1078): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-46.224.77.139:22-4.153.228.146:39496 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:13:57.333998 kubelet[2832]: E0114 00:13:57.333945 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:13:57.388569 sshd[6095]: Accepted publickey for core from 4.153.228.146 port 39496 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:13:57.387000 audit[6095]: USER_ACCT pid=6095 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.393368 kernel: audit: type=1101 audit(1768349637.387:1079): pid=6095 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.393458 kernel: audit: type=1103 audit(1768349637.390:1080): pid=6095 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.393489 kernel: audit: type=1006 audit(1768349637.391:1081): pid=6095 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=47 res=1 Jan 14 00:13:57.390000 audit[6095]: CRED_ACQ pid=6095 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.392380 sshd-session[6095]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:13:57.391000 audit[6095]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff3a86700 a2=3 a3=0 items=0 ppid=1 pid=6095 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:57.397103 kernel: audit: type=1300 audit(1768349637.391:1081): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff3a86700 a2=3 a3=0 items=0 ppid=1 pid=6095 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=47 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:13:57.391000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:57.398022 kernel: audit: type=1327 audit(1768349637.391:1081): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:13:57.402071 systemd-logind[1545]: New session 47 of user core. Jan 14 00:13:57.409069 systemd[1]: Started session-47.scope - Session 47 of User core. 
Jan 14 00:13:57.413000 audit[6095]: USER_START pid=6095 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.416000 audit[6099]: CRED_ACQ pid=6099 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.419130 kernel: audit: type=1105 audit(1768349637.413:1082): pid=6095 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.419220 kernel: audit: type=1103 audit(1768349637.416:1083): pid=6099 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.764080 sshd[6099]: Connection closed by 4.153.228.146 port 39496 Jan 14 00:13:57.763234 sshd-session[6095]: pam_unix(sshd:session): session closed for user core Jan 14 00:13:57.764000 audit[6095]: USER_END pid=6095 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.767000 audit[6095]: CRED_DISP pid=6095 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.772502 kernel: audit: type=1106 audit(1768349637.764:1084): pid=6095 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.772667 kernel: audit: type=1104 audit(1768349637.767:1085): pid=6095 uid=0 auid=500 ses=47 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:13:57.773231 systemd[1]: sshd@46-46.224.77.139:22-4.153.228.146:39496.service: Deactivated successfully. Jan 14 00:13:57.773000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@46-46.224.77.139:22-4.153.228.146:39496 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:13:57.776595 systemd[1]: session-47.scope: Deactivated successfully. Jan 14 00:13:57.778749 systemd-logind[1545]: Session 47 logged out. Waiting for processes to exit. Jan 14 00:13:57.780452 systemd-logind[1545]: Removed session 47. 
Jan 14 00:13:58.334594 kubelet[2832]: E0114 00:13:58.334476 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:13:59.335217 kubelet[2832]: E0114 00:13:59.334936 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:14:02.878686 systemd[1]: Started sshd@47-46.224.77.139:22-4.153.228.146:39502.service - OpenSSH per-connection server daemon (4.153.228.146:39502). Jan 14 00:14:02.878000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-46.224.77.139:22-4.153.228.146:39502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:02.883910 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:02.884004 kernel: audit: type=1130 audit(1768349642.878:1087): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-46.224.77.139:22-4.153.228.146:39502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:14:03.333755 kubelet[2832]: E0114 00:14:03.333619 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:14:03.420000 audit[6141]: USER_ACCT pid=6141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.424421 sshd[6141]: Accepted publickey for core from 4.153.228.146 port 39502 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:03.425664 sshd-session[6141]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:03.424000 audit[6141]: CRED_ACQ pid=6141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.427751 kernel: audit: type=1101 audit(1768349643.420:1088): pid=6141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.427833 kernel: audit: type=1103 audit(1768349643.424:1089): pid=6141 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.429226 kernel: audit: type=1006 audit(1768349643.424:1090): pid=6141 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=48 res=1 Jan 14 00:14:03.429324 kernel: audit: type=1300 audit(1768349643.424:1090): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffec110a00 a2=3 a3=0 items=0 ppid=1 pid=6141 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:03.424000 audit[6141]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffec110a00 a2=3 a3=0 items=0 ppid=1 pid=6141 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=48 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:03.424000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:03.435198 kernel: audit: type=1327 audit(1768349643.424:1090): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:03.439847 systemd-logind[1545]: New session 48 of user core. Jan 14 00:14:03.447756 systemd[1]: Started session-48.scope - Session 48 of User core. 
Jan 14 00:14:03.453000 audit[6141]: USER_START pid=6141 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.458000 audit[6145]: CRED_ACQ pid=6145 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.461204 kernel: audit: type=1105 audit(1768349643.453:1091): pid=6141 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.461330 kernel: audit: type=1103 audit(1768349643.458:1092): pid=6145 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.812037 sshd[6145]: Connection closed by 4.153.228.146 port 39502 Jan 14 00:14:03.812428 sshd-session[6141]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:03.816000 audit[6141]: USER_END pid=6141 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.816000 audit[6141]: CRED_DISP pid=6141 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.820929 kernel: audit: type=1106 audit(1768349643.816:1093): pid=6141 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.821002 kernel: audit: type=1104 audit(1768349643.816:1094): pid=6141 uid=0 auid=500 ses=48 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:03.822823 systemd-logind[1545]: Session 48 logged out. Waiting for processes to exit. Jan 14 00:14:03.823748 systemd[1]: sshd@47-46.224.77.139:22-4.153.228.146:39502.service: Deactivated successfully. Jan 14 00:14:03.824000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@47-46.224.77.139:22-4.153.228.146:39502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:03.827989 systemd[1]: session-48.scope: Deactivated successfully. Jan 14 00:14:03.830462 systemd-logind[1545]: Removed session 48. 
Jan 14 00:14:04.334686 kubelet[2832]: E0114 00:14:04.334635 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:14:08.923609 systemd[1]: Started sshd@48-46.224.77.139:22-4.153.228.146:54670.service - OpenSSH per-connection server daemon (4.153.228.146:54670). Jan 14 00:14:08.922000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-46.224.77.139:22-4.153.228.146:54670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:08.926994 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:08.927093 kernel: audit: type=1130 audit(1768349648.922:1096): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-46.224.77.139:22-4.153.228.146:54670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:09.460000 audit[6156]: USER_ACCT pid=6156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.463641 sshd[6156]: Accepted publickey for core from 4.153.228.146 port 54670 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:09.465000 audit[6156]: CRED_ACQ pid=6156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.469734 kernel: audit: type=1101 audit(1768349649.460:1097): pid=6156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.469807 kernel: audit: type=1103 audit(1768349649.465:1098): pid=6156 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.469830 kernel: audit: type=1006 audit(1768349649.465:1099): pid=6156 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=49 res=1 Jan 14 00:14:09.469647 sshd-session[6156]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:09.465000 audit[6156]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe33f09c0 a2=3 a3=0 items=0 ppid=1 pid=6156 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:09.473683 kernel: audit: type=1300 audit(1768349649.465:1099): arch=c00000b7 
syscall=64 success=yes exit=3 a0=8 a1=ffffe33f09c0 a2=3 a3=0 items=0 ppid=1 pid=6156 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=49 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:09.465000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:09.474570 kernel: audit: type=1327 audit(1768349649.465:1099): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:09.478888 systemd-logind[1545]: New session 49 of user core. Jan 14 00:14:09.482912 systemd[1]: Started session-49.scope - Session 49 of User core. Jan 14 00:14:09.484000 audit[6156]: USER_START pid=6156 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.488947 kernel: audit: type=1105 audit(1768349649.484:1100): pid=6156 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.489019 kernel: audit: type=1103 audit(1768349649.487:1101): pid=6160 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.487000 audit[6160]: CRED_ACQ pid=6160 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.850547 sshd[6160]: Connection closed by 4.153.228.146 port 54670 Jan 14 00:14:09.851427 sshd-session[6156]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:09.851000 audit[6156]: USER_END pid=6156 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.857729 systemd[1]: sshd@48-46.224.77.139:22-4.153.228.146:54670.service: Deactivated successfully. 
Jan 14 00:14:09.852000 audit[6156]: CRED_DISP pid=6156 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.861169 kernel: audit: type=1106 audit(1768349649.851:1102): pid=6156 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.861239 kernel: audit: type=1104 audit(1768349649.852:1103): pid=6156 uid=0 auid=500 ses=49 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:09.856000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@48-46.224.77.139:22-4.153.228.146:54670 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:09.862361 systemd[1]: session-49.scope: Deactivated successfully. Jan 14 00:14:09.864084 systemd-logind[1545]: Session 49 logged out. Waiting for processes to exit. Jan 14 00:14:09.866744 systemd-logind[1545]: Removed session 49. Jan 14 00:14:10.334720 kubelet[2832]: E0114 00:14:10.334112 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:14:10.336199 containerd[1590]: time="2026-01-14T00:14:10.335317627Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 00:14:10.682734 containerd[1590]: time="2026-01-14T00:14:10.682688313Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:14:10.684435 containerd[1590]: time="2026-01-14T00:14:10.684301215Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 00:14:10.684435 containerd[1590]: time="2026-01-14T00:14:10.684363017Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 00:14:10.685078 kubelet[2832]: E0114 00:14:10.685043 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:14:10.685143 kubelet[2832]: E0114 00:14:10.685089 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:14:10.685267 kubelet[2832]: E0114 00:14:10.685221 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r74wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b44cc6f4-gxl6c_calico-system(5ee70bb0-55b7-4a80-b5cb-3133091615ae): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 00:14:10.686661 kubelet[2832]: E0114 00:14:10.686615 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:14:12.338384 containerd[1590]: time="2026-01-14T00:14:12.338043672Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 00:14:12.658281 containerd[1590]: time="2026-01-14T00:14:12.658130151Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:14:12.660600 containerd[1590]: time="2026-01-14T00:14:12.660468001Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 00:14:12.660935 containerd[1590]: time="2026-01-14T00:14:12.660565565Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 00:14:12.661203 kubelet[2832]: E0114 00:14:12.661150 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:14:12.662093 kubelet[2832]: E0114 00:14:12.661219 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:14:12.662093 kubelet[2832]: E0114 00:14:12.661440 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in 
pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 00:14:12.664561 containerd[1590]: time="2026-01-14T00:14:12.664312509Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 00:14:13.002698 containerd[1590]: time="2026-01-14T00:14:13.002274394Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:14:13.004815 containerd[1590]: time="2026-01-14T00:14:13.004756170Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 00:14:13.004944 containerd[1590]: time="2026-01-14T00:14:13.004863254Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 00:14:13.005084 kubelet[2832]: E0114 00:14:13.005024 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:14:13.005084 kubelet[2832]: E0114 00:14:13.005073 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:14:13.005229 kubelet[2832]: E0114 00:14:13.005189 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 00:14:13.006736 kubelet[2832]: E0114 00:14:13.006673 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:14:13.335096 containerd[1590]: time="2026-01-14T00:14:13.334913193Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 00:14:13.673876 containerd[1590]: time="2026-01-14T00:14:13.673598904Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:14:13.675073 containerd[1590]: time="2026-01-14T00:14:13.675001078Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 00:14:13.675777 containerd[1590]: time="2026-01-14T00:14:13.675070721Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 00:14:13.676098 kubelet[2832]: E0114 00:14:13.676023 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:14:13.676098 kubelet[2832]: E0114 00:14:13.676079 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:14:13.677228 kubelet[2832]: E0114 00:14:13.677032 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c290d008f4c4d48b25c8570357599ee,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 00:14:13.680558 containerd[1590]: time="2026-01-14T00:14:13.680491610Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 00:14:14.023322 containerd[1590]: time="2026-01-14T00:14:14.022959949Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:14:14.024363 containerd[1590]: time="2026-01-14T00:14:14.024134954Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 00:14:14.024363 containerd[1590]: time="2026-01-14T00:14:14.024250319Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 00:14:14.024544 kubelet[2832]: E0114 00:14:14.024454 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:14:14.024620 kubelet[2832]: E0114 00:14:14.024599 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:14:14.024958 kubelet[2832]: E0114 00:14:14.024857 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 00:14:14.026311 kubelet[2832]: E0114 00:14:14.026253 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: 
\"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:14:14.962000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-46.224.77.139:22-4.153.228.146:56708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:14.963953 systemd[1]: Started sshd@49-46.224.77.139:22-4.153.228.146:56708.service - OpenSSH per-connection server daemon (4.153.228.146:56708). Jan 14 00:14:14.969559 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:14.969713 kernel: audit: type=1130 audit(1768349654.962:1105): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-46.224.77.139:22-4.153.228.146:56708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:15.527000 audit[6172]: USER_ACCT pid=6172 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.529662 sshd[6172]: Accepted publickey for core from 4.153.228.146 port 56708 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:15.531000 audit[6172]: CRED_ACQ pid=6172 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.535718 kernel: audit: type=1101 audit(1768349655.527:1106): pid=6172 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.535822 kernel: audit: type=1103 audit(1768349655.531:1107): pid=6172 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.537347 sshd-session[6172]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:15.539780 kernel: audit: type=1006 audit(1768349655.534:1108): pid=6172 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=50 res=1 Jan 14 00:14:15.534000 audit[6172]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcb77c720 a2=3 a3=0 items=0 ppid=1 pid=6172 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:15.543068 kernel: audit: type=1300 audit(1768349655.534:1108): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcb77c720 a2=3 a3=0 items=0 ppid=1 pid=6172 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=50 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:15.534000 
audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:15.544753 kernel: audit: type=1327 audit(1768349655.534:1108): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:15.549770 systemd-logind[1545]: New session 50 of user core. Jan 14 00:14:15.554784 systemd[1]: Started session-50.scope - Session 50 of User core. Jan 14 00:14:15.557000 audit[6172]: USER_START pid=6172 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.561000 audit[6176]: CRED_ACQ pid=6176 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.565086 kernel: audit: type=1105 audit(1768349655.557:1109): pid=6172 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.565182 kernel: audit: type=1103 audit(1768349655.561:1110): pid=6176 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.945611 sshd[6176]: Connection closed by 4.153.228.146 port 56708 Jan 14 00:14:15.946336 sshd-session[6172]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:15.948000 audit[6172]: USER_END pid=6172 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.952241 systemd[1]: sshd@49-46.224.77.139:22-4.153.228.146:56708.service: Deactivated successfully. Jan 14 00:14:15.948000 audit[6172]: CRED_DISP pid=6172 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.954855 kernel: audit: type=1106 audit(1768349655.948:1111): pid=6172 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.954919 kernel: audit: type=1104 audit(1768349655.948:1112): pid=6172 uid=0 auid=500 ses=50 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:15.955192 systemd[1]: session-50.scope: Deactivated successfully. 
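The 404 responses recorded above ("fetch failed after status: 404 Not Found" host=ghcr.io) can be reproduced outside containerd by asking the registry for the tag's manifest directly. A minimal sketch, assuming ghcr.io's standard Docker Registry v2 anonymous token flow — the token endpoint, scope format and Accept headers below are the commonly used conventions, not values taken from this log; the repository and tag are copied from the failed pulls above.

// manifestcheck.go - sketch: reproduce the 404 that containerd logs for
// ghcr.io/flatcar/calico/whisker:v3.30.4, assuming ghcr.io's standard
// Docker Registry v2 anonymous token flow (endpoint/scope are assumptions).
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
)

func main() {
	repo, tag := "flatcar/calico/whisker", "v3.30.4" // taken from the log above

	// Step 1: request an anonymous pull token for the repository.
	tokURL := fmt.Sprintf("https://ghcr.io/token?service=ghcr.io&scope=repository:%s:pull", repo)
	resp, err := http.Get(tokURL)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	var tok struct {
		Token string `json:"token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
		panic(err)
	}

	// Step 2: HEAD the tag's manifest; a 404 here matches the
	// "failed to resolve image ... not found" errors in the log.
	req, _ := http.NewRequest("HEAD", fmt.Sprintf("https://ghcr.io/v2/%s/manifests/%s", repo, tag), nil)
	req.Header.Set("Authorization", "Bearer "+tok.Token)
	req.Header.Set("Accept", "application/vnd.oci.image.index.v1+json, application/vnd.docker.distribution.manifest.list.v2+json")
	mresp, err := http.DefaultClient.Do(req)
	if err != nil {
		panic(err)
	}
	mresp.Body.Close()
	fmt.Printf("%s:%s -> HTTP %d\n", repo, tag, mresp.StatusCode) // expect 404 per the log
}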
Jan 14 00:14:15.950000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@49-46.224.77.139:22-4.153.228.146:56708 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:15.961158 systemd-logind[1545]: Session 50 logged out. Waiting for processes to exit. Jan 14 00:14:15.963002 systemd-logind[1545]: Removed session 50. Jan 14 00:14:16.336548 containerd[1590]: time="2026-01-14T00:14:16.334999025Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:14:16.684113 containerd[1590]: time="2026-01-14T00:14:16.683780439Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:14:16.685300 containerd[1590]: time="2026-01-14T00:14:16.685224895Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:14:16.685405 containerd[1590]: time="2026-01-14T00:14:16.685322659Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:14:16.685498 kubelet[2832]: E0114 00:14:16.685462 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:14:16.685498 kubelet[2832]: E0114 00:14:16.685507 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:14:16.686004 kubelet[2832]: E0114 00:14:16.685652 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cr65d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-7bt2c_calico-apiserver(3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:14:16.687107 kubelet[2832]: E0114 00:14:16.687049 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:14:19.335537 containerd[1590]: time="2026-01-14T00:14:19.335462232Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:14:19.677224 containerd[1590]: time="2026-01-14T00:14:19.677067874Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:14:19.678758 containerd[1590]: time="2026-01-14T00:14:19.678687217Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:14:19.678891 containerd[1590]: time="2026-01-14T00:14:19.678755140Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:14:19.679199 kubelet[2832]: E0114 00:14:19.679153 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:14:19.679489 kubelet[2832]: E0114 00:14:19.679225 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:14:19.679895 kubelet[2832]: E0114 00:14:19.679756 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md5mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-vdxqm_calico-apiserver(1e4bec8e-a684-46cb-852e-ae05ed7b56d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:14:19.681215 kubelet[2832]: E0114 00:14:19.681148 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:14:21.067203 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:21.067312 kernel: audit: type=1130 audit(1768349661.063:1114): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-46.224.77.139:22-4.153.228.146:56716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:21.063000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-46.224.77.139:22-4.153.228.146:56716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:21.063420 systemd[1]: Started sshd@50-46.224.77.139:22-4.153.228.146:56716.service - OpenSSH per-connection server daemon (4.153.228.146:56716). 
Jan 14 00:14:21.620000 audit[6188]: USER_ACCT pid=6188 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:21.623984 sshd[6188]: Accepted publickey for core from 4.153.228.146 port 56716 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:21.626609 kernel: audit: type=1101 audit(1768349661.620:1115): pid=6188 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:21.626689 kernel: audit: type=1103 audit(1768349661.623:1116): pid=6188 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:21.623000 audit[6188]: CRED_ACQ pid=6188 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:21.625000 sshd-session[6188]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:21.629286 kernel: audit: type=1006 audit(1768349661.623:1117): pid=6188 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=51 res=1 Jan 14 00:14:21.629352 kernel: audit: type=1300 audit(1768349661.623:1117): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe702cf70 a2=3 a3=0 items=0 ppid=1 pid=6188 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:21.623000 audit[6188]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe702cf70 a2=3 a3=0 items=0 ppid=1 pid=6188 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=51 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:21.623000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:21.634705 kernel: audit: type=1327 audit(1768349661.623:1117): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:21.639596 systemd-logind[1545]: New session 51 of user core. Jan 14 00:14:21.645722 systemd[1]: Started session-51.scope - Session 51 of User core. 
Jan 14 00:14:21.649000 audit[6188]: USER_START pid=6188 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:21.653000 audit[6192]: CRED_ACQ pid=6192 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:21.655529 kernel: audit: type=1105 audit(1768349661.649:1118): pid=6188 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:21.655695 kernel: audit: type=1103 audit(1768349661.653:1119): pid=6192 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:22.017613 sshd[6192]: Connection closed by 4.153.228.146 port 56716 Jan 14 00:14:22.018758 sshd-session[6188]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:22.020000 audit[6188]: USER_END pid=6188 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:22.020000 audit[6188]: CRED_DISP pid=6188 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:22.026389 kernel: audit: type=1106 audit(1768349662.020:1120): pid=6188 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:22.026478 kernel: audit: type=1104 audit(1768349662.020:1121): pid=6188 uid=0 auid=500 ses=51 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:22.028366 systemd[1]: sshd@50-46.224.77.139:22-4.153.228.146:56716.service: Deactivated successfully. Jan 14 00:14:22.028000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@50-46.224.77.139:22-4.153.228.146:56716 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:22.031113 systemd[1]: session-51.scope: Deactivated successfully. Jan 14 00:14:22.034884 systemd-logind[1545]: Session 51 logged out. Waiting for processes to exit. Jan 14 00:14:22.039996 systemd-logind[1545]: Removed session 51. 
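The same resolution failure can also be retried through containerd itself, bypassing the kubelet. A sketch assuming the containerd 1.x Go client module path, the default socket /run/containerd/containerd.sock and the CRI's usual "k8s.io" namespace (none of these values appear in this log); it is expected to fail with the same "failed to resolve image ... not found" error that containerd[1590] reports above.

// pullcheck.go - sketch, assuming the containerd 1.x Go client, the default
// containerd socket and the "k8s.io" namespace used by the CRI plugin.
package main

import (
	"context"
	"fmt"

	"github.com/containerd/containerd"
	"github.com/containerd/containerd/namespaces"
)

func main() {
	client, err := containerd.New("/run/containerd/containerd.sock")
	if err != nil {
		panic(err)
	}
	defer client.Close()

	ctx := namespaces.WithNamespace(context.Background(), "k8s.io")
	ref := "ghcr.io/flatcar/calico/goldmane:v3.30.4" // reference copied from the log

	if _, err := client.Pull(ctx, ref, containerd.WithPullUnpack); err != nil {
		fmt.Println("pull failed:", err) // expected: resolution "not found"
	}
}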
Jan 14 00:14:22.335666 containerd[1590]: time="2026-01-14T00:14:22.334995800Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 00:14:22.657781 containerd[1590]: time="2026-01-14T00:14:22.657266261Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:14:22.659070 containerd[1590]: time="2026-01-14T00:14:22.658997969Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 00:14:22.659281 containerd[1590]: time="2026-01-14T00:14:22.659044651Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 00:14:22.659654 kubelet[2832]: E0114 00:14:22.659602 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:14:22.660051 kubelet[2832]: E0114 00:14:22.659663 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:14:22.660051 kubelet[2832]: E0114 00:14:22.659792 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p28dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-hrn72_calico-system(1e53bd66-4746-482e-bb2b-bfd29a1ef20e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 00:14:22.661065 kubelet[2832]: E0114 00:14:22.661010 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:14:26.338440 kubelet[2832]: E0114 00:14:26.338182 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:14:27.139805 systemd[1]: Started sshd@51-46.224.77.139:22-4.153.228.146:42652.service - OpenSSH per-connection server daemon (4.153.228.146:42652). Jan 14 00:14:27.141629 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:27.141711 kernel: audit: type=1130 audit(1768349667.139:1123): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-46.224.77.139:22-4.153.228.146:42652 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:27.139000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-46.224.77.139:22-4.153.228.146:42652 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:14:27.336340 kubelet[2832]: E0114 00:14:27.336285 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:14:27.706000 audit[6206]: USER_ACCT pid=6206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:27.709715 sshd[6206]: Accepted publickey for core from 4.153.228.146 port 42652 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:27.710590 kernel: audit: type=1101 audit(1768349667.706:1124): pid=6206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:27.710000 audit[6206]: CRED_ACQ pid=6206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:27.713941 sshd-session[6206]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:27.718175 kernel: audit: type=1103 audit(1768349667.710:1125): pid=6206 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:27.718269 kernel: audit: type=1006 audit(1768349667.712:1126): pid=6206 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=52 res=1 Jan 14 00:14:27.718291 kernel: audit: type=1300 audit(1768349667.712:1126): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc09b4370 a2=3 a3=0 items=0 ppid=1 pid=6206 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:27.712000 audit[6206]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc09b4370 a2=3 a3=0 items=0 ppid=1 pid=6206 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=52 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:27.712000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:27.721016 kernel: audit: type=1327 
audit(1768349667.712:1126): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:27.726882 systemd-logind[1545]: New session 52 of user core. Jan 14 00:14:27.731779 systemd[1]: Started session-52.scope - Session 52 of User core. Jan 14 00:14:27.735000 audit[6206]: USER_START pid=6206 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:27.739621 kernel: audit: type=1105 audit(1768349667.735:1127): pid=6206 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:27.740000 audit[6210]: CRED_ACQ pid=6210 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:27.743567 kernel: audit: type=1103 audit(1768349667.740:1128): pid=6210 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:28.135807 sshd[6210]: Connection closed by 4.153.228.146 port 42652 Jan 14 00:14:28.136458 sshd-session[6206]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:28.137000 audit[6206]: USER_END pid=6206 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:28.137000 audit[6206]: CRED_DISP pid=6206 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:28.145506 kernel: audit: type=1106 audit(1768349668.137:1129): pid=6206 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:28.145664 kernel: audit: type=1104 audit(1768349668.137:1130): pid=6206 uid=0 auid=500 ses=52 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:28.146373 systemd[1]: sshd@51-46.224.77.139:22-4.153.228.146:42652.service: Deactivated successfully. Jan 14 00:14:28.146000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@51-46.224.77.139:22-4.153.228.146:42652 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:14:28.149916 systemd[1]: session-52.scope: Deactivated successfully. Jan 14 00:14:28.152626 systemd-logind[1545]: Session 52 logged out. Waiting for processes to exit. Jan 14 00:14:28.156751 systemd-logind[1545]: Removed session 52. Jan 14 00:14:28.334273 kubelet[2832]: E0114 00:14:28.334213 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:14:30.335584 kubelet[2832]: E0114 00:14:30.335309 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:14:33.242648 systemd[1]: Started sshd@52-46.224.77.139:22-4.153.228.146:42656.service - OpenSSH per-connection server daemon (4.153.228.146:42656). Jan 14 00:14:33.241000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-46.224.77.139:22-4.153.228.146:42656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:33.245307 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:33.245427 kernel: audit: type=1130 audit(1768349673.241:1132): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-46.224.77.139:22-4.153.228.146:42656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:14:33.334925 kubelet[2832]: E0114 00:14:33.334631 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:14:33.776000 audit[6248]: USER_ACCT pid=6248 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:33.778633 sshd[6248]: Accepted publickey for core from 4.153.228.146 port 42656 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:33.782552 kernel: audit: type=1101 audit(1768349673.776:1133): pid=6248 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:33.783465 sshd-session[6248]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:33.781000 audit[6248]: CRED_ACQ pid=6248 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:33.787460 kernel: audit: type=1103 audit(1768349673.781:1134): pid=6248 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:33.787835 kernel: audit: type=1006 audit(1768349673.781:1135): pid=6248 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=53 res=1 Jan 14 00:14:33.787864 kernel: audit: type=1300 audit(1768349673.781:1135): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcd0de3a0 a2=3 a3=0 items=0 ppid=1 pid=6248 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:33.781000 audit[6248]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcd0de3a0 a2=3 a3=0 items=0 ppid=1 pid=6248 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=53 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:33.781000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:33.790572 kernel: audit: type=1327 audit(1768349673.781:1135): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:33.795559 systemd-logind[1545]: New session 53 of user core. Jan 14 00:14:33.798732 systemd[1]: Started session-53.scope - Session 53 of User core. 
Jan 14 00:14:33.801000 audit[6248]: USER_START pid=6248 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:33.805000 audit[6252]: CRED_ACQ pid=6252 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:33.810321 kernel: audit: type=1105 audit(1768349673.801:1136): pid=6248 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:33.810433 kernel: audit: type=1103 audit(1768349673.805:1137): pid=6252 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:34.165643 sshd[6252]: Connection closed by 4.153.228.146 port 42656 Jan 14 00:14:34.166204 sshd-session[6248]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:34.166000 audit[6248]: USER_END pid=6248 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:34.166000 audit[6248]: CRED_DISP pid=6248 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:34.175469 kernel: audit: type=1106 audit(1768349674.166:1138): pid=6248 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:34.175576 kernel: audit: type=1104 audit(1768349674.166:1139): pid=6248 uid=0 auid=500 ses=53 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:34.176644 systemd-logind[1545]: Session 53 logged out. Waiting for processes to exit. Jan 14 00:14:34.176867 systemd[1]: sshd@52-46.224.77.139:22-4.153.228.146:42656.service: Deactivated successfully. Jan 14 00:14:34.175000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@52-46.224.77.139:22-4.153.228.146:42656 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:34.179818 systemd[1]: session-53.scope: Deactivated successfully. Jan 14 00:14:34.182841 systemd-logind[1545]: Removed session 53. 
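Once the kubelet moves from ErrImagePull to ImagePullBackOff (as in the pod_workers records above and below), the affected containers sit in a Waiting state that can be enumerated through the API rather than grepped out of the journal. A sketch using client-go; the kubeconfig path is an illustrative assumption, not taken from this log.

// backoffreport.go - sketch: list every container currently waiting on
// ErrImagePull/ImagePullBackOff, i.e. the calico-system and calico-apiserver
// pods the kubelet keeps reporting above. Kubeconfig path is an assumption.
package main

import (
	"context"
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	cfg, err := clientcmd.BuildConfigFromFlags("", "/etc/kubernetes/admin.conf")
	if err != nil {
		panic(err)
	}
	cs, err := kubernetes.NewForConfig(cfg)
	if err != nil {
		panic(err)
	}

	pods, err := cs.CoreV1().Pods("").List(context.Background(), metav1.ListOptions{})
	if err != nil {
		panic(err)
	}
	for _, p := range pods.Items {
		for _, st := range p.Status.ContainerStatuses {
			if w := st.State.Waiting; w != nil &&
				(w.Reason == "ErrImagePull" || w.Reason == "ImagePullBackOff") {
				fmt.Printf("%s/%s container=%s reason=%s image=%s\n",
					p.Namespace, p.Name, st.Name, w.Reason, st.Image)
			}
		}
	}
}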
Jan 14 00:14:34.336301 kubelet[2832]: E0114 00:14:34.336262 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:14:37.334086 kubelet[2832]: E0114 00:14:37.334040 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:14:39.279697 systemd[1]: Started sshd@53-46.224.77.139:22-4.153.228.146:49180.service - OpenSSH per-connection server daemon (4.153.228.146:49180). Jan 14 00:14:39.278000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-46.224.77.139:22-4.153.228.146:49180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:39.282917 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:39.283142 kernel: audit: type=1130 audit(1768349679.278:1141): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-46.224.77.139:22-4.153.228.146:49180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:14:39.839000 audit[6263]: USER_ACCT pid=6263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:39.844579 kernel: audit: type=1101 audit(1768349679.839:1142): pid=6263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:39.845031 sshd[6263]: Accepted publickey for core from 4.153.228.146 port 49180 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:39.844000 audit[6263]: CRED_ACQ pid=6263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:39.847540 sshd-session[6263]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:39.849583 kernel: audit: type=1103 audit(1768349679.844:1143): pid=6263 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:39.849689 kernel: audit: type=1006 audit(1768349679.844:1144): pid=6263 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=54 res=1 Jan 14 00:14:39.844000 audit[6263]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff298dd10 a2=3 a3=0 items=0 ppid=1 pid=6263 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:39.852706 kernel: audit: type=1300 audit(1768349679.844:1144): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff298dd10 a2=3 a3=0 items=0 ppid=1 pid=6263 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=54 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:39.844000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:39.853930 kernel: audit: type=1327 audit(1768349679.844:1144): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:39.860604 systemd-logind[1545]: New session 54 of user core. Jan 14 00:14:39.865898 systemd[1]: Started session-54.scope - Session 54 of User core. 
Jan 14 00:14:39.870000 audit[6263]: USER_START pid=6263 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:39.874000 audit[6267]: CRED_ACQ pid=6267 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:39.879064 kernel: audit: type=1105 audit(1768349679.870:1145): pid=6263 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:39.879661 kernel: audit: type=1103 audit(1768349679.874:1146): pid=6267 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:40.227229 sshd[6267]: Connection closed by 4.153.228.146 port 49180 Jan 14 00:14:40.229066 sshd-session[6263]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:40.229000 audit[6263]: USER_END pid=6263 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:40.229000 audit[6263]: CRED_DISP pid=6263 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:40.237444 kernel: audit: type=1106 audit(1768349680.229:1147): pid=6263 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:40.237505 kernel: audit: type=1104 audit(1768349680.229:1148): pid=6263 uid=0 auid=500 ses=54 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:40.237788 systemd[1]: sshd@53-46.224.77.139:22-4.153.228.146:49180.service: Deactivated successfully. Jan 14 00:14:40.236000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@53-46.224.77.139:22-4.153.228.146:49180 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:40.241370 systemd[1]: session-54.scope: Deactivated successfully. Jan 14 00:14:40.244512 systemd-logind[1545]: Session 54 logged out. Waiting for processes to exit. Jan 14 00:14:40.246393 systemd-logind[1545]: Removed session 54. 
Jan 14 00:14:40.337506 kubelet[2832]: E0114 00:14:40.337397 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:14:42.337943 kubelet[2832]: E0114 00:14:42.337883 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:14:43.336077 kubelet[2832]: E0114 00:14:43.335970 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:14:45.354853 systemd[1]: Started sshd@54-46.224.77.139:22-4.153.228.146:44900.service - OpenSSH per-connection server daemon (4.153.228.146:44900). Jan 14 00:14:45.354000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-46.224.77.139:22-4.153.228.146:44900 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:45.355787 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:45.355880 kernel: audit: type=1130 audit(1768349685.354:1150): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-46.224.77.139:22-4.153.228.146:44900 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:14:45.927000 audit[6279]: USER_ACCT pid=6279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:45.928352 sshd[6279]: Accepted publickey for core from 4.153.228.146 port 44900 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:45.930000 audit[6279]: CRED_ACQ pid=6279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:45.931823 sshd-session[6279]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:45.933220 kernel: audit: type=1101 audit(1768349685.927:1151): pid=6279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:45.933292 kernel: audit: type=1103 audit(1768349685.930:1152): pid=6279 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:45.933314 kernel: audit: type=1006 audit(1768349685.930:1153): pid=6279 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=55 res=1 Jan 14 00:14:45.930000 audit[6279]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffecdcc790 a2=3 a3=0 items=0 ppid=1 pid=6279 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:45.937511 kernel: audit: type=1300 audit(1768349685.930:1153): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffecdcc790 a2=3 a3=0 items=0 ppid=1 pid=6279 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=55 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:45.930000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:45.938611 kernel: audit: type=1327 audit(1768349685.930:1153): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:45.944012 systemd-logind[1545]: New session 55 of user core. Jan 14 00:14:45.951471 systemd[1]: Started session-55.scope - Session 55 of User core. 
Jan 14 00:14:45.957000 audit[6279]: USER_START pid=6279 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:45.963551 kernel: audit: type=1105 audit(1768349685.957:1154): pid=6279 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:45.963000 audit[6285]: CRED_ACQ pid=6285 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:45.966559 kernel: audit: type=1103 audit(1768349685.963:1155): pid=6285 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:46.337304 kubelet[2832]: E0114 00:14:46.337236 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:14:46.352726 sshd[6285]: Connection closed by 4.153.228.146 port 44900 Jan 14 00:14:46.353492 sshd-session[6279]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:46.355000 audit[6279]: USER_END pid=6279 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:46.355000 audit[6279]: CRED_DISP pid=6279 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:46.361226 kernel: audit: type=1106 audit(1768349686.355:1156): pid=6279 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:46.361308 kernel: audit: type=1104 audit(1768349686.355:1157): pid=6279 uid=0 auid=500 ses=55 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:46.362612 systemd[1]: 
sshd@54-46.224.77.139:22-4.153.228.146:44900.service: Deactivated successfully. Jan 14 00:14:46.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@54-46.224.77.139:22-4.153.228.146:44900 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:46.367694 systemd[1]: session-55.scope: Deactivated successfully. Jan 14 00:14:46.369176 systemd-logind[1545]: Session 55 logged out. Waiting for processes to exit. Jan 14 00:14:46.373366 systemd-logind[1545]: Removed session 55. Jan 14 00:14:48.337583 kubelet[2832]: E0114 00:14:48.337259 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:14:48.341509 kubelet[2832]: E0114 00:14:48.341451 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:14:51.334195 kubelet[2832]: E0114 00:14:51.334117 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:14:51.467680 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:51.467767 kernel: audit: type=1130 audit(1768349691.463:1159): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-46.224.77.139:22-4.153.228.146:44902 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:51.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-46.224.77.139:22-4.153.228.146:44902 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:51.463860 systemd[1]: Started sshd@55-46.224.77.139:22-4.153.228.146:44902.service - OpenSSH per-connection server daemon (4.153.228.146:44902). 
Jan 14 00:14:52.012000 audit[6313]: USER_ACCT pid=6313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.015707 sshd[6313]: Accepted publickey for core from 4.153.228.146 port 44902 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:52.016908 sshd-session[6313]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:52.015000 audit[6313]: CRED_ACQ pid=6313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.020817 kernel: audit: type=1101 audit(1768349692.012:1160): pid=6313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.020883 kernel: audit: type=1103 audit(1768349692.015:1161): pid=6313 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.024491 kernel: audit: type=1006 audit(1768349692.015:1162): pid=6313 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=56 res=1 Jan 14 00:14:52.024588 kernel: audit: type=1300 audit(1768349692.015:1162): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd6487c20 a2=3 a3=0 items=0 ppid=1 pid=6313 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:52.015000 audit[6313]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd6487c20 a2=3 a3=0 items=0 ppid=1 pid=6313 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=56 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:52.015000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:52.031298 kernel: audit: type=1327 audit(1768349692.015:1162): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:52.032121 systemd-logind[1545]: New session 56 of user core. Jan 14 00:14:52.037774 systemd[1]: Started session-56.scope - Session 56 of User core. 
Jan 14 00:14:52.041000 audit[6313]: USER_START pid=6313 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.046677 kernel: audit: type=1105 audit(1768349692.041:1163): pid=6313 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.046775 kernel: audit: type=1103 audit(1768349692.045:1164): pid=6318 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.045000 audit[6318]: CRED_ACQ pid=6318 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.412563 sshd[6318]: Connection closed by 4.153.228.146 port 44902 Jan 14 00:14:52.413464 sshd-session[6313]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:52.414000 audit[6313]: USER_END pid=6313 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.420945 systemd[1]: sshd@55-46.224.77.139:22-4.153.228.146:44902.service: Deactivated successfully. Jan 14 00:14:52.415000 audit[6313]: CRED_DISP pid=6313 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.425856 kernel: audit: type=1106 audit(1768349692.414:1165): pid=6313 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.425912 kernel: audit: type=1104 audit(1768349692.415:1166): pid=6313 uid=0 auid=500 ses=56 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:52.425322 systemd[1]: session-56.scope: Deactivated successfully. Jan 14 00:14:52.420000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@55-46.224.77.139:22-4.153.228.146:44902 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:52.428809 systemd-logind[1545]: Session 56 logged out. Waiting for processes to exit. Jan 14 00:14:52.430625 systemd-logind[1545]: Removed session 56. 
Jan 14 00:14:54.335331 kubelet[2832]: E0114 00:14:54.335008 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:14:56.335786 kubelet[2832]: E0114 00:14:56.335713 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:14:57.523410 systemd[1]: Started sshd@56-46.224.77.139:22-4.153.228.146:59484.service - OpenSSH per-connection server daemon (4.153.228.146:59484). Jan 14 00:14:57.527977 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:14:57.528011 kernel: audit: type=1130 audit(1768349697.522:1168): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-46.224.77.139:22-4.153.228.146:59484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:57.522000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-46.224.77.139:22-4.153.228.146:59484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:14:58.086000 audit[6340]: USER_ACCT pid=6340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.091556 sshd[6340]: Accepted publickey for core from 4.153.228.146 port 59484 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:14:58.093198 sshd-session[6340]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:14:58.090000 audit[6340]: CRED_ACQ pid=6340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.095710 kernel: audit: type=1101 audit(1768349698.086:1169): pid=6340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.095788 kernel: audit: type=1103 audit(1768349698.090:1170): pid=6340 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.098263 kernel: audit: type=1006 audit(1768349698.090:1171): pid=6340 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=57 res=1 Jan 14 00:14:58.098342 kernel: audit: type=1300 audit(1768349698.090:1171): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcc75dd10 a2=3 a3=0 items=0 ppid=1 pid=6340 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:58.090000 audit[6340]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcc75dd10 a2=3 a3=0 items=0 ppid=1 pid=6340 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=57 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:14:58.090000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:58.101271 kernel: audit: type=1327 audit(1768349698.090:1171): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:14:58.104864 systemd-logind[1545]: New session 57 of user core. Jan 14 00:14:58.111988 systemd[1]: Started session-57.scope - Session 57 of User core. 
Jan 14 00:14:58.114000 audit[6340]: USER_START pid=6340 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.120588 kernel: audit: type=1105 audit(1768349698.114:1172): pid=6340 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.120682 kernel: audit: type=1103 audit(1768349698.118:1173): pid=6344 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.118000 audit[6344]: CRED_ACQ pid=6344 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.528173 sshd[6344]: Connection closed by 4.153.228.146 port 59484 Jan 14 00:14:58.530045 sshd-session[6340]: pam_unix(sshd:session): session closed for user core Jan 14 00:14:58.531000 audit[6340]: USER_END pid=6340 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.531000 audit[6340]: CRED_DISP pid=6340 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.539097 kernel: audit: type=1106 audit(1768349698.531:1174): pid=6340 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.539171 kernel: audit: type=1104 audit(1768349698.531:1175): pid=6340 uid=0 auid=500 ses=57 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:14:58.539450 systemd[1]: sshd@56-46.224.77.139:22-4.153.228.146:59484.service: Deactivated successfully. Jan 14 00:14:58.538000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@56-46.224.77.139:22-4.153.228.146:59484 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:14:58.542501 systemd[1]: session-57.scope: Deactivated successfully. Jan 14 00:14:58.547165 systemd-logind[1545]: Session 57 logged out. Waiting for processes to exit. Jan 14 00:14:58.548159 systemd-logind[1545]: Removed session 57. 
Jan 14 00:15:00.334110 kubelet[2832]: E0114 00:15:00.333574 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:15:03.334969 kubelet[2832]: E0114 00:15:03.334509 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:15:03.334969 kubelet[2832]: E0114 00:15:03.334888 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:15:03.336299 kubelet[2832]: E0114 00:15:03.336088 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:15:03.637000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-46.224.77.139:22-4.153.228.146:59500 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:03.637941 systemd[1]: Started sshd@57-46.224.77.139:22-4.153.228.146:59500.service - OpenSSH per-connection server daemon (4.153.228.146:59500). Jan 14 00:15:03.642438 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:03.642595 kernel: audit: type=1130 audit(1768349703.637:1177): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-46.224.77.139:22-4.153.228.146:59500 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:15:04.176000 audit[6378]: USER_ACCT pid=6378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.182550 sshd[6378]: Accepted publickey for core from 4.153.228.146 port 59500 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:04.183085 sshd-session[6378]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:15:04.180000 audit[6378]: CRED_ACQ pid=6378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.185500 kernel: audit: type=1101 audit(1768349704.176:1178): pid=6378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.185569 kernel: audit: type=1103 audit(1768349704.180:1179): pid=6378 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.188005 kernel: audit: type=1006 audit(1768349704.180:1180): pid=6378 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=58 res=1 Jan 14 00:15:04.180000 audit[6378]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd9962eb0 a2=3 a3=0 items=0 ppid=1 pid=6378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:04.188543 kernel: audit: type=1300 audit(1768349704.180:1180): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd9962eb0 a2=3 a3=0 items=0 ppid=1 pid=6378 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=58 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:04.180000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:04.195711 kernel: audit: type=1327 audit(1768349704.180:1180): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:04.195378 systemd-logind[1545]: New session 58 of user core. Jan 14 00:15:04.204941 systemd[1]: Started session-58.scope - Session 58 of User core. 
Jan 14 00:15:04.208000 audit[6378]: USER_START pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.213000 audit[6382]: CRED_ACQ pid=6382 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.216839 kernel: audit: type=1105 audit(1768349704.208:1181): pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.216966 kernel: audit: type=1103 audit(1768349704.213:1182): pid=6382 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.578622 sshd[6382]: Connection closed by 4.153.228.146 port 59500 Jan 14 00:15:04.579418 sshd-session[6378]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:04.580000 audit[6378]: USER_END pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.587839 systemd[1]: sshd@57-46.224.77.139:22-4.153.228.146:59500.service: Deactivated successfully. Jan 14 00:15:04.581000 audit[6378]: CRED_DISP pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.589871 kernel: audit: type=1106 audit(1768349704.580:1183): pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.589929 kernel: audit: type=1104 audit(1768349704.581:1184): pid=6378 uid=0 auid=500 ses=58 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:04.588000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@57-46.224.77.139:22-4.153.228.146:59500 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:04.592579 systemd[1]: session-58.scope: Deactivated successfully. Jan 14 00:15:04.595465 systemd-logind[1545]: Session 58 logged out. Waiting for processes to exit. Jan 14 00:15:04.598082 systemd-logind[1545]: Removed session 58. 
Jan 14 00:15:07.335213 kubelet[2832]: E0114 00:15:07.335131 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:15:09.333817 kubelet[2832]: E0114 00:15:09.333729 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:15:09.692841 systemd[1]: Started sshd@58-46.224.77.139:22-4.153.228.146:51226.service - OpenSSH per-connection server daemon (4.153.228.146:51226). Jan 14 00:15:09.695378 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:09.695473 kernel: audit: type=1130 audit(1768349709.692:1186): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-46.224.77.139:22-4.153.228.146:51226 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:09.692000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-46.224.77.139:22-4.153.228.146:51226 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:15:10.238000 audit[6394]: USER_ACCT pid=6394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.241233 sshd[6394]: Accepted publickey for core from 4.153.228.146 port 51226 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:10.246608 kernel: audit: type=1101 audit(1768349710.238:1187): pid=6394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.247098 kernel: audit: type=1103 audit(1768349710.242:1188): pid=6394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.242000 audit[6394]: CRED_ACQ pid=6394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.246891 sshd-session[6394]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:15:10.250014 kernel: audit: type=1006 audit(1768349710.242:1189): pid=6394 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=59 res=1 Jan 14 00:15:10.242000 audit[6394]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd65c81b0 a2=3 a3=0 items=0 ppid=1 pid=6394 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:10.242000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:10.255535 kernel: audit: type=1300 audit(1768349710.242:1189): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd65c81b0 a2=3 a3=0 items=0 ppid=1 pid=6394 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=59 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:10.255589 kernel: audit: type=1327 audit(1768349710.242:1189): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:10.256806 systemd-logind[1545]: New session 59 of user core. Jan 14 00:15:10.262793 systemd[1]: Started session-59.scope - Session 59 of User core. 
Jan 14 00:15:10.269000 audit[6394]: USER_START pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.277367 kernel: audit: type=1105 audit(1768349710.269:1190): pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.277489 kernel: audit: type=1103 audit(1768349710.272:1191): pid=6398 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.272000 audit[6398]: CRED_ACQ pid=6398 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.622302 sshd[6398]: Connection closed by 4.153.228.146 port 51226 Jan 14 00:15:10.623189 sshd-session[6394]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:10.625000 audit[6394]: USER_END pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.629000 audit[6394]: CRED_DISP pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.634340 kernel: audit: type=1106 audit(1768349710.625:1192): pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.634419 kernel: audit: type=1104 audit(1768349710.629:1193): pid=6394 uid=0 auid=500 ses=59 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:10.636136 systemd[1]: sshd@58-46.224.77.139:22-4.153.228.146:51226.service: Deactivated successfully. Jan 14 00:15:10.635000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@58-46.224.77.139:22-4.153.228.146:51226 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:10.640413 systemd[1]: session-59.scope: Deactivated successfully. Jan 14 00:15:10.642307 systemd-logind[1545]: Session 59 logged out. Waiting for processes to exit. Jan 14 00:15:10.644913 systemd-logind[1545]: Removed session 59. 
Jan 14 00:15:13.334920 kubelet[2832]: E0114 00:15:13.334683 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:15:14.334658 kubelet[2832]: E0114 00:15:14.334607 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:15:15.741551 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:15.741662 kernel: audit: type=1130 audit(1768349715.738:1195): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-46.224.77.139:22-4.153.228.146:35996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:15.738000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-46.224.77.139:22-4.153.228.146:35996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:15.738716 systemd[1]: Started sshd@59-46.224.77.139:22-4.153.228.146:35996.service - OpenSSH per-connection server daemon (4.153.228.146:35996). 
Jan 14 00:15:16.316000 audit[6410]: USER_ACCT pid=6410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.319015 sshd[6410]: Accepted publickey for core from 4.153.228.146 port 35996 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:16.320542 kernel: audit: type=1101 audit(1768349716.316:1196): pid=6410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.320620 kernel: audit: type=1103 audit(1768349716.320:1197): pid=6410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.320000 audit[6410]: CRED_ACQ pid=6410 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.323276 sshd-session[6410]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:15:16.324808 kernel: audit: type=1006 audit(1768349716.321:1198): pid=6410 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=60 res=1 Jan 14 00:15:16.324959 kernel: audit: type=1300 audit(1768349716.321:1198): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffedd6f5c0 a2=3 a3=0 items=0 ppid=1 pid=6410 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:16.321000 audit[6410]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffedd6f5c0 a2=3 a3=0 items=0 ppid=1 pid=6410 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=60 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:16.321000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:16.329288 kernel: audit: type=1327 audit(1768349716.321:1198): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:16.333654 systemd-logind[1545]: New session 60 of user core. Jan 14 00:15:16.340183 systemd[1]: Started session-60.scope - Session 60 of User core. 
Jan 14 00:15:16.343000 audit[6410]: USER_START pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.346000 audit[6414]: CRED_ACQ pid=6414 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.349107 kernel: audit: type=1105 audit(1768349716.343:1199): pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.349206 kernel: audit: type=1103 audit(1768349716.346:1200): pid=6414 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.707367 sshd[6414]: Connection closed by 4.153.228.146 port 35996 Jan 14 00:15:16.708695 sshd-session[6410]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:16.709000 audit[6410]: USER_END pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.709000 audit[6410]: CRED_DISP pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.716004 kernel: audit: type=1106 audit(1768349716.709:1201): pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.716079 kernel: audit: type=1104 audit(1768349716.709:1202): pid=6410 uid=0 auid=500 ses=60 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:16.716239 systemd[1]: sshd@59-46.224.77.139:22-4.153.228.146:35996.service: Deactivated successfully. Jan 14 00:15:16.717000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@59-46.224.77.139:22-4.153.228.146:35996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:16.717034 systemd-logind[1545]: Session 60 logged out. Waiting for processes to exit. Jan 14 00:15:16.720623 systemd[1]: session-60.scope: Deactivated successfully. Jan 14 00:15:16.723098 systemd-logind[1545]: Removed session 60. 
Jan 14 00:15:17.336027 kubelet[2832]: E0114 00:15:17.335977 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:15:17.337651 kubelet[2832]: E0114 00:15:17.336243 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:15:18.335573 kubelet[2832]: E0114 00:15:18.335259 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:15:20.335004 kubelet[2832]: E0114 00:15:20.334950 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:15:21.812760 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:21.812883 kernel: audit: type=1130 audit(1768349721.810:1204): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-46.224.77.139:22-4.153.228.146:36010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:21.810000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-46.224.77.139:22-4.153.228.146:36010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:21.811732 systemd[1]: Started sshd@60-46.224.77.139:22-4.153.228.146:36010.service - OpenSSH per-connection server daemon (4.153.228.146:36010). 
Jan 14 00:15:22.340000 audit[6426]: USER_ACCT pid=6426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.345354 sshd[6426]: Accepted publickey for core from 4.153.228.146 port 36010 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:22.344000 audit[6426]: CRED_ACQ pid=6426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.347727 sshd-session[6426]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:15:22.348125 kernel: audit: type=1101 audit(1768349722.340:1205): pid=6426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.348187 kernel: audit: type=1103 audit(1768349722.344:1206): pid=6426 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.350117 kernel: audit: type=1006 audit(1768349722.344:1207): pid=6426 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=61 res=1 Jan 14 00:15:22.344000 audit[6426]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffef0b4ef0 a2=3 a3=0 items=0 ppid=1 pid=6426 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:22.353417 kernel: audit: type=1300 audit(1768349722.344:1207): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffef0b4ef0 a2=3 a3=0 items=0 ppid=1 pid=6426 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=61 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:22.344000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:22.354542 kernel: audit: type=1327 audit(1768349722.344:1207): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:22.360370 systemd-logind[1545]: New session 61 of user core. Jan 14 00:15:22.367249 systemd[1]: Started session-61.scope - Session 61 of User core. 
Jan 14 00:15:22.369000 audit[6426]: USER_START pid=6426 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.377571 kernel: audit: type=1105 audit(1768349722.369:1208): pid=6426 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.377000 audit[6430]: CRED_ACQ pid=6430 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.382566 kernel: audit: type=1103 audit(1768349722.377:1209): pid=6430 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.728651 sshd[6430]: Connection closed by 4.153.228.146 port 36010 Jan 14 00:15:22.729282 sshd-session[6426]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:22.730000 audit[6426]: USER_END pid=6426 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.731000 audit[6426]: CRED_DISP pid=6426 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.737074 systemd[1]: sshd@60-46.224.77.139:22-4.153.228.146:36010.service: Deactivated successfully. Jan 14 00:15:22.738077 kernel: audit: type=1106 audit(1768349722.730:1210): pid=6426 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.739486 kernel: audit: type=1104 audit(1768349722.731:1211): pid=6426 uid=0 auid=500 ses=61 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:22.735000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@60-46.224.77.139:22-4.153.228.146:36010 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:22.738182 systemd-logind[1545]: Session 61 logged out. Waiting for processes to exit. Jan 14 00:15:22.740129 systemd[1]: session-61.scope: Deactivated successfully. Jan 14 00:15:22.745892 systemd-logind[1545]: Removed session 61. 
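Each SSH connection from 4.153.228.146 in this stretch of the log follows the same audit sequence: USER_ACCT and CRED_ACQ when the public key is accepted, USER_START when PAM opens the session, then USER_END, CRED_DISP and a systemd SERVICE_STOP a fraction of a second later. As a hedged illustration (not part of the log), the Python sketch below pairs the systemd-logind "New session"/"Removed session" lines to measure how long each of these short-lived sessions stays open; the timestamp and message formats are assumed to match the excerpts shown here, and the year is taken from the boot banner.

    # Hedged sketch: pair systemd-logind "New session"/"Removed session"
    # lines from a journal dump like the one above and report per-session
    # lifetimes. Line formats are assumed from the excerpts in this log.
    import re
    from datetime import datetime

    LINE = re.compile(
        r"(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}\.\d+) "
        r"systemd-logind\[\d+\]: (?P<event>New|Removed) session (?P<sid>\d+)"
    )

    def session_lifetimes(lines, year=2026):
        opened = {}
        for line in lines:
            m = LINE.search(line)
            if not m:
                continue
            ts = datetime.strptime(f"{year} {m['ts']}", "%Y %b %d %H:%M:%S.%f")
            sid = m["sid"]
            if m["event"] == "New":
                opened[sid] = ts
            elif sid in opened:
                yield sid, (ts - opened.pop(sid)).total_seconds()

    sample = [
        "Jan 14 00:15:22.360370 systemd-logind[1545]: New session 61 of user core.",
        "Jan 14 00:15:22.745892 systemd-logind[1545]: Removed session 61.",
    ]
    for sid, seconds in session_lifetimes(sample):
        print(f"session {sid}: open for {seconds:.3f}s")

Run against the sessions recorded here, the lifetimes come out well under a second, consistent with a scripted client that authenticates and immediately disconnects.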
Jan 14 00:15:26.336534 kubelet[2832]: E0114 00:15:26.336456 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:15:27.334909 kubelet[2832]: E0114 00:15:27.334778 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:15:27.848579 systemd[1]: Started sshd@61-46.224.77.139:22-4.153.228.146:60334.service - OpenSSH per-connection server daemon (4.153.228.146:60334). Jan 14 00:15:27.848000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-46.224.77.139:22-4.153.228.146:60334 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:27.851797 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:27.851868 kernel: audit: type=1130 audit(1768349727.848:1213): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-46.224.77.139:22-4.153.228.146:60334 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:15:28.428000 audit[6444]: USER_ACCT pid=6444 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.431424 sshd[6444]: Accepted publickey for core from 4.153.228.146 port 60334 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:28.431881 kernel: audit: type=1101 audit(1768349728.428:1214): pid=6444 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.432000 audit[6444]: CRED_ACQ pid=6444 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.433842 sshd-session[6444]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:15:28.436310 kernel: audit: type=1103 audit(1768349728.432:1215): pid=6444 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.436428 kernel: audit: type=1006 audit(1768349728.432:1216): pid=6444 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=62 res=1 Jan 14 00:15:28.432000 audit[6444]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc7b72da0 a2=3 a3=0 items=0 ppid=1 pid=6444 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:28.440609 kernel: audit: type=1300 audit(1768349728.432:1216): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc7b72da0 a2=3 a3=0 items=0 ppid=1 pid=6444 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=62 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:28.432000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:28.444550 kernel: audit: type=1327 audit(1768349728.432:1216): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:28.444826 systemd-logind[1545]: New session 62 of user core. Jan 14 00:15:28.450823 systemd[1]: Started session-62.scope - Session 62 of User core. 
Jan 14 00:15:28.454000 audit[6444]: USER_START pid=6444 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.461617 kernel: audit: type=1105 audit(1768349728.454:1217): pid=6444 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.461733 kernel: audit: type=1103 audit(1768349728.458:1218): pid=6448 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.458000 audit[6448]: CRED_ACQ pid=6448 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.848475 sshd[6448]: Connection closed by 4.153.228.146 port 60334 Jan 14 00:15:28.849731 sshd-session[6444]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:28.850000 audit[6444]: USER_END pid=6444 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.850000 audit[6444]: CRED_DISP pid=6444 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.859200 kernel: audit: type=1106 audit(1768349728.850:1219): pid=6444 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.859324 kernel: audit: type=1104 audit(1768349728.850:1220): pid=6444 uid=0 auid=500 ses=62 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:28.859650 systemd[1]: sshd@61-46.224.77.139:22-4.153.228.146:60334.service: Deactivated successfully. Jan 14 00:15:28.859000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@61-46.224.77.139:22-4.153.228.146:60334 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:28.863745 systemd[1]: session-62.scope: Deactivated successfully. Jan 14 00:15:28.864824 systemd-logind[1545]: Session 62 logged out. Waiting for processes to exit. Jan 14 00:15:28.868377 systemd-logind[1545]: Removed session 62. 
Jan 14 00:15:29.334174 kubelet[2832]: E0114 00:15:29.333910 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:15:29.335716 kubelet[2832]: E0114 00:15:29.334410 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:15:30.339260 kubelet[2832]: E0114 00:15:30.339206 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:15:32.339118 kubelet[2832]: E0114 00:15:32.338625 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:15:33.956000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-46.224.77.139:22-4.153.228.146:60346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:33.956756 systemd[1]: Started sshd@62-46.224.77.139:22-4.153.228.146:60346.service - OpenSSH per-connection server daemon (4.153.228.146:60346). Jan 14 00:15:33.959286 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:33.959477 kernel: audit: type=1130 audit(1768349733.956:1222): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-46.224.77.139:22-4.153.228.146:60346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:15:34.516000 audit[6485]: USER_ACCT pid=6485 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.520589 kernel: audit: type=1101 audit(1768349734.516:1223): pid=6485 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.520719 sshd[6485]: Accepted publickey for core from 4.153.228.146 port 60346 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:34.521000 audit[6485]: CRED_ACQ pid=6485 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.523162 sshd-session[6485]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:15:34.527546 kernel: audit: type=1103 audit(1768349734.521:1224): pid=6485 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.527622 kernel: audit: type=1006 audit(1768349734.521:1225): pid=6485 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=63 res=1 Jan 14 00:15:34.521000 audit[6485]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffee67dfb0 a2=3 a3=0 items=0 ppid=1 pid=6485 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:34.530349 kernel: audit: type=1300 audit(1768349734.521:1225): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffee67dfb0 a2=3 a3=0 items=0 ppid=1 pid=6485 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=63 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:34.521000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:34.531787 kernel: audit: type=1327 audit(1768349734.521:1225): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:34.537797 systemd-logind[1545]: New session 63 of user core. Jan 14 00:15:34.546783 systemd[1]: Started session-63.scope - Session 63 of User core. 
Jan 14 00:15:34.551000 audit[6485]: USER_START pid=6485 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.555000 audit[6489]: CRED_ACQ pid=6489 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.558934 kernel: audit: type=1105 audit(1768349734.551:1226): pid=6485 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.558997 kernel: audit: type=1103 audit(1768349734.555:1227): pid=6489 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.935024 sshd[6489]: Connection closed by 4.153.228.146 port 60346 Jan 14 00:15:34.935876 sshd-session[6485]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:34.937000 audit[6485]: USER_END pid=6485 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.942508 systemd[1]: sshd@62-46.224.77.139:22-4.153.228.146:60346.service: Deactivated successfully. Jan 14 00:15:34.937000 audit[6485]: CRED_DISP pid=6485 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.944711 kernel: audit: type=1106 audit(1768349734.937:1228): pid=6485 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.944781 kernel: audit: type=1104 audit(1768349734.937:1229): pid=6485 uid=0 auid=500 ses=63 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:34.942000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@62-46.224.77.139:22-4.153.228.146:60346 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:34.945818 systemd[1]: session-63.scope: Deactivated successfully. Jan 14 00:15:34.948008 systemd-logind[1545]: Session 63 logged out. Waiting for processes to exit. Jan 14 00:15:34.949555 systemd-logind[1545]: Removed session 63. 
Jan 14 00:15:38.336580 kubelet[2832]: E0114 00:15:38.336475 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:15:40.040571 systemd[1]: Started sshd@63-46.224.77.139:22-4.153.228.146:32978.service - OpenSSH per-connection server daemon (4.153.228.146:32978). Jan 14 00:15:40.040000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-46.224.77.139:22-4.153.228.146:32978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:40.043626 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:40.043683 kernel: audit: type=1130 audit(1768349740.040:1231): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-46.224.77.139:22-4.153.228.146:32978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:40.336051 kubelet[2832]: E0114 00:15:40.335991 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:15:40.604000 audit[6501]: USER_ACCT pid=6501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:40.606984 sshd[6501]: Accepted publickey for core from 4.153.228.146 port 32978 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:40.606000 audit[6501]: CRED_ACQ pid=6501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:40.607577 kernel: audit: type=1101 audit(1768349740.604:1232): pid=6501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:40.608030 sshd-session[6501]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 
14 00:15:40.612572 kernel: audit: type=1103 audit(1768349740.606:1233): pid=6501 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:40.612775 kernel: audit: type=1006 audit(1768349740.606:1234): pid=6501 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=64 res=1 Jan 14 00:15:40.612799 kernel: audit: type=1300 audit(1768349740.606:1234): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff077d3e0 a2=3 a3=0 items=0 ppid=1 pid=6501 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:40.606000 audit[6501]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff077d3e0 a2=3 a3=0 items=0 ppid=1 pid=6501 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=64 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:40.606000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:40.616466 kernel: audit: type=1327 audit(1768349740.606:1234): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:40.618477 systemd-logind[1545]: New session 64 of user core. Jan 14 00:15:40.624217 systemd[1]: Started session-64.scope - Session 64 of User core. Jan 14 00:15:40.627000 audit[6501]: USER_START pid=6501 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:40.632000 audit[6505]: CRED_ACQ pid=6505 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:40.635150 kernel: audit: type=1105 audit(1768349740.627:1235): pid=6501 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:40.635319 kernel: audit: type=1103 audit(1768349740.632:1236): pid=6505 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:41.014723 sshd[6505]: Connection closed by 4.153.228.146 port 32978 Jan 14 00:15:41.016100 sshd-session[6501]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:41.017000 audit[6501]: USER_END pid=6501 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:41.018000 
audit[6501]: CRED_DISP pid=6501 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:41.025552 kernel: audit: type=1106 audit(1768349741.017:1237): pid=6501 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:41.025642 kernel: audit: type=1104 audit(1768349741.018:1238): pid=6501 uid=0 auid=500 ses=64 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:41.023929 systemd[1]: sshd@63-46.224.77.139:22-4.153.228.146:32978.service: Deactivated successfully. Jan 14 00:15:41.023000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@63-46.224.77.139:22-4.153.228.146:32978 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:41.027421 systemd[1]: session-64.scope: Deactivated successfully. Jan 14 00:15:41.029433 systemd-logind[1545]: Session 64 logged out. Waiting for processes to exit. Jan 14 00:15:41.033369 systemd-logind[1545]: Removed session 64. Jan 14 00:15:41.334436 kubelet[2832]: E0114 00:15:41.333994 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:15:42.333802 kubelet[2832]: E0114 00:15:42.333430 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:15:43.333545 kubelet[2832]: E0114 00:15:43.333311 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:15:45.334558 kubelet[2832]: E0114 00:15:45.333383 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:15:46.130928 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:46.131039 kernel: audit: type=1130 audit(1768349746.128:1240): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-46.224.77.139:22-4.153.228.146:50462 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:46.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-46.224.77.139:22-4.153.228.146:50462 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:46.130284 systemd[1]: Started sshd@64-46.224.77.139:22-4.153.228.146:50462.service - OpenSSH per-connection server daemon (4.153.228.146:50462). Jan 14 00:15:46.686000 audit[6520]: USER_ACCT pid=6520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:46.690589 sshd[6520]: Accepted publickey for core from 4.153.228.146 port 50462 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:46.691552 kernel: audit: type=1101 audit(1768349746.686:1241): pid=6520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:46.690000 audit[6520]: CRED_ACQ pid=6520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:46.695327 sshd-session[6520]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:15:46.696794 kernel: audit: type=1103 audit(1768349746.690:1242): pid=6520 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:46.696870 kernel: audit: type=1006 audit(1768349746.693:1243): pid=6520 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=65 res=1 Jan 14 00:15:46.696890 kernel: audit: type=1300 audit(1768349746.693:1243): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffb867b80 a2=3 a3=0 items=0 ppid=1 pid=6520 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=65 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:46.693000 audit[6520]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffb867b80 a2=3 a3=0 items=0 ppid=1 pid=6520 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=65 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:46.700016 kernel: audit: type=1327 audit(1768349746.693:1243): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:46.693000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:46.706765 systemd-logind[1545]: New session 65 of user core. Jan 14 00:15:46.713115 systemd[1]: Started session-65.scope - Session 65 of User core. Jan 14 00:15:46.718000 audit[6520]: USER_START pid=6520 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:46.724780 kernel: audit: type=1105 audit(1768349746.718:1244): pid=6520 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:46.725088 kernel: audit: type=1103 audit(1768349746.723:1245): pid=6526 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:46.723000 audit[6526]: CRED_ACQ pid=6526 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:47.105551 sshd[6526]: Connection closed by 4.153.228.146 port 50462 Jan 14 00:15:47.106933 sshd-session[6520]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:47.108000 audit[6520]: USER_END pid=6520 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:47.108000 audit[6520]: CRED_DISP pid=6520 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:47.116425 systemd[1]: sshd@64-46.224.77.139:22-4.153.228.146:50462.service: Deactivated successfully. 
Jan 14 00:15:47.117580 kernel: audit: type=1106 audit(1768349747.108:1246): pid=6520 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:47.117647 kernel: audit: type=1104 audit(1768349747.108:1247): pid=6520 uid=0 auid=500 ses=65 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:47.114000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@64-46.224.77.139:22-4.153.228.146:50462 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:47.121596 systemd[1]: session-65.scope: Deactivated successfully. Jan 14 00:15:47.126212 systemd-logind[1545]: Session 65 logged out. Waiting for processes to exit. Jan 14 00:15:47.130249 systemd-logind[1545]: Removed session 65. Jan 14 00:15:50.339314 kubelet[2832]: E0114 00:15:50.339264 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:15:52.225511 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:52.225846 kernel: audit: type=1130 audit(1768349752.222:1249): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-46.224.77.139:22-4.153.228.146:50472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:52.222000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-46.224.77.139:22-4.153.228.146:50472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:52.224032 systemd[1]: Started sshd@65-46.224.77.139:22-4.153.228.146:50472.service - OpenSSH per-connection server daemon (4.153.228.146:50472). 
Jan 14 00:15:52.776000 audit[6539]: USER_ACCT pid=6539 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:52.780876 sshd[6539]: Accepted publickey for core from 4.153.228.146 port 50472 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:52.780000 audit[6539]: CRED_ACQ pid=6539 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:52.783138 sshd-session[6539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:15:52.784241 kernel: audit: type=1101 audit(1768349752.776:1250): pid=6539 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:52.784297 kernel: audit: type=1103 audit(1768349752.780:1251): pid=6539 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:52.786907 kernel: audit: type=1006 audit(1768349752.780:1252): pid=6539 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=66 res=1 Jan 14 00:15:52.780000 audit[6539]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe9d88430 a2=3 a3=0 items=0 ppid=1 pid=6539 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:52.789532 kernel: audit: type=1300 audit(1768349752.780:1252): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe9d88430 a2=3 a3=0 items=0 ppid=1 pid=6539 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=66 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:52.790692 kernel: audit: type=1327 audit(1768349752.780:1252): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:52.780000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:52.794288 systemd-logind[1545]: New session 66 of user core. Jan 14 00:15:52.799740 systemd[1]: Started session-66.scope - Session 66 of User core. 
Jan 14 00:15:52.803000 audit[6539]: USER_START pid=6539 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:52.808000 audit[6543]: CRED_ACQ pid=6543 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:52.812386 kernel: audit: type=1105 audit(1768349752.803:1253): pid=6539 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:52.812492 kernel: audit: type=1103 audit(1768349752.808:1254): pid=6543 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:53.181709 sshd[6543]: Connection closed by 4.153.228.146 port 50472 Jan 14 00:15:53.183833 sshd-session[6539]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:53.185000 audit[6539]: USER_END pid=6539 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:53.187000 audit[6539]: CRED_DISP pid=6539 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:53.193209 kernel: audit: type=1106 audit(1768349753.185:1255): pid=6539 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:53.193270 kernel: audit: type=1104 audit(1768349753.187:1256): pid=6539 uid=0 auid=500 ses=66 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:53.193768 systemd[1]: sshd@65-46.224.77.139:22-4.153.228.146:50472.service: Deactivated successfully. Jan 14 00:15:53.192000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@65-46.224.77.139:22-4.153.228.146:50472 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:53.197198 systemd[1]: session-66.scope: Deactivated successfully. Jan 14 00:15:53.202424 systemd-logind[1545]: Session 66 logged out. Waiting for processes to exit. Jan 14 00:15:53.205973 systemd-logind[1545]: Removed session 66. 
Jan 14 00:15:53.335105 kubelet[2832]: E0114 00:15:53.334805 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:15:54.335707 kubelet[2832]: E0114 00:15:54.335483 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:15:56.336201 kubelet[2832]: E0114 00:15:56.335648 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:15:56.337962 kubelet[2832]: E0114 00:15:56.337609 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:15:57.332939 kubelet[2832]: E0114 00:15:57.332873 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:15:58.288215 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:15:58.288338 kernel: audit: type=1130 audit(1768349758.284:1258): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='unit=sshd@66-46.224.77.139:22-4.153.228.146:55962 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:58.284000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-46.224.77.139:22-4.153.228.146:55962 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:58.285443 systemd[1]: Started sshd@66-46.224.77.139:22-4.153.228.146:55962.service - OpenSSH per-connection server daemon (4.153.228.146:55962). Jan 14 00:15:58.823000 audit[6558]: USER_ACCT pid=6558 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:58.826274 sshd[6558]: Accepted publickey for core from 4.153.228.146 port 55962 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:15:58.826000 audit[6558]: CRED_ACQ pid=6558 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:58.830948 kernel: audit: type=1101 audit(1768349758.823:1259): pid=6558 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:58.831055 kernel: audit: type=1103 audit(1768349758.826:1260): pid=6558 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:58.827590 sshd-session[6558]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:15:58.833473 kernel: audit: type=1006 audit(1768349758.826:1261): pid=6558 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=67 res=1 Jan 14 00:15:58.837435 kernel: audit: type=1300 audit(1768349758.826:1261): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffebda7f20 a2=3 a3=0 items=0 ppid=1 pid=6558 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:58.826000 audit[6558]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffebda7f20 a2=3 a3=0 items=0 ppid=1 pid=6558 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=67 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:15:58.826000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:58.844773 kernel: audit: type=1327 audit(1768349758.826:1261): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:15:58.844400 systemd-logind[1545]: New session 67 of user core. Jan 14 00:15:58.852795 systemd[1]: Started session-67.scope - Session 67 of User core. 
Jan 14 00:15:58.857000 audit[6558]: USER_START pid=6558 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:58.862547 kernel: audit: type=1105 audit(1768349758.857:1262): pid=6558 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:58.863000 audit[6562]: CRED_ACQ pid=6562 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:58.868551 kernel: audit: type=1103 audit(1768349758.863:1263): pid=6562 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:59.200750 sshd[6562]: Connection closed by 4.153.228.146 port 55962 Jan 14 00:15:59.200464 sshd-session[6558]: pam_unix(sshd:session): session closed for user core Jan 14 00:15:59.203000 audit[6558]: USER_END pid=6558 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:59.211735 systemd-logind[1545]: Session 67 logged out. Waiting for processes to exit. Jan 14 00:15:59.212837 systemd[1]: sshd@66-46.224.77.139:22-4.153.228.146:55962.service: Deactivated successfully. Jan 14 00:15:59.217148 kernel: audit: type=1106 audit(1768349759.203:1264): pid=6558 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:59.217233 kernel: audit: type=1104 audit(1768349759.208:1265): pid=6558 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:59.208000 audit[6558]: CRED_DISP pid=6558 uid=0 auid=500 ses=67 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:15:59.219043 systemd[1]: session-67.scope: Deactivated successfully. Jan 14 00:15:59.213000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@66-46.224.77.139:22-4.153.228.146:55962 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:15:59.224210 systemd-logind[1545]: Removed session 67. 
Jan 14 00:16:01.337997 kubelet[2832]: E0114 00:16:01.337938 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:16:04.309707 systemd[1]: Started sshd@67-46.224.77.139:22-4.153.228.146:55970.service - OpenSSH per-connection server daemon (4.153.228.146:55970). Jan 14 00:16:04.313733 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:04.313816 kernel: audit: type=1130 audit(1768349764.309:1267): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-46.224.77.139:22-4.153.228.146:55970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:04.309000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-46.224.77.139:22-4.153.228.146:55970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:04.858000 audit[6598]: USER_ACCT pid=6598 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:04.863404 sshd[6598]: Accepted publickey for core from 4.153.228.146 port 55970 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:04.864648 kernel: audit: type=1101 audit(1768349764.858:1268): pid=6598 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:04.865000 audit[6598]: CRED_ACQ pid=6598 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:04.867734 sshd-session[6598]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:04.873799 kernel: audit: type=1103 audit(1768349764.865:1269): pid=6598 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:04.873867 kernel: audit: type=1006 audit(1768349764.865:1270): pid=6598 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=68 res=1 Jan 14 00:16:04.865000 audit[6598]: SYSCALL arch=c00000b7 syscall=64 
success=yes exit=3 a0=8 a1=ffffd2a300b0 a2=3 a3=0 items=0 ppid=1 pid=6598 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:04.876502 kernel: audit: type=1300 audit(1768349764.865:1270): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd2a300b0 a2=3 a3=0 items=0 ppid=1 pid=6598 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=68 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:04.865000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:04.877589 kernel: audit: type=1327 audit(1768349764.865:1270): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:04.880683 systemd-logind[1545]: New session 68 of user core. Jan 14 00:16:04.886889 systemd[1]: Started session-68.scope - Session 68 of User core. Jan 14 00:16:04.891000 audit[6598]: USER_START pid=6598 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:04.897055 kernel: audit: type=1105 audit(1768349764.891:1271): pid=6598 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:04.897160 kernel: audit: type=1103 audit(1768349764.896:1272): pid=6602 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:04.896000 audit[6602]: CRED_ACQ pid=6602 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:05.285634 sshd[6602]: Connection closed by 4.153.228.146 port 55970 Jan 14 00:16:05.286343 sshd-session[6598]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:05.287000 audit[6598]: USER_END pid=6598 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:05.291863 systemd[1]: sshd@67-46.224.77.139:22-4.153.228.146:55970.service: Deactivated successfully. 
Jan 14 00:16:05.287000 audit[6598]: CRED_DISP pid=6598 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:05.293602 kernel: audit: type=1106 audit(1768349765.287:1273): pid=6598 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:05.293688 kernel: audit: type=1104 audit(1768349765.287:1274): pid=6598 uid=0 auid=500 ses=68 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:05.292000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@67-46.224.77.139:22-4.153.228.146:55970 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:05.296259 systemd[1]: session-68.scope: Deactivated successfully. Jan 14 00:16:05.300187 systemd-logind[1545]: Session 68 logged out. Waiting for processes to exit. Jan 14 00:16:05.302484 systemd-logind[1545]: Removed session 68. Jan 14 00:16:06.335661 kubelet[2832]: E0114 00:16:06.335534 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:16:06.335661 kubelet[2832]: E0114 00:16:06.335625 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:16:07.335925 kubelet[2832]: E0114 00:16:07.335848 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" 
pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:16:10.335532 kubelet[2832]: E0114 00:16:10.334981 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:16:10.394000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-46.224.77.139:22-4.153.228.146:57720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:10.394889 systemd[1]: Started sshd@68-46.224.77.139:22-4.153.228.146:57720.service - OpenSSH per-connection server daemon (4.153.228.146:57720). Jan 14 00:16:10.397191 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:10.397248 kernel: audit: type=1130 audit(1768349770.394:1276): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-46.224.77.139:22-4.153.228.146:57720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:10.967000 audit[6613]: USER_ACCT pid=6613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:10.968748 sshd[6613]: Accepted publickey for core from 4.153.228.146 port 57720 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:10.971631 kernel: audit: type=1101 audit(1768349770.967:1277): pid=6613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:10.971000 audit[6613]: CRED_ACQ pid=6613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:10.973224 sshd-session[6613]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:10.975768 kernel: audit: type=1103 audit(1768349770.971:1278): pid=6613 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:10.975885 kernel: audit: type=1006 audit(1768349770.971:1279): pid=6613 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=69 res=1 Jan 14 00:16:10.971000 audit[6613]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe1cab130 a2=3 a3=0 items=0 ppid=1 pid=6613 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:10.978739 kernel: audit: type=1300 audit(1768349770.971:1279): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe1cab130 a2=3 a3=0 items=0 ppid=1 pid=6613 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=69 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:10.971000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:10.980535 kernel: audit: type=1327 audit(1768349770.971:1279): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:10.980246 systemd-logind[1545]: New session 69 of user core. Jan 14 00:16:10.986711 systemd[1]: Started session-69.scope - Session 69 of User core. Jan 14 00:16:10.990000 audit[6613]: USER_START pid=6613 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:10.995310 kernel: audit: type=1105 audit(1768349770.990:1280): pid=6613 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:10.995000 audit[6617]: CRED_ACQ pid=6617 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:10.998595 kernel: audit: type=1103 audit(1768349770.995:1281): pid=6617 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:11.334485 kubelet[2832]: E0114 00:16:11.334078 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:16:11.360010 sshd[6617]: Connection closed by 4.153.228.146 port 57720 Jan 14 00:16:11.361360 sshd-session[6613]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:11.368000 audit[6613]: USER_END pid=6613 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:11.368000 audit[6613]: CRED_DISP pid=6613 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 
addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:11.376449 kernel: audit: type=1106 audit(1768349771.368:1282): pid=6613 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:11.376701 kernel: audit: type=1104 audit(1768349771.368:1283): pid=6613 uid=0 auid=500 ses=69 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:11.375860 systemd[1]: sshd@68-46.224.77.139:22-4.153.228.146:57720.service: Deactivated successfully. Jan 14 00:16:11.376000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@68-46.224.77.139:22-4.153.228.146:57720 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:11.383514 systemd[1]: session-69.scope: Deactivated successfully. Jan 14 00:16:11.390507 systemd-logind[1545]: Session 69 logged out. Waiting for processes to exit. Jan 14 00:16:11.393513 systemd-logind[1545]: Removed session 69. Jan 14 00:16:15.334925 kubelet[2832]: E0114 00:16:15.334876 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:16:16.478196 systemd[1]: Started sshd@69-46.224.77.139:22-4.153.228.146:38432.service - OpenSSH per-connection server daemon (4.153.228.146:38432). Jan 14 00:16:16.480583 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:16.480663 kernel: audit: type=1130 audit(1768349776.477:1285): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-46.224.77.139:22-4.153.228.146:38432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:16.477000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-46.224.77.139:22-4.153.228.146:38432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:16:17.042000 audit[6629]: USER_ACCT pid=6629 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.046328 sshd[6629]: Accepted publickey for core from 4.153.228.146 port 38432 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:17.050125 kernel: audit: type=1101 audit(1768349777.042:1286): pid=6629 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.050226 kernel: audit: type=1103 audit(1768349777.045:1287): pid=6629 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.045000 audit[6629]: CRED_ACQ pid=6629 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.047404 sshd-session[6629]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:17.054536 kernel: audit: type=1006 audit(1768349777.046:1288): pid=6629 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=70 res=1 Jan 14 00:16:17.046000 audit[6629]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe0147dd0 a2=3 a3=0 items=0 ppid=1 pid=6629 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:17.046000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:17.059548 kernel: audit: type=1300 audit(1768349777.046:1288): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe0147dd0 a2=3 a3=0 items=0 ppid=1 pid=6629 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=70 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:17.059687 kernel: audit: type=1327 audit(1768349777.046:1288): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:17.064547 systemd-logind[1545]: New session 70 of user core. Jan 14 00:16:17.067882 systemd[1]: Started session-70.scope - Session 70 of User core. 
Jan 14 00:16:17.071000 audit[6629]: USER_START pid=6629 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.074000 audit[6633]: CRED_ACQ pid=6633 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.077336 kernel: audit: type=1105 audit(1768349777.071:1289): pid=6629 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.077393 kernel: audit: type=1103 audit(1768349777.074:1290): pid=6633 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.474682 sshd[6633]: Connection closed by 4.153.228.146 port 38432 Jan 14 00:16:17.477723 sshd-session[6629]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:17.479000 audit[6629]: USER_END pid=6629 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.484006 systemd[1]: sshd@69-46.224.77.139:22-4.153.228.146:38432.service: Deactivated successfully. Jan 14 00:16:17.486572 kernel: audit: type=1106 audit(1768349777.479:1291): pid=6629 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.479000 audit[6629]: CRED_DISP pid=6629 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.489342 systemd[1]: session-70.scope: Deactivated successfully. Jan 14 00:16:17.484000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@69-46.224.77.139:22-4.153.228.146:38432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:17.491562 kernel: audit: type=1104 audit(1768349777.479:1292): pid=6629 uid=0 auid=500 ses=70 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:17.492717 systemd-logind[1545]: Session 70 logged out. Waiting for processes to exit. Jan 14 00:16:17.496274 systemd-logind[1545]: Removed session 70. 
Jan 14 00:16:19.334153 kubelet[2832]: E0114 00:16:19.333859 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:16:20.338453 kubelet[2832]: E0114 00:16:20.338208 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:16:20.339957 kubelet[2832]: E0114 00:16:20.339895 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:16:21.333384 kubelet[2832]: E0114 00:16:21.333258 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:16:22.581216 systemd[1]: Started sshd@70-46.224.77.139:22-4.153.228.146:38438.service - OpenSSH per-connection server daemon (4.153.228.146:38438). Jan 14 00:16:22.585103 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:22.585170 kernel: audit: type=1130 audit(1768349782.581:1294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-46.224.77.139:22-4.153.228.146:38438 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:22.581000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-46.224.77.139:22-4.153.228.146:38438 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:16:23.141000 audit[6645]: USER_ACCT pid=6645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.144571 sshd[6645]: Accepted publickey for core from 4.153.228.146 port 38438 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:23.144000 audit[6645]: CRED_ACQ pid=6645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.147308 kernel: audit: type=1101 audit(1768349783.141:1295): pid=6645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.147383 kernel: audit: type=1103 audit(1768349783.144:1296): pid=6645 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.147887 sshd-session[6645]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:23.150435 kernel: audit: type=1006 audit(1768349783.144:1297): pid=6645 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=71 res=1 Jan 14 00:16:23.144000 audit[6645]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff26ce120 a2=3 a3=0 items=0 ppid=1 pid=6645 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:23.152967 kernel: audit: type=1300 audit(1768349783.144:1297): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff26ce120 a2=3 a3=0 items=0 ppid=1 pid=6645 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=71 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:23.153043 kernel: audit: type=1327 audit(1768349783.144:1297): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:23.144000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:23.160509 systemd-logind[1545]: New session 71 of user core. Jan 14 00:16:23.167758 systemd[1]: Started session-71.scope - Session 71 of User core. 
Jan 14 00:16:23.172000 audit[6645]: USER_START pid=6645 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.176000 audit[6649]: CRED_ACQ pid=6649 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.178680 kernel: audit: type=1105 audit(1768349783.172:1298): pid=6645 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.178763 kernel: audit: type=1103 audit(1768349783.176:1299): pid=6649 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.557130 sshd[6649]: Connection closed by 4.153.228.146 port 38438 Jan 14 00:16:23.556964 sshd-session[6645]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:23.562000 audit[6645]: USER_END pid=6645 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.566000 audit[6645]: CRED_DISP pid=6645 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.569921 kernel: audit: type=1106 audit(1768349783.562:1300): pid=6645 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.570040 kernel: audit: type=1104 audit(1768349783.566:1301): pid=6645 uid=0 auid=500 ses=71 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:23.572664 systemd-logind[1545]: Session 71 logged out. Waiting for processes to exit. Jan 14 00:16:23.573238 systemd[1]: sshd@70-46.224.77.139:22-4.153.228.146:38438.service: Deactivated successfully. Jan 14 00:16:23.573000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@70-46.224.77.139:22-4.153.228.146:38438 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:23.577881 systemd[1]: session-71.scope: Deactivated successfully. Jan 14 00:16:23.581419 systemd-logind[1545]: Removed session 71. 
Jan 14 00:16:25.333258 kubelet[2832]: E0114 00:16:25.333160 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:16:28.671647 systemd[1]: Started sshd@71-46.224.77.139:22-4.153.228.146:47638.service - OpenSSH per-connection server daemon (4.153.228.146:47638). Jan 14 00:16:28.674483 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:28.674547 kernel: audit: type=1130 audit(1768349788.670:1303): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-46.224.77.139:22-4.153.228.146:47638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:28.670000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-46.224.77.139:22-4.153.228.146:47638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:29.216000 audit[6683]: USER_ACCT pid=6683 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.220638 sshd[6683]: Accepted publickey for core from 4.153.228.146 port 47638 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:29.226914 kernel: audit: type=1101 audit(1768349789.216:1304): pid=6683 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.227137 kernel: audit: type=1103 audit(1768349789.221:1305): pid=6683 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.221000 audit[6683]: CRED_ACQ pid=6683 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.224250 sshd-session[6683]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:29.229847 kernel: audit: type=1006 audit(1768349789.221:1306): pid=6683 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=72 res=1 Jan 14 00:16:29.221000 audit[6683]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc505e5c0 a2=3 a3=0 items=0 ppid=1 pid=6683 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:29.233023 kernel: audit: type=1300 audit(1768349789.221:1306): arch=c00000b7 
syscall=64 success=yes exit=3 a0=8 a1=ffffc505e5c0 a2=3 a3=0 items=0 ppid=1 pid=6683 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=72 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:29.221000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:29.234827 kernel: audit: type=1327 audit(1768349789.221:1306): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:29.240624 systemd-logind[1545]: New session 72 of user core. Jan 14 00:16:29.245001 systemd[1]: Started session-72.scope - Session 72 of User core. Jan 14 00:16:29.248000 audit[6683]: USER_START pid=6683 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.254562 kernel: audit: type=1105 audit(1768349789.248:1307): pid=6683 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.253000 audit[6687]: CRED_ACQ pid=6687 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.258556 kernel: audit: type=1103 audit(1768349789.253:1308): pid=6687 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.336329 kubelet[2832]: E0114 00:16:29.335935 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:16:29.606010 sshd[6687]: Connection closed by 4.153.228.146 port 47638 Jan 14 00:16:29.609727 sshd-session[6683]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:29.610000 audit[6683]: USER_END pid=6683 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 
00:16:29.620183 kernel: audit: type=1106 audit(1768349789.610:1309): pid=6683 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.620291 kernel: audit: type=1104 audit(1768349789.613:1310): pid=6683 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.613000 audit[6683]: CRED_DISP pid=6683 uid=0 auid=500 ses=72 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:29.623274 systemd[1]: sshd@71-46.224.77.139:22-4.153.228.146:47638.service: Deactivated successfully. Jan 14 00:16:29.622000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@71-46.224.77.139:22-4.153.228.146:47638 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:29.627364 systemd[1]: session-72.scope: Deactivated successfully. Jan 14 00:16:29.630707 systemd-logind[1545]: Session 72 logged out. Waiting for processes to exit. Jan 14 00:16:29.632840 systemd-logind[1545]: Removed session 72. Jan 14 00:16:31.333446 kubelet[2832]: E0114 00:16:31.333377 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:16:33.333992 kubelet[2832]: E0114 00:16:33.333822 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:16:34.726491 systemd[1]: Started sshd@72-46.224.77.139:22-4.153.228.146:34718.service - OpenSSH per-connection server daemon (4.153.228.146:34718). Jan 14 00:16:34.731007 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:34.731110 kernel: audit: type=1130 audit(1768349794.725:1312): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-46.224.77.139:22-4.153.228.146:34718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:16:34.725000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-46.224.77.139:22-4.153.228.146:34718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:35.303000 audit[6725]: USER_ACCT pid=6725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.307707 sshd[6725]: Accepted publickey for core from 4.153.228.146 port 34718 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:35.306000 audit[6725]: CRED_ACQ pid=6725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.310669 kernel: audit: type=1101 audit(1768349795.303:1313): pid=6725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.310743 kernel: audit: type=1103 audit(1768349795.306:1314): pid=6725 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.309570 sshd-session[6725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:35.312587 kernel: audit: type=1006 audit(1768349795.306:1315): pid=6725 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=73 res=1 Jan 14 00:16:35.306000 audit[6725]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff7a44fa0 a2=3 a3=0 items=0 ppid=1 pid=6725 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:35.315709 kernel: audit: type=1300 audit(1768349795.306:1315): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff7a44fa0 a2=3 a3=0 items=0 ppid=1 pid=6725 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=73 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:35.316717 kernel: audit: type=1327 audit(1768349795.306:1315): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:35.306000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:35.319764 systemd-logind[1545]: New session 73 of user core. Jan 14 00:16:35.324908 systemd[1]: Started session-73.scope - Session 73 of User core. 
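[Editor's note] The type=1327 PROCTITLE records above carry the process title hex-encoded. A minimal sketch (not part of the logged system) that reads one back, using the exact value from the records above:

# Minimal sketch: decode the hex-encoded proctitle field from the
# type=1327 PROCTITLE audit records above. In general, NUL bytes in the
# decoded value separate argv entries; this particular value has none.
raw = bytes.fromhex("737368642D73657373696F6E3A20636F7265205B707269765D")
print(raw.replace(b"\x00", b" ").decode())   # prints: sshd-session: core [priv]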
Jan 14 00:16:35.327000 audit[6725]: USER_START pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.334244 kubelet[2832]: E0114 00:16:35.333929 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:16:35.332000 audit[6729]: CRED_ACQ pid=6729 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.337200 kernel: audit: type=1105 audit(1768349795.327:1316): pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.337419 kernel: audit: type=1103 audit(1768349795.332:1317): pid=6729 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.337460 kubelet[2832]: E0114 00:16:35.337250 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:16:35.720574 sshd[6729]: Connection closed by 4.153.228.146 port 34718 Jan 14 00:16:35.722329 sshd-session[6725]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:35.723000 audit[6725]: USER_END pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.729383 systemd[1]: sshd@72-46.224.77.139:22-4.153.228.146:34718.service: Deactivated 
successfully. Jan 14 00:16:35.723000 audit[6725]: CRED_DISP pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.732601 kernel: audit: type=1106 audit(1768349795.723:1318): pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.732703 kernel: audit: type=1104 audit(1768349795.723:1319): pid=6725 uid=0 auid=500 ses=73 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:35.728000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@72-46.224.77.139:22-4.153.228.146:34718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:35.736679 systemd[1]: session-73.scope: Deactivated successfully. Jan 14 00:16:35.738820 systemd-logind[1545]: Session 73 logged out. Waiting for processes to exit. Jan 14 00:16:35.741338 systemd-logind[1545]: Removed session 73. Jan 14 00:16:36.335693 kubelet[2832]: E0114 00:16:36.335341 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:16:40.337607 kubelet[2832]: E0114 00:16:40.336632 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:16:40.820000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-46.224.77.139:22-4.153.228.146:34734 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:40.821973 systemd[1]: Started sshd@73-46.224.77.139:22-4.153.228.146:34734.service - OpenSSH per-connection server daemon (4.153.228.146:34734). 
Jan 14 00:16:40.824240 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:40.824292 kernel: audit: type=1130 audit(1768349800.820:1321): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-46.224.77.139:22-4.153.228.146:34734 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:41.357000 audit[6742]: USER_ACCT pid=6742 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.361350 sshd[6742]: Accepted publickey for core from 4.153.228.146 port 34734 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:41.360000 audit[6742]: CRED_ACQ pid=6742 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.363100 sshd-session[6742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:41.364319 kernel: audit: type=1101 audit(1768349801.357:1322): pid=6742 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.364375 kernel: audit: type=1103 audit(1768349801.360:1323): pid=6742 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.367034 kernel: audit: type=1006 audit(1768349801.360:1324): pid=6742 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=74 res=1 Jan 14 00:16:41.370089 kernel: audit: type=1300 audit(1768349801.360:1324): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd0d9c520 a2=3 a3=0 items=0 ppid=1 pid=6742 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:41.360000 audit[6742]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd0d9c520 a2=3 a3=0 items=0 ppid=1 pid=6742 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=74 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:41.360000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:41.372897 kernel: audit: type=1327 audit(1768349801.360:1324): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:41.375268 systemd-logind[1545]: New session 74 of user core. Jan 14 00:16:41.380732 systemd[1]: Started session-74.scope - Session 74 of User core. 
Jan 14 00:16:41.383000 audit[6742]: USER_START pid=6742 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.388833 kernel: audit: type=1105 audit(1768349801.383:1325): pid=6742 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.388911 kernel: audit: type=1103 audit(1768349801.387:1326): pid=6746 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.387000 audit[6746]: CRED_ACQ pid=6746 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.733901 sshd[6746]: Connection closed by 4.153.228.146 port 34734 Jan 14 00:16:41.734888 sshd-session[6742]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:41.738000 audit[6742]: USER_END pid=6742 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.739000 audit[6742]: CRED_DISP pid=6742 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.745009 kernel: audit: type=1106 audit(1768349801.738:1327): pid=6742 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.745097 kernel: audit: type=1104 audit(1768349801.739:1328): pid=6742 uid=0 auid=500 ses=74 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:41.745334 systemd[1]: sshd@73-46.224.77.139:22-4.153.228.146:34734.service: Deactivated successfully. Jan 14 00:16:41.744000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@73-46.224.77.139:22-4.153.228.146:34734 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:41.748317 systemd[1]: session-74.scope: Deactivated successfully. Jan 14 00:16:41.751362 systemd-logind[1545]: Session 74 logged out. Waiting for processes to exit. Jan 14 00:16:41.752684 systemd-logind[1545]: Removed session 74. 
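[Editor's note] The kubelet records above and below keep cycling through the same ImagePullBackOff for the ghcr.io/flatcar/calico images, with containerd reporting each tag as not found. A minimal, hypothetical sketch for checking a tag directly against the registry over the standard OCI distribution API; it assumes ghcr.io's anonymous token endpoint and a public repository, and is not the cluster's own tooling:

# Minimal sketch: ask the registry whether the tag the kubelet keeps
# retrying actually exists. A 404 here corresponds to the "not found"
# errors in the journal above.
import json, urllib.error, urllib.request

def tag_exists(repo: str, tag: str) -> bool:
    token = json.load(urllib.request.urlopen(
        f"https://ghcr.io/token?service=ghcr.io&scope=repository:{repo}:pull"))["token"]
    req = urllib.request.Request(
        f"https://ghcr.io/v2/{repo}/manifests/{tag}",
        headers={"Authorization": f"Bearer {token}",
                 "Accept": "application/vnd.oci.image.index.v1+json, "
                           "application/vnd.docker.distribution.manifest.list.v2+json, "
                           "application/vnd.docker.distribution.manifest.v2+json"})
    try:
        urllib.request.urlopen(req)
        return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise

print(tag_exists("flatcar/calico/apiserver", "v3.30.4"))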
Jan 14 00:16:46.335416 kubelet[2832]: E0114 00:16:46.335045 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:16:46.843926 systemd[1]: Started sshd@74-46.224.77.139:22-4.153.228.146:55596.service - OpenSSH per-connection server daemon (4.153.228.146:55596). Jan 14 00:16:46.846673 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:46.846766 kernel: audit: type=1130 audit(1768349806.842:1330): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-46.224.77.139:22-4.153.228.146:55596 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:46.842000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-46.224.77.139:22-4.153.228.146:55596 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:47.335426 kubelet[2832]: E0114 00:16:47.334816 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:16:47.338645 kubelet[2832]: E0114 00:16:47.338586 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:16:47.377000 audit[6760]: USER_ACCT pid=6760 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.382722 sshd[6760]: Accepted publickey for core from 4.153.228.146 port 55596 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:47.383183 sshd-session[6760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:47.380000 audit[6760]: CRED_ACQ pid=6760 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.385620 kernel: audit: type=1101 audit(1768349807.377:1331): pid=6760 
uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.385700 kernel: audit: type=1103 audit(1768349807.380:1332): pid=6760 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.387006 kernel: audit: type=1006 audit(1768349807.380:1333): pid=6760 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=75 res=1 Jan 14 00:16:47.380000 audit[6760]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc82c1760 a2=3 a3=0 items=0 ppid=1 pid=6760 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:47.391870 kernel: audit: type=1300 audit(1768349807.380:1333): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc82c1760 a2=3 a3=0 items=0 ppid=1 pid=6760 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=75 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:47.392004 kernel: audit: type=1327 audit(1768349807.380:1333): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:47.380000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:47.398816 systemd-logind[1545]: New session 75 of user core. Jan 14 00:16:47.406701 systemd[1]: Started session-75.scope - Session 75 of User core. 
Jan 14 00:16:47.410000 audit[6760]: USER_START pid=6760 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.415576 kernel: audit: type=1105 audit(1768349807.410:1334): pid=6760 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.419000 audit[6769]: CRED_ACQ pid=6769 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.423654 kernel: audit: type=1103 audit(1768349807.419:1335): pid=6769 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.763671 sshd[6769]: Connection closed by 4.153.228.146 port 55596 Jan 14 00:16:47.764381 sshd-session[6760]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:47.766000 audit[6760]: USER_END pid=6760 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.766000 audit[6760]: CRED_DISP pid=6760 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.773023 systemd[1]: sshd@74-46.224.77.139:22-4.153.228.146:55596.service: Deactivated successfully. Jan 14 00:16:47.775599 kernel: audit: type=1106 audit(1768349807.766:1336): pid=6760 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.775747 kernel: audit: type=1104 audit(1768349807.766:1337): pid=6760 uid=0 auid=500 ses=75 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:47.771000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@74-46.224.77.139:22-4.153.228.146:55596 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:47.777481 systemd[1]: session-75.scope: Deactivated successfully. Jan 14 00:16:47.781439 systemd-logind[1545]: Session 75 logged out. Waiting for processes to exit. Jan 14 00:16:47.782980 systemd-logind[1545]: Removed session 75. 
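[Editor's note] Each SSH connection above leaves a matched pair of kernel-echoed audit records: type=1105 at PAM session open and type=1106 at session close, both tagged with the same ses= id. A minimal sketch (assuming one audit record per input line) that pairs them to show how long each session stayed open:

# Minimal sketch: pair type=1105 (session open) with type=1106 (session
# close) by ses= id. Epoch seconds come from the audit(EPOCH:SERIAL) stamp.
import re, sys

rec = re.compile(r"type=(1105|1106) audit\((\d+\.\d+):\d+\).*?\bses=(\d+)")
opened = {}
for line in sys.stdin:
    m = rec.search(line)
    if not m:
        continue
    kind, when, ses = m.group(1), float(m.group(2)), m.group(3)
    if kind == "1105":
        opened[ses] = when
    elif ses in opened:
        print(f"session {ses}: open for {when - opened.pop(ses):.1f}s")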
Jan 14 00:16:49.334056 kubelet[2832]: E0114 00:16:49.333878 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:16:49.335638 kubelet[2832]: E0114 00:16:49.334937 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:16:52.879373 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:52.879489 kernel: audit: type=1130 audit(1768349812.875:1339): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-46.224.77.139:22-4.153.228.146:55604 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:52.875000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-46.224.77.139:22-4.153.228.146:55604 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:52.876816 systemd[1]: Started sshd@75-46.224.77.139:22-4.153.228.146:55604.service - OpenSSH per-connection server daemon (4.153.228.146:55604). 
Jan 14 00:16:53.439000 audit[6781]: USER_ACCT pid=6781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.447115 sshd[6781]: Accepted publickey for core from 4.153.228.146 port 55604 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:53.447546 kernel: audit: type=1101 audit(1768349813.439:1340): pid=6781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.446000 audit[6781]: CRED_ACQ pid=6781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.455423 kernel: audit: type=1103 audit(1768349813.446:1341): pid=6781 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.455585 kernel: audit: type=1006 audit(1768349813.446:1342): pid=6781 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=76 res=1 Jan 14 00:16:53.450685 sshd-session[6781]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:53.460324 kernel: audit: type=1300 audit(1768349813.446:1342): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd09b7f00 a2=3 a3=0 items=0 ppid=1 pid=6781 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:53.446000 audit[6781]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd09b7f00 a2=3 a3=0 items=0 ppid=1 pid=6781 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=76 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:53.465686 kernel: audit: type=1327 audit(1768349813.446:1342): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:53.446000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:53.466186 systemd-logind[1545]: New session 76 of user core. Jan 14 00:16:53.474123 systemd[1]: Started session-76.scope - Session 76 of User core. 
Jan 14 00:16:53.478000 audit[6781]: USER_START pid=6781 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.484542 kernel: audit: type=1105 audit(1768349813.478:1343): pid=6781 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.484000 audit[6787]: CRED_ACQ pid=6787 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.490551 kernel: audit: type=1103 audit(1768349813.484:1344): pid=6787 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.836025 sshd[6787]: Connection closed by 4.153.228.146 port 55604 Jan 14 00:16:53.836512 sshd-session[6781]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:53.837000 audit[6781]: USER_END pid=6781 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.840000 audit[6781]: CRED_DISP pid=6781 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.845139 kernel: audit: type=1106 audit(1768349813.837:1345): pid=6781 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.845195 kernel: audit: type=1104 audit(1768349813.840:1346): pid=6781 uid=0 auid=500 ses=76 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:53.847045 systemd[1]: sshd@75-46.224.77.139:22-4.153.228.146:55604.service: Deactivated successfully. Jan 14 00:16:53.847000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@75-46.224.77.139:22-4.153.228.146:55604 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:53.850970 systemd[1]: session-76.scope: Deactivated successfully. Jan 14 00:16:53.852620 systemd-logind[1545]: Session 76 logged out. Waiting for processes to exit. Jan 14 00:16:53.855122 systemd-logind[1545]: Removed session 76. 
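[Editor's note] The same "Error syncing pod, skipping" entries recur for a handful of pods throughout this stretch of the journal. A short sketch that summarizes which pods are stuck by counting the pod="..." field; it works on the raw journal text regardless of line wrapping:

# Minimal sketch: count the pod="..." field of the kubelet error records
# and print the stuck pods, most frequent first.
import re, sys
from collections import Counter

pods = Counter(re.findall(r'pod="([^"]+)"', sys.stdin.read()))
for pod, n in pods.most_common():
    print(f"{n:4d}  {pod}")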
Jan 14 00:16:54.337287 kubelet[2832]: E0114 00:16:54.337070 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:16:57.333881 kubelet[2832]: E0114 00:16:57.333563 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:16:58.334380 kubelet[2832]: E0114 00:16:58.334313 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:16:58.946813 systemd[1]: Started sshd@76-46.224.77.139:22-4.153.228.146:37880.service - OpenSSH per-connection server daemon (4.153.228.146:37880). Jan 14 00:16:58.945000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-46.224.77.139:22-4.153.228.146:37880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:58.949327 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:16:58.949414 kernel: audit: type=1130 audit(1768349818.945:1348): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-46.224.77.139:22-4.153.228.146:37880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:16:59.483000 audit[6800]: USER_ACCT pid=6800 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.487352 sshd[6800]: Accepted publickey for core from 4.153.228.146 port 37880 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:16:59.488809 sshd-session[6800]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:16:59.486000 audit[6800]: CRED_ACQ pid=6800 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.491121 kernel: audit: type=1101 audit(1768349819.483:1349): pid=6800 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.491195 kernel: audit: type=1103 audit(1768349819.486:1350): pid=6800 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.493012 kernel: audit: type=1006 audit(1768349819.486:1351): pid=6800 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=77 res=1 Jan 14 00:16:59.486000 audit[6800]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe8c074e0 a2=3 a3=0 items=0 ppid=1 pid=6800 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:59.495293 kernel: audit: type=1300 audit(1768349819.486:1351): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe8c074e0 a2=3 a3=0 items=0 ppid=1 pid=6800 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=77 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:16:59.486000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:59.501588 kernel: audit: type=1327 audit(1768349819.486:1351): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:16:59.504959 systemd-logind[1545]: New session 77 of user core. Jan 14 00:16:59.509291 systemd[1]: Started session-77.scope - Session 77 of User core. 
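[Editor's note] Every "Accepted publickey" record above reports the same RSA key by its SHA256 fingerprint. OpenSSH derives that string as the unpadded base64 of the SHA-256 digest of the raw key blob. A minimal sketch that recomputes it from a public key line; the input is hypothetical and not taken from this host:

# Minimal sketch: pass the base64 key field of a public key line
# (the "AAAA..." token from authorized_keys or id_rsa.pub) as argv[1].
import base64, hashlib, sys

blob = base64.b64decode(sys.argv[1])
digest = hashlib.sha256(blob).digest()
print("SHA256:" + base64.b64encode(digest).decode().rstrip("="))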
Jan 14 00:16:59.513000 audit[6800]: USER_START pid=6800 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.517000 audit[6804]: CRED_ACQ pid=6804 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.521299 kernel: audit: type=1105 audit(1768349819.513:1352): pid=6800 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.521377 kernel: audit: type=1103 audit(1768349819.517:1353): pid=6804 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.863548 sshd[6804]: Connection closed by 4.153.228.146 port 37880 Jan 14 00:16:59.864233 sshd-session[6800]: pam_unix(sshd:session): session closed for user core Jan 14 00:16:59.864000 audit[6800]: USER_END pid=6800 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.864000 audit[6800]: CRED_DISP pid=6800 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.873149 kernel: audit: type=1106 audit(1768349819.864:1354): pid=6800 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.873203 kernel: audit: type=1104 audit(1768349819.864:1355): pid=6800 uid=0 auid=500 ses=77 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:16:59.872321 systemd-logind[1545]: Session 77 logged out. Waiting for processes to exit. Jan 14 00:16:59.873982 systemd[1]: sshd@76-46.224.77.139:22-4.153.228.146:37880.service: Deactivated successfully. Jan 14 00:16:59.872000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@76-46.224.77.139:22-4.153.228.146:37880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:16:59.876535 systemd[1]: session-77.scope: Deactivated successfully. Jan 14 00:16:59.881490 systemd-logind[1545]: Removed session 77. 
Jan 14 00:17:00.337557 kubelet[2832]: E0114 00:17:00.337392 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:17:02.335575 kubelet[2832]: E0114 00:17:02.335459 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:17:02.337140 kubelet[2832]: E0114 00:17:02.336615 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:17:04.969000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-46.224.77.139:22-4.153.228.146:43252 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:04.972116 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:17:04.972270 kernel: audit: type=1130 audit(1768349824.969:1357): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-46.224.77.139:22-4.153.228.146:43252 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:04.971034 systemd[1]: Started sshd@77-46.224.77.139:22-4.153.228.146:43252.service - OpenSSH per-connection server daemon (4.153.228.146:43252). 
Jan 14 00:17:05.335610 kubelet[2832]: E0114 00:17:05.335399 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:17:05.524000 audit[6848]: USER_ACCT pid=6848 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.527825 sshd[6848]: Accepted publickey for core from 4.153.228.146 port 43252 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:05.528552 kernel: audit: type=1101 audit(1768349825.524:1358): pid=6848 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.527000 audit[6848]: CRED_ACQ pid=6848 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.530437 sshd-session[6848]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:05.535100 kernel: audit: type=1103 audit(1768349825.527:1359): pid=6848 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.535217 kernel: audit: type=1006 audit(1768349825.527:1360): pid=6848 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=78 res=1 Jan 14 00:17:05.527000 audit[6848]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd14ca280 a2=3 a3=0 items=0 ppid=1 pid=6848 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:05.537472 kernel: audit: type=1300 audit(1768349825.527:1360): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd14ca280 a2=3 a3=0 items=0 ppid=1 pid=6848 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=78 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:05.527000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:05.539179 kernel: audit: type=1327 audit(1768349825.527:1360): 
proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:05.543034 systemd-logind[1545]: New session 78 of user core. Jan 14 00:17:05.558243 systemd[1]: Started session-78.scope - Session 78 of User core. Jan 14 00:17:05.562000 audit[6848]: USER_START pid=6848 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.566874 kernel: audit: type=1105 audit(1768349825.562:1361): pid=6848 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.568000 audit[6852]: CRED_ACQ pid=6852 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.572566 kernel: audit: type=1103 audit(1768349825.568:1362): pid=6852 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.928266 sshd[6852]: Connection closed by 4.153.228.146 port 43252 Jan 14 00:17:05.929025 sshd-session[6848]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:05.928000 audit[6848]: USER_END pid=6848 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.929000 audit[6848]: CRED_DISP pid=6848 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.936832 kernel: audit: type=1106 audit(1768349825.928:1363): pid=6848 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.936905 kernel: audit: type=1104 audit(1768349825.929:1364): pid=6848 uid=0 auid=500 ses=78 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:05.936205 systemd[1]: sshd@77-46.224.77.139:22-4.153.228.146:43252.service: Deactivated successfully. Jan 14 00:17:05.934000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@77-46.224.77.139:22-4.153.228.146:43252 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:17:05.940295 systemd[1]: session-78.scope: Deactivated successfully. Jan 14 00:17:05.946995 systemd-logind[1545]: Session 78 logged out. Waiting for processes to exit. Jan 14 00:17:05.949181 systemd-logind[1545]: Removed session 78. Jan 14 00:17:10.334483 kubelet[2832]: E0114 00:17:10.334161 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:17:11.039948 systemd[1]: Started sshd@78-46.224.77.139:22-4.153.228.146:43268.service - OpenSSH per-connection server daemon (4.153.228.146:43268). Jan 14 00:17:11.038000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-46.224.77.139:22-4.153.228.146:43268 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:11.042675 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:17:11.042807 kernel: audit: type=1130 audit(1768349831.038:1366): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-46.224.77.139:22-4.153.228.146:43268 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:11.334111 kubelet[2832]: E0114 00:17:11.333976 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:17:11.590154 sshd[6863]: Accepted publickey for core from 4.153.228.146 port 43268 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:11.588000 audit[6863]: USER_ACCT pid=6863 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.594583 kernel: audit: type=1101 audit(1768349831.588:1367): pid=6863 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.594681 kernel: audit: type=1103 audit(1768349831.592:1368): pid=6863 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.592000 audit[6863]: CRED_ACQ pid=6863 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.594467 sshd-session[6863]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:11.597213 kernel: audit: type=1006 audit(1768349831.592:1369): pid=6863 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=79 res=1 Jan 14 00:17:11.592000 audit[6863]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffee0dd4c0 a2=3 a3=0 items=0 ppid=1 pid=6863 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:11.599912 kernel: audit: type=1300 audit(1768349831.592:1369): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffee0dd4c0 a2=3 a3=0 items=0 ppid=1 pid=6863 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=79 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:11.592000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:11.600682 kernel: audit: type=1327 audit(1768349831.592:1369): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:11.607094 systemd-logind[1545]: New session 79 of user core. Jan 14 00:17:11.611721 systemd[1]: Started session-79.scope - Session 79 of User core. Jan 14 00:17:11.613000 audit[6863]: USER_START pid=6863 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.615000 audit[6867]: CRED_ACQ pid=6867 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.621299 kernel: audit: type=1105 audit(1768349831.613:1370): pid=6863 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.621387 kernel: audit: type=1103 audit(1768349831.615:1371): pid=6867 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.974258 sshd[6867]: Connection closed by 4.153.228.146 port 43268 Jan 14 00:17:11.974766 sshd-session[6863]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:11.977000 audit[6863]: USER_END pid=6863 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.984207 systemd[1]: 
sshd@78-46.224.77.139:22-4.153.228.146:43268.service: Deactivated successfully. Jan 14 00:17:11.978000 audit[6863]: CRED_DISP pid=6863 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.987867 kernel: audit: type=1106 audit(1768349831.977:1372): pid=6863 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.987947 kernel: audit: type=1104 audit(1768349831.978:1373): pid=6863 uid=0 auid=500 ses=79 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:11.988257 systemd[1]: session-79.scope: Deactivated successfully. Jan 14 00:17:11.984000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@78-46.224.77.139:22-4.153.228.146:43268 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:11.990931 systemd-logind[1545]: Session 79 logged out. Waiting for processes to exit. Jan 14 00:17:11.992412 systemd-logind[1545]: Removed session 79. Jan 14 00:17:12.301596 update_engine[1548]: I20260114 00:17:12.299618 1548 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jan 14 00:17:12.301596 update_engine[1548]: I20260114 00:17:12.299671 1548 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jan 14 00:17:12.301596 update_engine[1548]: I20260114 00:17:12.299968 1548 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jan 14 00:17:12.301987 update_engine[1548]: I20260114 00:17:12.301898 1548 omaha_request_params.cc:62] Current group set to alpha Jan 14 00:17:12.304034 update_engine[1548]: I20260114 00:17:12.303824 1548 update_attempter.cc:499] Already updated boot flags. Skipping. Jan 14 00:17:12.304034 update_engine[1548]: I20260114 00:17:12.303862 1548 update_attempter.cc:643] Scheduling an action processor start. 
Jan 14 00:17:12.304034 update_engine[1548]: I20260114 00:17:12.303881 1548 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 14 00:17:12.312170 update_engine[1548]: I20260114 00:17:12.307825 1548 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jan 14 00:17:12.312170 update_engine[1548]: I20260114 00:17:12.307929 1548 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 14 00:17:12.312170 update_engine[1548]: I20260114 00:17:12.307938 1548 omaha_request_action.cc:272] Request: Jan 14 00:17:12.312170 update_engine[1548]: Jan 14 00:17:12.312170 update_engine[1548]: Jan 14 00:17:12.312170 update_engine[1548]: Jan 14 00:17:12.312170 update_engine[1548]: Jan 14 00:17:12.312170 update_engine[1548]: Jan 14 00:17:12.312170 update_engine[1548]: Jan 14 00:17:12.312170 update_engine[1548]: Jan 14 00:17:12.312170 update_engine[1548]: Jan 14 00:17:12.312170 update_engine[1548]: I20260114 00:17:12.307945 1548 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 00:17:12.312170 update_engine[1548]: I20260114 00:17:12.311569 1548 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 00:17:12.312509 update_engine[1548]: I20260114 00:17:12.312361 1548 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 00:17:12.312813 locksmithd[1607]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jan 14 00:17:12.313832 update_engine[1548]: E20260114 00:17:12.313499 1548 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 00:17:12.313832 update_engine[1548]: I20260114 00:17:12.313617 1548 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jan 14 00:17:14.335820 kubelet[2832]: E0114 00:17:14.335764 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:17:15.335267 kubelet[2832]: E0114 00:17:15.335066 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:17:16.335133 kubelet[2832]: E0114 00:17:16.335047 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:17:17.084925 systemd[1]: Started sshd@79-46.224.77.139:22-4.153.228.146:51878.service - OpenSSH per-connection server daemon (4.153.228.146:51878). Jan 14 00:17:17.083000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-46.224.77.139:22-4.153.228.146:51878 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:17.087748 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:17:17.087813 kernel: audit: type=1130 audit(1768349837.083:1375): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-46.224.77.139:22-4.153.228.146:51878 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:17.630000 audit[6880]: USER_ACCT pid=6880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:17.634102 sshd[6880]: Accepted publickey for core from 4.153.228.146 port 51878 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:17.635035 kernel: audit: type=1101 audit(1768349837.630:1376): pid=6880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:17.634000 audit[6880]: CRED_ACQ pid=6880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:17.636796 sshd-session[6880]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:17.638754 kernel: audit: type=1103 audit(1768349837.634:1377): pid=6880 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:17.638935 kernel: audit: type=1006 audit(1768349837.634:1378): pid=6880 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=80 res=1 Jan 14 00:17:17.638973 kernel: audit: type=1300 audit(1768349837.634:1378): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff4f05c00 a2=3 a3=0 items=0 ppid=1 pid=6880 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:17.634000 audit[6880]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff4f05c00 a2=3 a3=0 items=0 ppid=1 pid=6880 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=80 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:17.641578 kernel: audit: type=1327 audit(1768349837.634:1378): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:17.634000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:17.644448 systemd-logind[1545]: New session 80 of user core. Jan 14 00:17:17.651062 systemd[1]: Started session-80.scope - Session 80 of User core. Jan 14 00:17:17.654000 audit[6880]: USER_START pid=6880 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:17.661000 audit[6884]: CRED_ACQ pid=6884 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:17.667008 kernel: audit: type=1105 audit(1768349837.654:1379): pid=6880 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:17.667133 kernel: audit: type=1103 audit(1768349837.661:1380): pid=6884 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:18.031907 sshd[6884]: Connection closed by 4.153.228.146 port 51878 Jan 14 00:17:18.035230 sshd-session[6880]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:18.036000 audit[6880]: USER_END pid=6880 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:18.037000 audit[6880]: CRED_DISP pid=6880 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:18.043659 kernel: audit: type=1106 audit(1768349838.036:1381): pid=6880 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:18.043739 kernel: audit: type=1104 audit(1768349838.037:1382): pid=6880 uid=0 auid=500 ses=80 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:18.044350 systemd[1]: sshd@79-46.224.77.139:22-4.153.228.146:51878.service: Deactivated successfully. 
Jan 14 00:17:18.044000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@79-46.224.77.139:22-4.153.228.146:51878 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:18.049707 systemd[1]: session-80.scope: Deactivated successfully. Jan 14 00:17:18.052711 systemd-logind[1545]: Session 80 logged out. Waiting for processes to exit. Jan 14 00:17:18.057012 systemd-logind[1545]: Removed session 80. Jan 14 00:17:20.343291 kubelet[2832]: E0114 00:17:20.343153 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:17:22.303673 update_engine[1548]: I20260114 00:17:22.303586 1548 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 00:17:22.304388 update_engine[1548]: I20260114 00:17:22.303704 1548 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 00:17:22.304818 update_engine[1548]: I20260114 00:17:22.304733 1548 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 00:17:22.305108 update_engine[1548]: E20260114 00:17:22.305071 1548 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 00:17:22.305183 update_engine[1548]: I20260114 00:17:22.305164 1548 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jan 14 00:17:23.157020 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:17:23.157149 kernel: audit: type=1130 audit(1768349843.153:1384): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-46.224.77.139:22-4.153.228.146:51894 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:23.153000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-46.224.77.139:22-4.153.228.146:51894 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:23.154647 systemd[1]: Started sshd@80-46.224.77.139:22-4.153.228.146:51894.service - OpenSSH per-connection server daemon (4.153.228.146:51894). 
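The update_engine entries above show an Omaha request being posted to the literal host "disabled", which curl then fails to resolve ("Could not resolve host: disabled") and periodically retries. That pattern is consistent with an update server deliberately set to a non-URL placeholder rather than with a network fault. A minimal sketch of surfacing this from the node's update configuration, assuming the conventional /etc/flatcar/update.conf location and a SERVER=disabled line (both the path and the exact contents are assumptions, not taken from this log):

    from urllib.parse import urlparse

    def omaha_server(conf_text: str) -> str:
        # Return the SERVER= value from an update.conf-style file, if present.
        for line in conf_text.splitlines():
            line = line.strip()
            if line.startswith("SERVER="):
                return line.split("=", 1)[1].strip()
        return ""

    # Hypothetical contents; on a real node this would be read from
    # /etc/flatcar/update.conf (the path is an assumption).
    conf = "GROUP=alpha\nSERVER=disabled\n"
    server = omaha_server(conf)
    if urlparse(server).scheme not in ("http", "https"):
        print(f"Omaha checks effectively disabled: SERVER={server!r} is not an http(s) URL")
    else:
        print(f"Omaha server: {server}")

Read this way, the recurring "No HTTP response, retry N" lines are expected noise rather than an update failure worth chasing.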
Jan 14 00:17:23.335192 kubelet[2832]: E0114 00:17:23.335133 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:17:23.335640 kubelet[2832]: E0114 00:17:23.335486 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:17:23.717000 audit[6896]: USER_ACCT pid=6896 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:23.723738 sshd[6896]: Accepted publickey for core from 4.153.228.146 port 51894 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:23.725114 sshd-session[6896]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:23.722000 audit[6896]: CRED_ACQ pid=6896 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:23.727396 kernel: audit: type=1101 audit(1768349843.717:1385): pid=6896 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:23.727853 kernel: audit: type=1103 audit(1768349843.722:1386): pid=6896 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:23.727891 kernel: audit: type=1006 audit(1768349843.722:1387): pid=6896 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=81 res=1 Jan 14 00:17:23.722000 audit[6896]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc6a9fd40 a2=3 a3=0 items=0 ppid=1 pid=6896 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:23.731173 kernel: audit: type=1300 audit(1768349843.722:1387): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc6a9fd40 a2=3 a3=0 items=0 ppid=1 pid=6896 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=81 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:23.731566 kernel: audit: type=1327 audit(1768349843.722:1387): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:23.722000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:23.737373 systemd-logind[1545]: New session 81 of user core. Jan 14 00:17:23.743745 systemd[1]: Started session-81.scope - Session 81 of User core. Jan 14 00:17:23.745000 audit[6896]: USER_START pid=6896 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:23.755559 kernel: audit: type=1105 audit(1768349843.745:1388): pid=6896 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:23.754000 audit[6902]: CRED_ACQ pid=6902 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:23.758552 kernel: audit: type=1103 audit(1768349843.754:1389): pid=6902 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:24.146677 sshd[6902]: Connection closed by 4.153.228.146 port 51894 Jan 14 00:17:24.147065 sshd-session[6896]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:24.149000 audit[6896]: USER_END pid=6896 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:24.149000 audit[6896]: CRED_DISP pid=6896 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:24.156786 systemd[1]: sshd@80-46.224.77.139:22-4.153.228.146:51894.service: Deactivated successfully. 
Jan 14 00:17:24.157598 kernel: audit: type=1106 audit(1768349844.149:1390): pid=6896 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:24.157692 kernel: audit: type=1104 audit(1768349844.149:1391): pid=6896 uid=0 auid=500 ses=81 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:24.154000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@80-46.224.77.139:22-4.153.228.146:51894 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:24.159778 systemd[1]: session-81.scope: Deactivated successfully. Jan 14 00:17:24.161017 systemd-logind[1545]: Session 81 logged out. Waiting for processes to exit. Jan 14 00:17:24.162970 systemd-logind[1545]: Removed session 81. Jan 14 00:17:24.253579 systemd[1]: Started sshd@81-46.224.77.139:22-4.153.228.146:51906.service - OpenSSH per-connection server daemon (4.153.228.146:51906). Jan 14 00:17:24.252000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-46.224.77.139:22-4.153.228.146:51906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:24.800000 audit[6914]: USER_ACCT pid=6914 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:24.802711 sshd[6914]: Accepted publickey for core from 4.153.228.146 port 51906 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:24.803000 audit[6914]: CRED_ACQ pid=6914 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:24.803000 audit[6914]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffffab4840 a2=3 a3=0 items=0 ppid=1 pid=6914 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=82 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:24.803000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:24.806228 sshd-session[6914]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:24.814974 systemd-logind[1545]: New session 82 of user core. Jan 14 00:17:24.821288 systemd[1]: Started session-82.scope - Session 82 of User core. 
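Every kubelet ImagePullBackOff in this log points at a ghcr.io/flatcar/calico/* image whose v3.30.4 tag the registry reports as not found, so the quickest check is against the registry itself rather than the node. A minimal sketch, assuming the repositories are public and that GHCR's anonymous token endpoint (https://ghcr.io/token) plus the standard Registry v2 manifest route apply (both are assumptions, not confirmed by this log):

    import json
    import urllib.error
    import urllib.request

    def ghcr_tag_exists(repo: str, tag: str) -> bool:
        # Anonymous pull token for a public repository (assumed endpoint behaviour).
        with urllib.request.urlopen(
            f"https://ghcr.io/token?scope=repository:{repo}:pull"
        ) as resp:
            token = json.load(resp)["token"]
        req = urllib.request.Request(
            f"https://ghcr.io/v2/{repo}/manifests/{tag}",
            headers={
                "Authorization": f"Bearer {token}",
                "Accept": "application/vnd.oci.image.index.v1+json, "
                          "application/vnd.docker.distribution.manifest.list.v2+json",
            },
            method="HEAD",
        )
        try:
            with urllib.request.urlopen(req) as resp:
                return resp.status == 200
        except urllib.error.HTTPError as err:
            if err.code == 404:
                return False
            raise

    print(ghcr_tag_exists("flatcar/calico/whisker", "v3.30.4"))

A 404 here (as opposed to an authentication or connection error) would match the containerd "failed to resolve image ... not found" messages above and point at the tag itself rather than at node-side networking or credentials.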
Jan 14 00:17:24.824000 audit[6914]: USER_START pid=6914 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:24.827000 audit[6918]: CRED_ACQ pid=6918 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:25.360381 sshd[6918]: Connection closed by 4.153.228.146 port 51906 Jan 14 00:17:25.360250 sshd-session[6914]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:25.362000 audit[6914]: USER_END pid=6914 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:25.363000 audit[6914]: CRED_DISP pid=6914 uid=0 auid=500 ses=82 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:25.369342 systemd-logind[1545]: Session 82 logged out. Waiting for processes to exit. Jan 14 00:17:25.369960 systemd[1]: sshd@81-46.224.77.139:22-4.153.228.146:51906.service: Deactivated successfully. Jan 14 00:17:25.368000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@81-46.224.77.139:22-4.153.228.146:51906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:25.372777 systemd[1]: session-82.scope: Deactivated successfully. Jan 14 00:17:25.377298 systemd-logind[1545]: Removed session 82. Jan 14 00:17:25.475060 systemd[1]: Started sshd@82-46.224.77.139:22-4.153.228.146:52508.service - OpenSSH per-connection server daemon (4.153.228.146:52508). Jan 14 00:17:25.473000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-46.224.77.139:22-4.153.228.146:52508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:17:26.032254 sshd[6928]: Accepted publickey for core from 4.153.228.146 port 52508 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:26.030000 audit[6928]: USER_ACCT pid=6928 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:26.033000 audit[6928]: CRED_ACQ pid=6928 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:26.033000 audit[6928]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdf774510 a2=3 a3=0 items=0 ppid=1 pid=6928 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=83 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:26.033000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:26.035967 sshd-session[6928]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:26.041562 systemd-logind[1545]: New session 83 of user core. Jan 14 00:17:26.049885 systemd[1]: Started session-83.scope - Session 83 of User core. Jan 14 00:17:26.053000 audit[6928]: USER_START pid=6928 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:26.056000 audit[6932]: CRED_ACQ pid=6932 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:26.339128 kubelet[2832]: E0114 00:17:26.338918 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:17:27.081000 audit[6950]: NETFILTER_CFG table=filter:137 family=2 entries=14 op=nft_register_rule pid=6950 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:17:27.081000 audit[6950]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd192e6d0 a2=0 a3=1 items=0 ppid=2942 pid=6950 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:27.081000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:17:27.086000 audit[6950]: NETFILTER_CFG table=nat:138 family=2 entries=20 op=nft_register_rule pid=6950 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:17:27.086000 audit[6950]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffd192e6d0 a2=0 a3=1 items=0 ppid=2942 pid=6950 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:27.086000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:17:27.132161 sshd[6932]: Connection closed by 4.153.228.146 port 52508 Jan 14 00:17:27.132807 sshd-session[6928]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:27.134000 audit[6928]: USER_END pid=6928 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:27.134000 audit[6928]: CRED_DISP pid=6928 uid=0 auid=500 ses=83 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:27.140690 systemd-logind[1545]: Session 83 logged out. Waiting for processes to exit. Jan 14 00:17:27.141368 systemd[1]: sshd@82-46.224.77.139:22-4.153.228.146:52508.service: Deactivated successfully. Jan 14 00:17:27.143000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@82-46.224.77.139:22-4.153.228.146:52508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:27.148858 systemd[1]: session-83.scope: Deactivated successfully. Jan 14 00:17:27.152465 systemd-logind[1545]: Removed session 83. Jan 14 00:17:27.244583 systemd[1]: Started sshd@83-46.224.77.139:22-4.153.228.146:52514.service - OpenSSH per-connection server daemon (4.153.228.146:52514). Jan 14 00:17:27.243000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-46.224.77.139:22-4.153.228.146:52514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:17:27.246000 audit[6955]: NETFILTER_CFG table=filter:139 family=2 entries=26 op=nft_register_rule pid=6955 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:17:27.246000 audit[6955]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffd721aab0 a2=0 a3=1 items=0 ppid=2942 pid=6955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:27.246000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:17:27.254000 audit[6955]: NETFILTER_CFG table=nat:140 family=2 entries=20 op=nft_register_rule pid=6955 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 14 00:17:27.254000 audit[6955]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffd721aab0 a2=0 a3=1 items=0 ppid=2942 pid=6955 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:27.254000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Jan 14 00:17:27.333878 kubelet[2832]: E0114 00:17:27.333679 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:17:27.789000 audit[6957]: USER_ACCT pid=6957 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:27.791176 sshd[6957]: Accepted publickey for core from 4.153.228.146 port 52514 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:27.792000 audit[6957]: CRED_ACQ pid=6957 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:27.792000 audit[6957]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd2973c80 a2=3 a3=0 items=0 ppid=1 pid=6957 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=84 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:27.792000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:27.794650 sshd-session[6957]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:27.801227 systemd-logind[1545]: New session 84 of user core. Jan 14 00:17:27.806904 systemd[1]: Started session-84.scope - Session 84 of User core. 
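The PROCTITLE values in the audit records above are hex-encoded command lines whose arguments are separated by NUL bytes; decoding them shows exactly what was executed. A small sketch using only that encoding (the two sample values are copied verbatim from the records above):

    def decode_proctitle(hexstr: str) -> str:
        # auditd logs proctitle as hex bytes; argv elements are NUL-separated.
        return bytes.fromhex(hexstr).replace(b"\x00", b" ").decode("utf-8", "replace")

    print(decode_proctitle("737368642D73657373696F6E3A20636F7265205B707269765D"))
    # -> sshd-session: core [priv]

    print(decode_proctitle(
        "69707461626C65732D726573746F7265002D770035002D5700313030303030"
        "002D2D6E6F666C757368002D2D636F756E74657273"
    ))
    # -> iptables-restore -w 5 -W 100000 --noflush --counters

The second value shows the NETFILTER_CFG events come from scripted iptables-restore -w 5 -W 100000 --noflush --counters runs (flags typical of an automated proxier) rather than from someone editing rules by hand.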
Jan 14 00:17:27.810000 audit[6957]: USER_START pid=6957 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:27.812000 audit[6961]: CRED_ACQ pid=6961 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:28.337107 kubelet[2832]: E0114 00:17:28.337042 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:17:28.364046 sshd[6961]: Connection closed by 4.153.228.146 port 52514 Jan 14 00:17:28.365716 sshd-session[6957]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:28.365000 audit[6957]: USER_END pid=6957 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:28.369772 kernel: kauditd_printk_skb: 43 callbacks suppressed Jan 14 00:17:28.369882 kernel: audit: type=1106 audit(1768349848.365:1421): pid=6957 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:28.365000 audit[6957]: CRED_DISP pid=6957 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:28.374488 kernel: audit: type=1104 audit(1768349848.365:1422): pid=6957 uid=0 auid=500 ses=84 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:28.371349 systemd[1]: sshd@83-46.224.77.139:22-4.153.228.146:52514.service: Deactivated successfully. Jan 14 00:17:28.375094 systemd[1]: session-84.scope: Deactivated successfully. Jan 14 00:17:28.370000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-46.224.77.139:22-4.153.228.146:52514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:17:28.377475 kernel: audit: type=1131 audit(1768349848.370:1423): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@83-46.224.77.139:22-4.153.228.146:52514 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:28.381499 systemd-logind[1545]: Session 84 logged out. Waiting for processes to exit. Jan 14 00:17:28.383389 systemd-logind[1545]: Removed session 84. Jan 14 00:17:28.472000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-46.224.77.139:22-4.153.228.146:52522 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:28.474199 systemd[1]: Started sshd@84-46.224.77.139:22-4.153.228.146:52522.service - OpenSSH per-connection server daemon (4.153.228.146:52522). Jan 14 00:17:28.479589 kernel: audit: type=1130 audit(1768349848.472:1424): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-46.224.77.139:22-4.153.228.146:52522 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:29.018000 audit[6970]: USER_ACCT pid=6970 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:29.025549 kernel: audit: type=1101 audit(1768349849.018:1425): pid=6970 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:29.025669 kernel: audit: type=1103 audit(1768349849.021:1426): pid=6970 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:29.021000 audit[6970]: CRED_ACQ pid=6970 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:29.025785 sshd[6970]: Accepted publickey for core from 4.153.228.146 port 52522 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:29.028165 kernel: audit: type=1006 audit(1768349849.021:1427): pid=6970 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=85 res=1 Jan 14 00:17:29.026874 sshd-session[6970]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:29.031390 kernel: audit: type=1300 audit(1768349849.021:1427): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd5ab66d0 a2=3 a3=0 items=0 ppid=1 pid=6970 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=85 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:29.021000 audit[6970]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd5ab66d0 a2=3 a3=0 items=0 ppid=1 pid=6970 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=85 comm="sshd-session" 
exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:29.021000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:29.032728 kernel: audit: type=1327 audit(1768349849.021:1427): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:29.037648 systemd-logind[1545]: New session 85 of user core. Jan 14 00:17:29.042746 systemd[1]: Started session-85.scope - Session 85 of User core. Jan 14 00:17:29.044000 audit[6970]: USER_START pid=6970 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:29.050553 kernel: audit: type=1105 audit(1768349849.044:1428): pid=6970 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:29.049000 audit[6974]: CRED_ACQ pid=6974 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:29.428565 sshd[6974]: Connection closed by 4.153.228.146 port 52522 Jan 14 00:17:29.429236 sshd-session[6970]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:29.430000 audit[6970]: USER_END pid=6970 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:29.431000 audit[6970]: CRED_DISP pid=6970 uid=0 auid=500 ses=85 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:29.438488 systemd[1]: sshd@84-46.224.77.139:22-4.153.228.146:52522.service: Deactivated successfully. Jan 14 00:17:29.438000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@84-46.224.77.139:22-4.153.228.146:52522 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:29.442989 systemd[1]: session-85.scope: Deactivated successfully. Jan 14 00:17:29.445557 systemd-logind[1545]: Session 85 logged out. Waiting for processes to exit. Jan 14 00:17:29.446733 systemd-logind[1545]: Removed session 85. Jan 14 00:17:32.301658 update_engine[1548]: I20260114 00:17:32.301564 1548 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 00:17:32.301658 update_engine[1548]: I20260114 00:17:32.301656 1548 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 00:17:32.302446 update_engine[1548]: I20260114 00:17:32.302031 1548 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 14 00:17:32.302446 update_engine[1548]: E20260114 00:17:32.302327 1548 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 00:17:32.302446 update_engine[1548]: I20260114 00:17:32.302389 1548 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jan 14 00:17:34.548635 kernel: kauditd_printk_skb: 4 callbacks suppressed Jan 14 00:17:34.548768 kernel: audit: type=1130 audit(1768349854.545:1433): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-46.224.77.139:22-4.153.228.146:52524 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:34.545000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-46.224.77.139:22-4.153.228.146:52524 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:34.547353 systemd[1]: Started sshd@85-46.224.77.139:22-4.153.228.146:52524.service - OpenSSH per-connection server daemon (4.153.228.146:52524). Jan 14 00:17:35.118559 kernel: audit: type=1101 audit(1768349855.113:1434): pid=7009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.113000 audit[7009]: USER_ACCT pid=7009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.118755 sshd[7009]: Accepted publickey for core from 4.153.228.146 port 52524 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:35.118000 audit[7009]: CRED_ACQ pid=7009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.123263 sshd-session[7009]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:35.125709 kernel: audit: type=1103 audit(1768349855.118:1435): pid=7009 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.125794 kernel: audit: type=1006 audit(1768349855.118:1436): pid=7009 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=86 res=1 Jan 14 00:17:35.118000 audit[7009]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc165e040 a2=3 a3=0 items=0 ppid=1 pid=7009 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:35.128926 kernel: audit: type=1300 audit(1768349855.118:1436): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc165e040 a2=3 a3=0 items=0 ppid=1 pid=7009 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=86 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) 
Jan 14 00:17:35.118000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:35.131644 kernel: audit: type=1327 audit(1768349855.118:1436): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:35.137922 systemd-logind[1545]: New session 86 of user core. Jan 14 00:17:35.144025 systemd[1]: Started session-86.scope - Session 86 of User core. Jan 14 00:17:35.145000 audit[7009]: USER_START pid=7009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.156147 kernel: audit: type=1105 audit(1768349855.145:1437): pid=7009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.156260 kernel: audit: type=1103 audit(1768349855.150:1438): pid=7014 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.150000 audit[7014]: CRED_ACQ pid=7014 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.496752 sshd[7014]: Connection closed by 4.153.228.146 port 52524 Jan 14 00:17:35.498380 sshd-session[7009]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:35.499000 audit[7009]: USER_END pid=7009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.505470 systemd[1]: sshd@85-46.224.77.139:22-4.153.228.146:52524.service: Deactivated successfully. 
Jan 14 00:17:35.507440 kernel: audit: type=1106 audit(1768349855.499:1439): pid=7009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.507498 kernel: audit: type=1104 audit(1768349855.499:1440): pid=7009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.499000 audit[7009]: CRED_DISP pid=7009 uid=0 auid=500 ses=86 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:35.506000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@85-46.224.77.139:22-4.153.228.146:52524 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:35.510789 systemd[1]: session-86.scope: Deactivated successfully. Jan 14 00:17:35.512287 systemd-logind[1545]: Session 86 logged out. Waiting for processes to exit. Jan 14 00:17:35.514159 systemd-logind[1545]: Removed session 86. Jan 14 00:17:36.335259 kubelet[2832]: E0114 00:17:36.334811 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:17:36.336794 kubelet[2832]: E0114 00:17:36.336725 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:17:38.339910 kubelet[2832]: E0114 00:17:38.339805 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:17:39.334384 kubelet[2832]: E0114 00:17:39.334287 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:17:40.335668 kubelet[2832]: E0114 00:17:40.335514 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:17:40.609925 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:17:40.610020 kernel: audit: type=1130 audit(1768349860.605:1442): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-46.224.77.139:22-4.153.228.146:52130 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:40.605000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-46.224.77.139:22-4.153.228.146:52130 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:40.606789 systemd[1]: Started sshd@86-46.224.77.139:22-4.153.228.146:52130.service - OpenSSH per-connection server daemon (4.153.228.146:52130). 
Jan 14 00:17:41.161273 sshd[7026]: Accepted publickey for core from 4.153.228.146 port 52130 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:41.159000 audit[7026]: USER_ACCT pid=7026 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.164911 sshd-session[7026]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:41.169106 kernel: audit: type=1101 audit(1768349861.159:1443): pid=7026 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.169175 kernel: audit: type=1103 audit(1768349861.162:1444): pid=7026 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.162000 audit[7026]: CRED_ACQ pid=7026 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.170626 kernel: audit: type=1006 audit(1768349861.162:1445): pid=7026 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=87 res=1 Jan 14 00:17:41.162000 audit[7026]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffa59ab80 a2=3 a3=0 items=0 ppid=1 pid=7026 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:41.173723 kernel: audit: type=1300 audit(1768349861.162:1445): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffa59ab80 a2=3 a3=0 items=0 ppid=1 pid=7026 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=87 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:41.177614 kernel: audit: type=1327 audit(1768349861.162:1445): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:41.162000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:41.182860 systemd-logind[1545]: New session 87 of user core. Jan 14 00:17:41.185918 systemd[1]: Started session-87.scope - Session 87 of User core. 
Jan 14 00:17:41.190000 audit[7026]: USER_START pid=7026 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.194000 audit[7031]: CRED_ACQ pid=7031 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.198375 kernel: audit: type=1105 audit(1768349861.190:1446): pid=7026 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.198465 kernel: audit: type=1103 audit(1768349861.194:1447): pid=7031 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.569186 sshd[7031]: Connection closed by 4.153.228.146 port 52130 Jan 14 00:17:41.569976 sshd-session[7026]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:41.570000 audit[7026]: USER_END pid=7026 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.573000 audit[7026]: CRED_DISP pid=7026 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.579962 kernel: audit: type=1106 audit(1768349861.570:1448): pid=7026 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.580013 kernel: audit: type=1104 audit(1768349861.573:1449): pid=7026 uid=0 auid=500 ses=87 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:41.580142 systemd[1]: session-87.scope: Deactivated successfully. Jan 14 00:17:41.580955 systemd[1]: sshd@86-46.224.77.139:22-4.153.228.146:52130.service: Deactivated successfully. Jan 14 00:17:41.579000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@86-46.224.77.139:22-4.153.228.146:52130 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:41.587267 systemd-logind[1545]: Session 87 logged out. Waiting for processes to exit. Jan 14 00:17:41.590082 systemd-logind[1545]: Removed session 87. 
Jan 14 00:17:42.299697 update_engine[1548]: I20260114 00:17:42.299597 1548 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 00:17:42.299697 update_engine[1548]: I20260114 00:17:42.299688 1548 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 00:17:42.300386 update_engine[1548]: I20260114 00:17:42.300062 1548 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jan 14 00:17:42.300694 update_engine[1548]: E20260114 00:17:42.300621 1548 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 00:17:42.300694 update_engine[1548]: I20260114 00:17:42.300690 1548 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 14 00:17:42.300694 update_engine[1548]: I20260114 00:17:42.300700 1548 omaha_request_action.cc:617] Omaha request response: Jan 14 00:17:42.301510 update_engine[1548]: E20260114 00:17:42.300773 1548 omaha_request_action.cc:636] Omaha request network transfer failed. Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300806 1548 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300810 1548 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300815 1548 update_attempter.cc:306] Processing Done. Jan 14 00:17:42.301510 update_engine[1548]: E20260114 00:17:42.300828 1548 update_attempter.cc:619] Update failed. Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300832 1548 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300836 1548 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300841 1548 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300920 1548 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300943 1548 omaha_request_action.cc:271] Posting an Omaha request to disabled Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300948 1548 omaha_request_action.cc:272] Request: Jan 14 00:17:42.301510 update_engine[1548]: Jan 14 00:17:42.301510 update_engine[1548]: Jan 14 00:17:42.301510 update_engine[1548]: Jan 14 00:17:42.301510 update_engine[1548]: Jan 14 00:17:42.301510 update_engine[1548]: Jan 14 00:17:42.301510 update_engine[1548]: Jan 14 00:17:42.301510 update_engine[1548]: I20260114 00:17:42.300954 1548 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jan 14 00:17:42.303852 update_engine[1548]: I20260114 00:17:42.300971 1548 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jan 14 00:17:42.303852 update_engine[1548]: I20260114 00:17:42.301201 1548 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jan 14 00:17:42.303852 update_engine[1548]: E20260114 00:17:42.302006 1548 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Jan 14 00:17:42.303852 update_engine[1548]: I20260114 00:17:42.302059 1548 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Jan 14 00:17:42.303852 update_engine[1548]: I20260114 00:17:42.302067 1548 omaha_request_action.cc:617] Omaha request response: Jan 14 00:17:42.303852 update_engine[1548]: I20260114 00:17:42.302073 1548 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 00:17:42.303852 update_engine[1548]: I20260114 00:17:42.302076 1548 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Jan 14 00:17:42.303852 update_engine[1548]: I20260114 00:17:42.302080 1548 update_attempter.cc:306] Processing Done. Jan 14 00:17:42.303852 update_engine[1548]: I20260114 00:17:42.302085 1548 update_attempter.cc:310] Error event sent. Jan 14 00:17:42.303852 update_engine[1548]: I20260114 00:17:42.302093 1548 update_check_scheduler.cc:74] Next update check in 42m40s Jan 14 00:17:42.305376 locksmithd[1607]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Jan 14 00:17:42.305376 locksmithd[1607]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Jan 14 00:17:43.334104 kubelet[2832]: E0114 00:17:43.334064 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:17:46.684764 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:17:46.684876 kernel: audit: type=1130 audit(1768349866.678:1451): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-46.224.77.139:22-4.153.228.146:37942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:46.678000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-46.224.77.139:22-4.153.228.146:37942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:46.680196 systemd[1]: Started sshd@87-46.224.77.139:22-4.153.228.146:37942.service - OpenSSH per-connection server daemon (4.153.228.146:37942). 
Jan 14 00:17:47.220000 audit[7045]: USER_ACCT pid=7045 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.224556 kernel: audit: type=1101 audit(1768349867.220:1452): pid=7045 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.224846 sshd[7045]: Accepted publickey for core from 4.153.228.146 port 37942 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:47.224000 audit[7045]: CRED_ACQ pid=7045 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.228296 sshd-session[7045]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:47.231137 kernel: audit: type=1103 audit(1768349867.224:1453): pid=7045 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.231220 kernel: audit: type=1006 audit(1768349867.224:1454): pid=7045 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=88 res=1 Jan 14 00:17:47.224000 audit[7045]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdf3b3ef0 a2=3 a3=0 items=0 ppid=1 pid=7045 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:47.234393 kernel: audit: type=1300 audit(1768349867.224:1454): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdf3b3ef0 a2=3 a3=0 items=0 ppid=1 pid=7045 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=88 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:47.224000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:47.237535 kernel: audit: type=1327 audit(1768349867.224:1454): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:47.244595 systemd-logind[1545]: New session 88 of user core. Jan 14 00:17:47.252287 systemd[1]: Started session-88.scope - Session 88 of User core. 
Jan 14 00:17:47.256000 audit[7045]: USER_START pid=7045 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.259000 audit[7050]: CRED_ACQ pid=7050 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.263589 kernel: audit: type=1105 audit(1768349867.256:1455): pid=7045 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.263694 kernel: audit: type=1103 audit(1768349867.259:1456): pid=7050 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.640693 sshd[7050]: Connection closed by 4.153.228.146 port 37942 Jan 14 00:17:47.643795 sshd-session[7045]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:47.644000 audit[7045]: USER_END pid=7045 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.644000 audit[7045]: CRED_DISP pid=7045 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.650344 kernel: audit: type=1106 audit(1768349867.644:1457): pid=7045 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.650379 kernel: audit: type=1104 audit(1768349867.644:1458): pid=7045 uid=0 auid=500 ses=88 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:47.652156 systemd[1]: sshd@87-46.224.77.139:22-4.153.228.146:37942.service: Deactivated successfully. Jan 14 00:17:47.652000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@87-46.224.77.139:22-4.153.228.146:37942 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:47.657430 systemd[1]: session-88.scope: Deactivated successfully. Jan 14 00:17:47.662754 systemd-logind[1545]: Session 88 logged out. Waiting for processes to exit. Jan 14 00:17:47.664540 systemd-logind[1545]: Removed session 88. 
Jan 14 00:17:48.334391 kubelet[2832]: E0114 00:17:48.334232 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:17:50.335135 kubelet[2832]: E0114 00:17:50.334741 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:17:51.334286 kubelet[2832]: E0114 00:17:51.334051 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:17:51.335952 kubelet[2832]: E0114 00:17:51.335894 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:17:52.765241 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:17:52.765352 kernel: audit: type=1130 audit(1768349872.760:1460): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-46.224.77.139:22-4.153.228.146:37946 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:52.760000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-46.224.77.139:22-4.153.228.146:37946 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:52.761418 systemd[1]: Started sshd@88-46.224.77.139:22-4.153.228.146:37946.service - OpenSSH per-connection server daemon (4.153.228.146:37946). 
Jan 14 00:17:53.326986 sshd[7061]: Accepted publickey for core from 4.153.228.146 port 37946 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:53.325000 audit[7061]: USER_ACCT pid=7061 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.333329 kernel: audit: type=1101 audit(1768349873.325:1461): pid=7061 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.333427 kernel: audit: type=1103 audit(1768349873.329:1462): pid=7061 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.329000 audit[7061]: CRED_ACQ pid=7061 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.331738 sshd-session[7061]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:53.334987 kernel: audit: type=1006 audit(1768349873.329:1463): pid=7061 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=89 res=1 Jan 14 00:17:53.329000 audit[7061]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe4af4bf0 a2=3 a3=0 items=0 ppid=1 pid=7061 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:53.339667 kernel: audit: type=1300 audit(1768349873.329:1463): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe4af4bf0 a2=3 a3=0 items=0 ppid=1 pid=7061 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=89 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:53.329000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:53.342547 kernel: audit: type=1327 audit(1768349873.329:1463): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:53.348077 kubelet[2832]: E0114 00:17:53.345280 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" 
pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:17:53.346624 systemd-logind[1545]: New session 89 of user core. Jan 14 00:17:53.354069 systemd[1]: Started session-89.scope - Session 89 of User core. Jan 14 00:17:53.359000 audit[7061]: USER_START pid=7061 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.363000 audit[7068]: CRED_ACQ pid=7068 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.368977 kernel: audit: type=1105 audit(1768349873.359:1464): pid=7061 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.369068 kernel: audit: type=1103 audit(1768349873.363:1465): pid=7068 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.752947 sshd[7068]: Connection closed by 4.153.228.146 port 37946 Jan 14 00:17:53.753607 sshd-session[7061]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:53.755000 audit[7061]: USER_END pid=7061 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.761621 systemd[1]: sshd@88-46.224.77.139:22-4.153.228.146:37946.service: Deactivated successfully. Jan 14 00:17:53.755000 audit[7061]: CRED_DISP pid=7061 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.764608 kernel: audit: type=1106 audit(1768349873.755:1466): pid=7061 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.764684 kernel: audit: type=1104 audit(1768349873.755:1467): pid=7061 uid=0 auid=500 ses=89 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:53.766986 systemd[1]: session-89.scope: Deactivated successfully. 
Jan 14 00:17:53.760000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@88-46.224.77.139:22-4.153.228.146:37946 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:53.771439 systemd-logind[1545]: Session 89 logged out. Waiting for processes to exit. Jan 14 00:17:53.773451 systemd-logind[1545]: Removed session 89. Jan 14 00:17:57.334814 kubelet[2832]: E0114 00:17:57.334761 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:17:58.869848 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:17:58.869930 kernel: audit: type=1130 audit(1768349878.867:1469): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-46.224.77.139:22-4.153.228.146:55048 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:58.867000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-46.224.77.139:22-4.153.228.146:55048 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:58.868245 systemd[1]: Started sshd@89-46.224.77.139:22-4.153.228.146:55048.service - OpenSSH per-connection server daemon (4.153.228.146:55048). 
Jan 14 00:17:59.335552 kubelet[2832]: E0114 00:17:59.335509 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:17:59.443000 audit[7080]: USER_ACCT pid=7080 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.447169 sshd[7080]: Accepted publickey for core from 4.153.228.146 port 55048 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:17:59.446000 audit[7080]: CRED_ACQ pid=7080 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.449890 kernel: audit: type=1101 audit(1768349879.443:1470): pid=7080 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.449994 kernel: audit: type=1103 audit(1768349879.446:1471): pid=7080 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.451943 kernel: audit: type=1006 audit(1768349879.446:1472): pid=7080 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=90 res=1 Jan 14 00:17:59.451462 sshd-session[7080]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:17:59.446000 audit[7080]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd09eaa20 a2=3 a3=0 items=0 ppid=1 pid=7080 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:59.455733 kernel: audit: type=1300 audit(1768349879.446:1472): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd09eaa20 a2=3 a3=0 items=0 ppid=1 pid=7080 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=90 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:17:59.446000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:59.457244 kernel: audit: type=1327 audit(1768349879.446:1472): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:17:59.459840 systemd-logind[1545]: New session 90 of user core. Jan 14 00:17:59.464819 systemd[1]: Started session-90.scope - Session 90 of User core. 
Jan 14 00:17:59.468000 audit[7080]: USER_START pid=7080 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.472000 audit[7084]: CRED_ACQ pid=7084 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.475847 kernel: audit: type=1105 audit(1768349879.468:1473): pid=7080 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.475922 kernel: audit: type=1103 audit(1768349879.472:1474): pid=7084 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.870637 sshd[7084]: Connection closed by 4.153.228.146 port 55048 Jan 14 00:17:59.873711 sshd-session[7080]: pam_unix(sshd:session): session closed for user core Jan 14 00:17:59.875000 audit[7080]: USER_END pid=7080 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.880551 kernel: audit: type=1106 audit(1768349879.875:1475): pid=7080 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.875000 audit[7080]: CRED_DISP pid=7080 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.882817 systemd[1]: sshd@89-46.224.77.139:22-4.153.228.146:55048.service: Deactivated successfully. Jan 14 00:17:59.882958 systemd-logind[1545]: Session 90 logged out. Waiting for processes to exit. Jan 14 00:17:59.884571 kernel: audit: type=1104 audit(1768349879.875:1476): pid=7080 uid=0 auid=500 ses=90 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:17:59.884000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@89-46.224.77.139:22-4.153.228.146:55048 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:17:59.886330 systemd[1]: session-90.scope: Deactivated successfully. Jan 14 00:17:59.891167 systemd-logind[1545]: Removed session 90. 
Jan 14 00:18:03.334506 kubelet[2832]: E0114 00:18:03.334430 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:18:04.337583 kubelet[2832]: E0114 00:18:04.337489 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:18:04.339008 kubelet[2832]: E0114 00:18:04.338502 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:18:04.988840 systemd[1]: Started sshd@90-46.224.77.139:22-4.153.228.146:47212.service - OpenSSH per-connection server daemon (4.153.228.146:47212). Jan 14 00:18:04.992366 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:04.992416 kernel: audit: type=1130 audit(1768349884.988:1478): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-46.224.77.139:22-4.153.228.146:47212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:04.988000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-46.224.77.139:22-4.153.228.146:47212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:18:05.341292 kubelet[2832]: E0114 00:18:05.341214 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:18:05.527000 audit[7142]: USER_ACCT pid=7142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.530959 sshd[7142]: Accepted publickey for core from 4.153.228.146 port 47212 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:18:05.531600 kernel: audit: type=1101 audit(1768349885.527:1479): pid=7142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.533000 audit[7142]: CRED_ACQ pid=7142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.536852 sshd-session[7142]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:18:05.539267 kernel: audit: type=1103 audit(1768349885.533:1480): pid=7142 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.539351 kernel: audit: type=1006 audit(1768349885.535:1481): pid=7142 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=91 res=1 Jan 14 00:18:05.541650 kernel: audit: type=1300 audit(1768349885.535:1481): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd980d60 a2=3 a3=0 items=0 ppid=1 pid=7142 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:05.535000 audit[7142]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd980d60 a2=3 a3=0 items=0 ppid=1 pid=7142 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=91 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:05.535000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:05.543605 kernel: audit: type=1327 
audit(1768349885.535:1481): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:05.549833 systemd-logind[1545]: New session 91 of user core. Jan 14 00:18:05.553735 systemd[1]: Started session-91.scope - Session 91 of User core. Jan 14 00:18:05.558000 audit[7142]: USER_START pid=7142 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.561000 audit[7146]: CRED_ACQ pid=7146 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.566511 kernel: audit: type=1105 audit(1768349885.558:1482): pid=7142 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.566574 kernel: audit: type=1103 audit(1768349885.561:1483): pid=7146 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.906100 sshd[7146]: Connection closed by 4.153.228.146 port 47212 Jan 14 00:18:05.906878 sshd-session[7142]: pam_unix(sshd:session): session closed for user core Jan 14 00:18:05.908000 audit[7142]: USER_END pid=7142 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.909000 audit[7142]: CRED_DISP pid=7142 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.917069 kernel: audit: type=1106 audit(1768349885.908:1484): pid=7142 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.917135 kernel: audit: type=1104 audit(1768349885.909:1485): pid=7142 uid=0 auid=500 ses=91 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:05.918078 systemd[1]: sshd@90-46.224.77.139:22-4.153.228.146:47212.service: Deactivated successfully. Jan 14 00:18:05.919000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@90-46.224.77.139:22-4.153.228.146:47212 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:18:05.924298 systemd[1]: session-91.scope: Deactivated successfully. Jan 14 00:18:05.926154 systemd-logind[1545]: Session 91 logged out. Waiting for processes to exit. Jan 14 00:18:05.928889 systemd-logind[1545]: Removed session 91. Jan 14 00:18:09.333736 kubelet[2832]: E0114 00:18:09.333576 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:18:11.022922 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:11.023046 kernel: audit: type=1130 audit(1768349891.019:1487): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-46.224.77.139:22-4.153.228.146:47222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:11.019000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-46.224.77.139:22-4.153.228.146:47222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:11.019870 systemd[1]: Started sshd@91-46.224.77.139:22-4.153.228.146:47222.service - OpenSSH per-connection server daemon (4.153.228.146:47222). Jan 14 00:18:11.335513 kubelet[2832]: E0114 00:18:11.335479 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:18:11.580000 audit[7158]: USER_ACCT pid=7158 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.583674 sshd[7158]: Accepted publickey for core from 4.153.228.146 port 47222 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:18:11.585572 kernel: audit: type=1101 audit(1768349891.580:1488): pid=7158 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.585666 kernel: audit: type=1103 audit(1768349891.583:1489): pid=7158 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.583000 audit[7158]: CRED_ACQ pid=7158 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.585410 sshd-session[7158]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:18:11.590375 kernel: audit: type=1006 audit(1768349891.584:1490): pid=7158 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=92 res=1 Jan 14 00:18:11.590469 kernel: audit: type=1300 audit(1768349891.584:1490): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe501eba0 a2=3 a3=0 items=0 ppid=1 pid=7158 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:11.584000 audit[7158]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe501eba0 a2=3 a3=0 items=0 ppid=1 pid=7158 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=92 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:11.584000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:11.593789 kernel: audit: type=1327 audit(1768349891.584:1490): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:11.598595 systemd-logind[1545]: New session 92 of user core. Jan 14 00:18:11.604718 systemd[1]: Started session-92.scope - Session 92 of User core. Jan 14 00:18:11.608000 audit[7158]: USER_START pid=7158 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.614473 kernel: audit: type=1105 audit(1768349891.608:1491): pid=7158 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.614583 kernel: audit: type=1103 audit(1768349891.613:1492): pid=7162 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.613000 audit[7162]: CRED_ACQ pid=7162 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.960438 sshd[7162]: Connection closed by 4.153.228.146 port 47222 Jan 14 00:18:11.960278 sshd-session[7158]: pam_unix(sshd:session): session closed for user core Jan 14 00:18:11.962000 audit[7158]: USER_END pid=7158 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.962000 audit[7158]: CRED_DISP pid=7158 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.969371 kernel: audit: type=1106 audit(1768349891.962:1493): pid=7158 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.969459 kernel: audit: type=1104 audit(1768349891.962:1494): pid=7158 uid=0 auid=500 ses=92 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:11.969888 systemd[1]: sshd@91-46.224.77.139:22-4.153.228.146:47222.service: Deactivated successfully. Jan 14 00:18:11.970000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@91-46.224.77.139:22-4.153.228.146:47222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:11.975562 systemd[1]: session-92.scope: Deactivated successfully. Jan 14 00:18:11.980896 systemd-logind[1545]: Session 92 logged out. Waiting for processes to exit. Jan 14 00:18:11.982067 systemd-logind[1545]: Removed session 92. Jan 14 00:18:15.333992 kubelet[2832]: E0114 00:18:15.333682 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:18:16.339341 kubelet[2832]: E0114 00:18:16.339261 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:18:16.341548 kubelet[2832]: E0114 00:18:16.341028 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:18:16.341548 kubelet[2832]: E0114 00:18:16.341092 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:18:17.077638 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:17.077735 kernel: audit: type=1130 audit(1768349897.074:1496): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-46.224.77.139:22-4.153.228.146:47358 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:17.074000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-46.224.77.139:22-4.153.228.146:47358 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:17.074813 systemd[1]: Started sshd@92-46.224.77.139:22-4.153.228.146:47358.service - OpenSSH per-connection server daemon (4.153.228.146:47358). 
Jan 14 00:18:17.617000 audit[7174]: USER_ACCT pid=7174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:17.621535 sshd[7174]: Accepted publickey for core from 4.153.228.146 port 47358 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:18:17.622723 kernel: audit: type=1101 audit(1768349897.617:1497): pid=7174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:17.624454 sshd-session[7174]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:18:17.623000 audit[7174]: CRED_ACQ pid=7174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:17.629126 kernel: audit: type=1103 audit(1768349897.623:1498): pid=7174 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:17.629191 kernel: audit: type=1006 audit(1768349897.623:1499): pid=7174 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=93 res=1 Jan 14 00:18:17.623000 audit[7174]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffebbb7df0 a2=3 a3=0 items=0 ppid=1 pid=7174 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:17.633129 kernel: audit: type=1300 audit(1768349897.623:1499): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffebbb7df0 a2=3 a3=0 items=0 ppid=1 pid=7174 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=93 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:17.623000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:17.634400 kernel: audit: type=1327 audit(1768349897.623:1499): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:17.638933 systemd-logind[1545]: New session 93 of user core. Jan 14 00:18:17.644011 systemd[1]: Started session-93.scope - Session 93 of User core. 
Jan 14 00:18:17.648000 audit[7174]: USER_START pid=7174 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:17.651000 audit[7178]: CRED_ACQ pid=7178 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:17.654552 kernel: audit: type=1105 audit(1768349897.648:1500): pid=7174 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:17.654632 kernel: audit: type=1103 audit(1768349897.651:1501): pid=7178 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:18.018742 sshd[7178]: Connection closed by 4.153.228.146 port 47358 Jan 14 00:18:18.020938 sshd-session[7174]: pam_unix(sshd:session): session closed for user core Jan 14 00:18:18.023000 audit[7174]: USER_END pid=7174 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:18.023000 audit[7174]: CRED_DISP pid=7174 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:18.029581 kernel: audit: type=1106 audit(1768349898.023:1502): pid=7174 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:18.029698 kernel: audit: type=1104 audit(1768349898.023:1503): pid=7174 uid=0 auid=500 ses=93 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:18.028465 systemd[1]: sshd@92-46.224.77.139:22-4.153.228.146:47358.service: Deactivated successfully. Jan 14 00:18:18.027000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@92-46.224.77.139:22-4.153.228.146:47358 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:18.032431 systemd[1]: session-93.scope: Deactivated successfully. Jan 14 00:18:18.036806 systemd-logind[1545]: Session 93 logged out. Waiting for processes to exit. Jan 14 00:18:18.038389 systemd-logind[1545]: Removed session 93. 
Jan 14 00:18:21.335193 kubelet[2832]: E0114 00:18:21.335143 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:18:23.128723 systemd[1]: Started sshd@93-46.224.77.139:22-4.153.228.146:47368.service - OpenSSH per-connection server daemon (4.153.228.146:47368). Jan 14 00:18:23.131872 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:23.131973 kernel: audit: type=1130 audit(1768349903.128:1505): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-46.224.77.139:22-4.153.228.146:47368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:23.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-46.224.77.139:22-4.153.228.146:47368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:23.662000 audit[7192]: USER_ACCT pid=7192 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:23.664836 sshd[7192]: Accepted publickey for core from 4.153.228.146 port 47368 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:18:23.667248 sshd-session[7192]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:18:23.669802 kernel: audit: type=1101 audit(1768349903.662:1506): pid=7192 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:23.669883 kernel: audit: type=1103 audit(1768349903.665:1507): pid=7192 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:23.665000 audit[7192]: CRED_ACQ pid=7192 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:23.671176 kernel: audit: type=1006 audit(1768349903.665:1508): pid=7192 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=94 res=1 Jan 14 00:18:23.665000 audit[7192]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd13f2480 a2=3 a3=0 items=0 ppid=1 pid=7192 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:23.665000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 
00:18:23.674594 kernel: audit: type=1300 audit(1768349903.665:1508): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd13f2480 a2=3 a3=0 items=0 ppid=1 pid=7192 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=94 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:23.674648 kernel: audit: type=1327 audit(1768349903.665:1508): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:23.678168 systemd-logind[1545]: New session 94 of user core. Jan 14 00:18:23.682725 systemd[1]: Started session-94.scope - Session 94 of User core. Jan 14 00:18:23.686000 audit[7192]: USER_START pid=7192 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:23.690000 audit[7198]: CRED_ACQ pid=7198 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:23.693189 kernel: audit: type=1105 audit(1768349903.686:1509): pid=7192 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:23.693291 kernel: audit: type=1103 audit(1768349903.690:1510): pid=7198 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:24.048588 sshd[7198]: Connection closed by 4.153.228.146 port 47368 Jan 14 00:18:24.048744 sshd-session[7192]: pam_unix(sshd:session): session closed for user core Jan 14 00:18:24.050000 audit[7192]: USER_END pid=7192 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:24.050000 audit[7192]: CRED_DISP pid=7192 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:24.055247 systemd[1]: sshd@93-46.224.77.139:22-4.153.228.146:47368.service: Deactivated successfully. 
Jan 14 00:18:24.057005 kernel: audit: type=1106 audit(1768349904.050:1511): pid=7192 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:24.057218 kernel: audit: type=1104 audit(1768349904.050:1512): pid=7192 uid=0 auid=500 ses=94 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:24.055000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@93-46.224.77.139:22-4.153.228.146:47368 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:24.059098 systemd[1]: session-94.scope: Deactivated successfully. Jan 14 00:18:24.062369 systemd-logind[1545]: Session 94 logged out. Waiting for processes to exit. Jan 14 00:18:24.063689 systemd-logind[1545]: Removed session 94. Jan 14 00:18:24.336785 kubelet[2832]: E0114 00:18:24.336748 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:18:27.335157 kubelet[2832]: E0114 00:18:27.334827 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:18:27.336706 kubelet[2832]: E0114 00:18:27.335649 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:18:27.336706 kubelet[2832]: E0114 00:18:27.335736 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:18:28.338801 kubelet[2832]: E0114 00:18:28.338766 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:18:29.162092 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:29.162191 kernel: audit: type=1130 audit(1768349909.156:1514): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-46.224.77.139:22-4.153.228.146:54132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:29.156000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-46.224.77.139:22-4.153.228.146:54132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:29.157146 systemd[1]: Started sshd@94-46.224.77.139:22-4.153.228.146:54132.service - OpenSSH per-connection server daemon (4.153.228.146:54132). 
Jan 14 00:18:29.717000 audit[7209]: USER_ACCT pid=7209 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:29.720695 sshd[7209]: Accepted publickey for core from 4.153.228.146 port 54132 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:18:29.721000 audit[7209]: CRED_ACQ pid=7209 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:29.724586 kernel: audit: type=1101 audit(1768349909.717:1515): pid=7209 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:29.724673 kernel: audit: type=1103 audit(1768349909.721:1516): pid=7209 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:29.722974 sshd-session[7209]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:18:29.728980 kernel: audit: type=1006 audit(1768349909.721:1517): pid=7209 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=95 res=1 Jan 14 00:18:29.721000 audit[7209]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd27f29b0 a2=3 a3=0 items=0 ppid=1 pid=7209 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:29.721000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:29.734411 kernel: audit: type=1300 audit(1768349909.721:1517): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd27f29b0 a2=3 a3=0 items=0 ppid=1 pid=7209 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=95 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:29.734486 kernel: audit: type=1327 audit(1768349909.721:1517): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:29.736636 systemd-logind[1545]: New session 95 of user core. Jan 14 00:18:29.742763 systemd[1]: Started session-95.scope - Session 95 of User core. 
Jan 14 00:18:29.748000 audit[7209]: USER_START pid=7209 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:29.751000 audit[7213]: CRED_ACQ pid=7213 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:29.755340 kernel: audit: type=1105 audit(1768349909.748:1518): pid=7209 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:29.755402 kernel: audit: type=1103 audit(1768349909.751:1519): pid=7213 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:30.122736 sshd[7213]: Connection closed by 4.153.228.146 port 54132 Jan 14 00:18:30.124084 sshd-session[7209]: pam_unix(sshd:session): session closed for user core Jan 14 00:18:30.125000 audit[7209]: USER_END pid=7209 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:30.125000 audit[7209]: CRED_DISP pid=7209 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:30.129618 kernel: audit: type=1106 audit(1768349910.125:1520): pid=7209 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:30.131340 systemd[1]: sshd@94-46.224.77.139:22-4.153.228.146:54132.service: Deactivated successfully. Jan 14 00:18:30.131000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@94-46.224.77.139:22-4.153.228.146:54132 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:30.133543 kernel: audit: type=1104 audit(1768349910.125:1521): pid=7209 uid=0 auid=500 ses=95 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:30.134228 systemd[1]: session-95.scope: Deactivated successfully. Jan 14 00:18:30.136239 systemd-logind[1545]: Session 95 logged out. Waiting for processes to exit. Jan 14 00:18:30.139279 systemd-logind[1545]: Removed session 95. 
Jan 14 00:18:33.334314 kubelet[2832]: E0114 00:18:33.333963 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:18:35.247601 systemd[1]: Started sshd@95-46.224.77.139:22-4.153.228.146:56428.service - OpenSSH per-connection server daemon (4.153.228.146:56428). Jan 14 00:18:35.246000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-46.224.77.139:22-4.153.228.146:56428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:35.248912 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:35.248983 kernel: audit: type=1130 audit(1768349915.246:1523): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-46.224.77.139:22-4.153.228.146:56428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:35.816000 audit[7249]: USER_ACCT pid=7249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:35.818974 sshd[7249]: Accepted publickey for core from 4.153.228.146 port 56428 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:18:35.822061 sshd-session[7249]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:18:35.819000 audit[7249]: CRED_ACQ pid=7249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:35.824210 kernel: audit: type=1101 audit(1768349915.816:1524): pid=7249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:35.824302 kernel: audit: type=1103 audit(1768349915.819:1525): pid=7249 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:35.827577 kernel: audit: type=1006 audit(1768349915.819:1526): pid=7249 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=96 res=1 Jan 14 00:18:35.819000 audit[7249]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed573bc0 a2=3 a3=0 items=0 ppid=1 pid=7249 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:35.831015 kernel: audit: type=1300 audit(1768349915.819:1526): arch=c00000b7 syscall=64 success=yes exit=3 
a0=8 a1=ffffed573bc0 a2=3 a3=0 items=0 ppid=1 pid=7249 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=96 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:35.819000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:35.834252 kernel: audit: type=1327 audit(1768349915.819:1526): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:35.837766 systemd-logind[1545]: New session 96 of user core. Jan 14 00:18:35.841742 systemd[1]: Started session-96.scope - Session 96 of User core. Jan 14 00:18:35.844000 audit[7249]: USER_START pid=7249 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:35.848651 kernel: audit: type=1105 audit(1768349915.844:1527): pid=7249 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:35.849000 audit[7253]: CRED_ACQ pid=7253 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:35.854556 kernel: audit: type=1103 audit(1768349915.849:1528): pid=7253 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:36.197633 sshd[7253]: Connection closed by 4.153.228.146 port 56428 Jan 14 00:18:36.197976 sshd-session[7249]: pam_unix(sshd:session): session closed for user core Jan 14 00:18:36.198000 audit[7249]: USER_END pid=7249 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:36.198000 audit[7249]: CRED_DISP pid=7249 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:36.205593 systemd-logind[1545]: Session 96 logged out. Waiting for processes to exit. 
Jan 14 00:18:36.206110 kernel: audit: type=1106 audit(1768349916.198:1529): pid=7249 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:36.206172 kernel: audit: type=1104 audit(1768349916.198:1530): pid=7249 uid=0 auid=500 ses=96 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:36.207500 systemd[1]: sshd@95-46.224.77.139:22-4.153.228.146:56428.service: Deactivated successfully. Jan 14 00:18:36.209000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@95-46.224.77.139:22-4.153.228.146:56428 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:36.213836 systemd[1]: session-96.scope: Deactivated successfully. Jan 14 00:18:36.217657 systemd-logind[1545]: Removed session 96. Jan 14 00:18:37.333590 kubelet[2832]: E0114 00:18:37.333509 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:18:38.334163 kubelet[2832]: E0114 00:18:38.333813 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:18:40.333809 kubelet[2832]: E0114 00:18:40.333465 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:18:40.341486 kubelet[2832]: E0114 00:18:40.341436 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:18:41.304156 systemd[1]: Started sshd@96-46.224.77.139:22-4.153.228.146:56432.service - OpenSSH per-connection server daemon (4.153.228.146:56432). Jan 14 00:18:41.305510 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:41.305557 kernel: audit: type=1130 audit(1768349921.302:1532): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-46.224.77.139:22-4.153.228.146:56432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:41.302000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-46.224.77.139:22-4.153.228.146:56432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:41.336733 kubelet[2832]: E0114 00:18:41.336680 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:18:41.849000 audit[7264]: USER_ACCT pid=7264 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:41.851553 sshd[7264]: Accepted publickey for core from 4.153.228.146 port 56432 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:18:41.856638 sshd-session[7264]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:18:41.854000 audit[7264]: CRED_ACQ pid=7264 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:41.860549 kernel: audit: type=1101 audit(1768349921.849:1533): pid=7264 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:41.861231 kernel: audit: type=1103 audit(1768349921.854:1534): pid=7264 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:41.862730 kernel: audit: type=1006 audit(1768349921.854:1535): pid=7264 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=97 res=1 Jan 14 00:18:41.854000 audit[7264]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff8432d70 a2=3 a3=0 items=0 ppid=1 pid=7264 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=97 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:41.865556 kernel: audit: type=1300 audit(1768349921.854:1535): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff8432d70 a2=3 a3=0 items=0 ppid=1 pid=7264 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=97 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:41.865627 kernel: audit: type=1327 audit(1768349921.854:1535): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:41.854000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:41.868671 systemd-logind[1545]: New session 97 of user core. Jan 14 00:18:41.872722 systemd[1]: Started session-97.scope - Session 97 of User core. Jan 14 00:18:41.876000 audit[7264]: USER_START pid=7264 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:41.879000 audit[7268]: CRED_ACQ pid=7268 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:41.884109 kernel: audit: type=1105 audit(1768349921.876:1536): pid=7264 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:41.884200 kernel: audit: type=1103 audit(1768349921.879:1537): pid=7268 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:42.243887 sshd[7268]: Connection closed by 4.153.228.146 port 56432 Jan 14 00:18:42.246128 sshd-session[7264]: pam_unix(sshd:session): session closed for user core Jan 14 00:18:42.246000 audit[7264]: USER_END pid=7264 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:42.246000 audit[7264]: CRED_DISP pid=7264 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:42.256182 kernel: audit: type=1106 audit(1768349922.246:1538): pid=7264 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:42.256290 kernel: audit: type=1104 audit(1768349922.246:1539): pid=7264 uid=0 auid=500 ses=97 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:42.257162 systemd[1]: sshd@96-46.224.77.139:22-4.153.228.146:56432.service: Deactivated successfully. Jan 14 00:18:42.255000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@96-46.224.77.139:22-4.153.228.146:56432 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:42.261767 systemd[1]: session-97.scope: Deactivated successfully. Jan 14 00:18:42.265073 systemd-logind[1545]: Session 97 logged out. Waiting for processes to exit. Jan 14 00:18:42.268276 systemd-logind[1545]: Removed session 97. Jan 14 00:18:45.334481 kubelet[2832]: E0114 00:18:45.334003 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:18:47.360083 systemd[1]: Started sshd@97-46.224.77.139:22-4.153.228.146:53042.service - OpenSSH per-connection server daemon (4.153.228.146:53042). Jan 14 00:18:47.360916 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:47.360973 kernel: audit: type=1130 audit(1768349927.358:1541): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-46.224.77.139:22-4.153.228.146:53042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:47.358000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-46.224.77.139:22-4.153.228.146:53042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:18:47.896000 audit[7281]: USER_ACCT pid=7281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:47.900816 sshd[7281]: Accepted publickey for core from 4.153.228.146 port 53042 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:18:47.899000 audit[7281]: CRED_ACQ pid=7281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:47.902316 sshd-session[7281]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:18:47.903403 kernel: audit: type=1101 audit(1768349927.896:1542): pid=7281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:47.903477 kernel: audit: type=1103 audit(1768349927.899:1543): pid=7281 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:47.905748 kernel: audit: type=1006 audit(1768349927.899:1544): pid=7281 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=98 res=1 Jan 14 00:18:47.899000 audit[7281]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff62e2fa0 a2=3 a3=0 items=0 ppid=1 pid=7281 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=98 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:47.909549 kernel: audit: type=1300 audit(1768349927.899:1544): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff62e2fa0 a2=3 a3=0 items=0 ppid=1 pid=7281 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=98 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:47.899000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:47.911600 kernel: audit: type=1327 audit(1768349927.899:1544): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:47.915349 systemd-logind[1545]: New session 98 of user core. Jan 14 00:18:47.918838 systemd[1]: Started session-98.scope - Session 98 of User core. 
Jan 14 00:18:47.922000 audit[7281]: USER_START pid=7281 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:47.931345 kernel: audit: type=1105 audit(1768349927.922:1545): pid=7281 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:47.931500 kernel: audit: type=1103 audit(1768349927.927:1546): pid=7285 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:47.927000 audit[7285]: CRED_ACQ pid=7285 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:48.290314 sshd[7285]: Connection closed by 4.153.228.146 port 53042 Jan 14 00:18:48.290589 sshd-session[7281]: pam_unix(sshd:session): session closed for user core Jan 14 00:18:48.291000 audit[7281]: USER_END pid=7281 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:48.300892 systemd-logind[1545]: Session 98 logged out. Waiting for processes to exit. Jan 14 00:18:48.303936 kernel: audit: type=1106 audit(1768349928.291:1547): pid=7281 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:48.304076 kernel: audit: type=1104 audit(1768349928.291:1548): pid=7281 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:48.291000 audit[7281]: CRED_DISP pid=7281 uid=0 auid=500 ses=98 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:48.304850 systemd[1]: sshd@97-46.224.77.139:22-4.153.228.146:53042.service: Deactivated successfully. Jan 14 00:18:48.305000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@97-46.224.77.139:22-4.153.228.146:53042 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:48.312254 systemd[1]: session-98.scope: Deactivated successfully. Jan 14 00:18:48.316959 systemd-logind[1545]: Removed session 98. 
Jan 14 00:18:49.334473 kubelet[2832]: E0114 00:18:49.334282 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:18:51.333858 kubelet[2832]: E0114 00:18:51.333787 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:18:52.334199 kubelet[2832]: E0114 00:18:52.333773 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:18:53.405086 systemd[1]: Started sshd@98-46.224.77.139:22-4.153.228.146:53048.service - OpenSSH per-connection server daemon (4.153.228.146:53048). Jan 14 00:18:53.405000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-46.224.77.139:22-4.153.228.146:53048 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:53.409136 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:53.409234 kernel: audit: type=1130 audit(1768349933.405:1550): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-46.224.77.139:22-4.153.228.146:53048 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:18:53.954000 audit[7299]: USER_ACCT pid=7299 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:53.956641 sshd[7299]: Accepted publickey for core from 4.153.228.146 port 53048 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:18:53.958612 kernel: audit: type=1101 audit(1768349933.954:1551): pid=7299 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:53.957000 audit[7299]: CRED_ACQ pid=7299 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:53.960357 sshd-session[7299]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:18:53.963708 kernel: audit: type=1103 audit(1768349933.957:1552): pid=7299 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:53.963794 kernel: audit: type=1006 audit(1768349933.957:1553): pid=7299 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=99 res=1 Jan 14 00:18:53.957000 audit[7299]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe411d200 a2=3 a3=0 items=0 ppid=1 pid=7299 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=99 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:53.966321 kernel: audit: type=1300 audit(1768349933.957:1553): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe411d200 a2=3 a3=0 items=0 ppid=1 pid=7299 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=99 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:18:53.957000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:53.968731 kernel: audit: type=1327 audit(1768349933.957:1553): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:18:53.974784 systemd-logind[1545]: New session 99 of user core. Jan 14 00:18:53.981760 systemd[1]: Started session-99.scope - Session 99 of User core. 
Jan 14 00:18:53.986000 audit[7299]: USER_START pid=7299 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:53.990000 audit[7303]: CRED_ACQ pid=7303 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:53.994282 kernel: audit: type=1105 audit(1768349933.986:1554): pid=7299 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:53.994361 kernel: audit: type=1103 audit(1768349933.990:1555): pid=7303 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:54.340216 kubelet[2832]: E0114 00:18:54.340152 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:18:54.341325 kubelet[2832]: E0114 00:18:54.341283 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:18:54.356276 sshd[7303]: Connection closed by 4.153.228.146 port 53048 Jan 14 00:18:54.357117 sshd-session[7299]: pam_unix(sshd:session): session closed for user core Jan 14 00:18:54.358000 audit[7299]: USER_END pid=7299 uid=0 auid=500 ses=99 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:54.358000 audit[7299]: CRED_DISP pid=7299 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:54.365936 kernel: audit: type=1106 audit(1768349934.358:1556): pid=7299 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:54.365992 kernel: audit: type=1104 audit(1768349934.358:1557): pid=7299 uid=0 auid=500 ses=99 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:18:54.366321 systemd[1]: sshd@98-46.224.77.139:22-4.153.228.146:53048.service: Deactivated successfully. Jan 14 00:18:54.365000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@98-46.224.77.139:22-4.153.228.146:53048 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:54.371939 systemd[1]: session-99.scope: Deactivated successfully. Jan 14 00:18:54.375268 systemd-logind[1545]: Session 99 logged out. Waiting for processes to exit. Jan 14 00:18:54.379193 systemd-logind[1545]: Removed session 99. Jan 14 00:18:57.332818 kubelet[2832]: E0114 00:18:57.332766 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:18:59.468496 systemd[1]: Started sshd@99-46.224.77.139:22-4.153.228.146:54996.service - OpenSSH per-connection server daemon (4.153.228.146:54996). Jan 14 00:18:59.471387 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:18:59.471421 kernel: audit: type=1130 audit(1768349939.468:1559): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-46.224.77.139:22-4.153.228.146:54996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:18:59.468000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-46.224.77.139:22-4.153.228.146:54996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:19:00.030000 audit[7314]: USER_ACCT pid=7314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.031350 sshd[7314]: Accepted publickey for core from 4.153.228.146 port 54996 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:00.035000 audit[7314]: CRED_ACQ pid=7314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.037725 kernel: audit: type=1101 audit(1768349940.030:1560): pid=7314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.037767 kernel: audit: type=1103 audit(1768349940.035:1561): pid=7314 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.037788 kernel: audit: type=1006 audit(1768349940.035:1562): pid=7314 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=100 res=1 Jan 14 00:19:00.038081 sshd-session[7314]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:00.035000 audit[7314]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdc3f7020 a2=3 a3=0 items=0 ppid=1 pid=7314 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=100 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:00.041662 kernel: audit: type=1300 audit(1768349940.035:1562): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdc3f7020 a2=3 a3=0 items=0 ppid=1 pid=7314 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=100 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:00.042593 kernel: audit: type=1327 audit(1768349940.035:1562): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:00.035000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:00.049723 systemd-logind[1545]: New session 100 of user core. Jan 14 00:19:00.056106 systemd[1]: Started session-100.scope - Session 100 of User core. 
Jan 14 00:19:00.062000 audit[7314]: USER_START pid=7314 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.066000 audit[7318]: CRED_ACQ pid=7318 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.068922 kernel: audit: type=1105 audit(1768349940.062:1563): pid=7314 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.069096 kernel: audit: type=1103 audit(1768349940.066:1564): pid=7318 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.436674 sshd[7318]: Connection closed by 4.153.228.146 port 54996 Jan 14 00:19:00.439229 sshd-session[7314]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:00.444000 audit[7314]: USER_END pid=7314 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.444000 audit[7314]: CRED_DISP pid=7314 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.450434 kernel: audit: type=1106 audit(1768349940.444:1565): pid=7314 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.451073 kernel: audit: type=1104 audit(1768349940.444:1566): pid=7314 uid=0 auid=500 ses=100 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:00.454573 systemd[1]: sshd@99-46.224.77.139:22-4.153.228.146:54996.service: Deactivated successfully. Jan 14 00:19:00.457000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@99-46.224.77.139:22-4.153.228.146:54996 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:00.460133 systemd[1]: session-100.scope: Deactivated successfully. Jan 14 00:19:00.462890 systemd-logind[1545]: Session 100 logged out. Waiting for processes to exit. 
Jan 14 00:19:00.464067 systemd-logind[1545]: Removed session 100. Jan 14 00:19:04.334084 kubelet[2832]: E0114 00:19:04.334023 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:19:05.334453 kubelet[2832]: E0114 00:19:05.333659 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:19:05.549000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-46.224.77.139:22-4.153.228.146:49586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:05.549817 systemd[1]: Started sshd@100-46.224.77.139:22-4.153.228.146:49586.service - OpenSSH per-connection server daemon (4.153.228.146:49586). Jan 14 00:19:05.552228 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:19:05.552743 kernel: audit: type=1130 audit(1768349945.549:1568): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-46.224.77.139:22-4.153.228.146:49586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:19:06.091000 audit[7356]: USER_ACCT pid=7356 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.091864 sshd[7356]: Accepted publickey for core from 4.153.228.146 port 49586 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:06.097556 kernel: audit: type=1101 audit(1768349946.091:1569): pid=7356 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.097657 kernel: audit: type=1103 audit(1768349946.095:1570): pid=7356 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.095000 audit[7356]: CRED_ACQ pid=7356 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.097936 sshd-session[7356]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:06.099667 kernel: audit: type=1006 audit(1768349946.095:1571): pid=7356 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=101 res=1 Jan 14 00:19:06.095000 audit[7356]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe13982b0 a2=3 a3=0 items=0 ppid=1 pid=7356 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=101 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:06.102487 kernel: audit: type=1300 audit(1768349946.095:1571): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe13982b0 a2=3 a3=0 items=0 ppid=1 pid=7356 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=101 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:06.104615 kernel: audit: type=1327 audit(1768349946.095:1571): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:06.095000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:06.108340 systemd-logind[1545]: New session 101 of user core. Jan 14 00:19:06.113330 systemd[1]: Started session-101.scope - Session 101 of User core. 
Jan 14 00:19:06.117000 audit[7356]: USER_START pid=7356 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.122548 kernel: audit: type=1105 audit(1768349946.117:1572): pid=7356 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.122625 kernel: audit: type=1103 audit(1768349946.121:1573): pid=7360 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.121000 audit[7360]: CRED_ACQ pid=7360 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.342367 kubelet[2832]: E0114 00:19:06.341956 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:19:06.478686 sshd[7360]: Connection closed by 4.153.228.146 port 49586 Jan 14 00:19:06.478949 sshd-session[7356]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:06.481000 audit[7356]: USER_END pid=7356 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.481000 audit[7356]: CRED_DISP pid=7356 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.487173 kernel: audit: type=1106 audit(1768349946.481:1574): pid=7356 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 
terminal=ssh res=success' Jan 14 00:19:06.487254 kernel: audit: type=1104 audit(1768349946.481:1575): pid=7356 uid=0 auid=500 ses=101 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:06.488043 systemd[1]: sshd@100-46.224.77.139:22-4.153.228.146:49586.service: Deactivated successfully. Jan 14 00:19:06.488000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@100-46.224.77.139:22-4.153.228.146:49586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:06.492423 systemd[1]: session-101.scope: Deactivated successfully. Jan 14 00:19:06.496379 systemd-logind[1545]: Session 101 logged out. Waiting for processes to exit. Jan 14 00:19:06.500120 systemd-logind[1545]: Removed session 101. Jan 14 00:19:07.335438 kubelet[2832]: E0114 00:19:07.335366 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:19:08.338515 kubelet[2832]: E0114 00:19:08.338443 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:19:10.336361 kubelet[2832]: E0114 00:19:10.335915 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:19:11.588297 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:19:11.588437 kernel: audit: type=1130 audit(1768349951.585:1577): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-46.224.77.139:22-4.153.228.146:49600 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:19:11.585000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-46.224.77.139:22-4.153.228.146:49600 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:11.585810 systemd[1]: Started sshd@101-46.224.77.139:22-4.153.228.146:49600.service - OpenSSH per-connection server daemon (4.153.228.146:49600). Jan 14 00:19:12.121000 audit[7371]: USER_ACCT pid=7371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.124420 sshd[7371]: Accepted publickey for core from 4.153.228.146 port 49600 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:12.124808 kernel: audit: type=1101 audit(1768349952.121:1578): pid=7371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.126000 audit[7371]: CRED_ACQ pid=7371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.127665 sshd-session[7371]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:12.132710 kernel: audit: type=1103 audit(1768349952.126:1579): pid=7371 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.132971 kernel: audit: type=1006 audit(1768349952.126:1580): pid=7371 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=102 res=1 Jan 14 00:19:12.126000 audit[7371]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffde161fa0 a2=3 a3=0 items=0 ppid=1 pid=7371 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=102 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:12.135444 kernel: audit: type=1300 audit(1768349952.126:1580): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffde161fa0 a2=3 a3=0 items=0 ppid=1 pid=7371 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=102 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:12.126000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:12.139673 kernel: audit: type=1327 audit(1768349952.126:1580): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:12.142854 systemd-logind[1545]: New session 102 of user core. Jan 14 00:19:12.148997 systemd[1]: Started session-102.scope - Session 102 of User core. 
Jan 14 00:19:12.154000 audit[7371]: USER_START pid=7371 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.160000 audit[7375]: CRED_ACQ pid=7375 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.163338 kernel: audit: type=1105 audit(1768349952.154:1581): pid=7371 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.163407 kernel: audit: type=1103 audit(1768349952.160:1582): pid=7375 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.494606 sshd[7375]: Connection closed by 4.153.228.146 port 49600 Jan 14 00:19:12.495056 sshd-session[7371]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:12.496000 audit[7371]: USER_END pid=7371 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.496000 audit[7371]: CRED_DISP pid=7371 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.502707 kernel: audit: type=1106 audit(1768349952.496:1583): pid=7371 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.502781 kernel: audit: type=1104 audit(1768349952.496:1584): pid=7371 uid=0 auid=500 ses=102 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:12.503912 systemd[1]: sshd@101-46.224.77.139:22-4.153.228.146:49600.service: Deactivated successfully. Jan 14 00:19:12.504000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@101-46.224.77.139:22-4.153.228.146:49600 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:12.511681 systemd[1]: session-102.scope: Deactivated successfully. Jan 14 00:19:12.513469 systemd-logind[1545]: Session 102 logged out. Waiting for processes to exit. 
Jan 14 00:19:12.516000 systemd-logind[1545]: Removed session 102. Jan 14 00:19:17.340567 containerd[1590]: time="2026-01-14T00:19:17.340182038Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 14 00:19:17.605827 systemd[1]: Started sshd@102-46.224.77.139:22-4.153.228.146:46558.service - OpenSSH per-connection server daemon (4.153.228.146:46558). Jan 14 00:19:17.605000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-46.224.77.139:22-4.153.228.146:46558 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:17.608703 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:19:17.608762 kernel: audit: type=1130 audit(1768349957.605:1586): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-46.224.77.139:22-4.153.228.146:46558 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:17.680912 containerd[1590]: time="2026-01-14T00:19:17.680734401Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:19:17.682235 containerd[1590]: time="2026-01-14T00:19:17.682092551Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 14 00:19:17.682235 containerd[1590]: time="2026-01-14T00:19:17.682187793Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 14 00:19:17.682410 kubelet[2832]: E0114 00:19:17.682355 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:19:17.682841 kubelet[2832]: E0114 00:19:17.682411 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 14 00:19:17.682841 kubelet[2832]: E0114 00:19:17.682561 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 14 00:19:17.684846 containerd[1590]: time="2026-01-14T00:19:17.684600687Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 14 00:19:18.011960 containerd[1590]: time="2026-01-14T00:19:18.011843593Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:19:18.013650 containerd[1590]: time="2026-01-14T00:19:18.013481670Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 14 00:19:18.013960 containerd[1590]: time="2026-01-14T00:19:18.013537111Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 14 00:19:18.014312 kubelet[2832]: E0114 00:19:18.014203 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:19:18.014312 kubelet[2832]: E0114 00:19:18.014279 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 14 00:19:18.014730 kubelet[2832]: E0114 00:19:18.014657 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rt5nl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-8jmff_calico-system(6c288445-910a-4d1d-9b62-12f5155b11be): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 14 00:19:18.016028 kubelet[2832]: E0114 00:19:18.015974 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:19:18.176000 audit[7387]: USER_ACCT pid=7387 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh 
res=success' Jan 14 00:19:18.180741 sshd[7387]: Accepted publickey for core from 4.153.228.146 port 46558 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:18.181434 sshd-session[7387]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:18.179000 audit[7387]: CRED_ACQ pid=7387 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.183653 kernel: audit: type=1101 audit(1768349958.176:1587): pid=7387 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.184352 kernel: audit: type=1103 audit(1768349958.179:1588): pid=7387 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.186418 kernel: audit: type=1006 audit(1768349958.179:1589): pid=7387 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=103 res=1 Jan 14 00:19:18.186484 kernel: audit: type=1300 audit(1768349958.179:1589): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe6e1d8e0 a2=3 a3=0 items=0 ppid=1 pid=7387 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=103 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:18.179000 audit[7387]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe6e1d8e0 a2=3 a3=0 items=0 ppid=1 pid=7387 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=103 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:18.187962 kernel: audit: type=1327 audit(1768349958.179:1589): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:18.179000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:18.196674 systemd-logind[1545]: New session 103 of user core. Jan 14 00:19:18.203729 systemd[1]: Started session-103.scope - Session 103 of User core. 
Jan 14 00:19:18.208000 audit[7387]: USER_START pid=7387 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.213585 kernel: audit: type=1105 audit(1768349958.208:1590): pid=7387 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.213000 audit[7391]: CRED_ACQ pid=7391 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.217622 kernel: audit: type=1103 audit(1768349958.213:1591): pid=7391 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.342092 kubelet[2832]: E0114 00:19:18.342028 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:19:18.567377 sshd[7391]: Connection closed by 4.153.228.146 port 46558 Jan 14 00:19:18.570262 sshd-session[7387]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:18.571000 audit[7387]: USER_END pid=7387 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.572000 audit[7387]: CRED_DISP pid=7387 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.578150 systemd[1]: sshd@102-46.224.77.139:22-4.153.228.146:46558.service: Deactivated successfully. 
Jan 14 00:19:18.578253 kernel: audit: type=1106 audit(1768349958.571:1592): pid=7387 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.578304 kernel: audit: type=1104 audit(1768349958.572:1593): pid=7387 uid=0 auid=500 ses=103 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:18.578267 systemd-logind[1545]: Session 103 logged out. Waiting for processes to exit. Jan 14 00:19:18.578000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@102-46.224.77.139:22-4.153.228.146:46558 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:18.581926 systemd[1]: session-103.scope: Deactivated successfully. Jan 14 00:19:18.587092 systemd-logind[1545]: Removed session 103. Jan 14 00:19:19.334732 containerd[1590]: time="2026-01-14T00:19:19.334684927Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 14 00:19:19.681668 containerd[1590]: time="2026-01-14T00:19:19.681564379Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:19:19.682969 containerd[1590]: time="2026-01-14T00:19:19.682907289Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 14 00:19:19.683077 containerd[1590]: time="2026-01-14T00:19:19.682997731Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 14 00:19:19.683531 kubelet[2832]: E0114 00:19:19.683457 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:19:19.683531 kubelet[2832]: E0114 00:19:19.683508 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 14 00:19:19.683902 kubelet[2832]: E0114 00:19:19.683721 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r74wh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-b44cc6f4-gxl6c_calico-system(5ee70bb0-55b7-4a80-b5cb-3133091615ae): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 14 00:19:19.685123 kubelet[2832]: E0114 00:19:19.685052 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:19:20.333851 containerd[1590]: time="2026-01-14T00:19:20.333808700Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:19:20.693012 containerd[1590]: time="2026-01-14T00:19:20.692731777Z" 
level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:19:20.694559 containerd[1590]: time="2026-01-14T00:19:20.694420335Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:19:20.694757 containerd[1590]: time="2026-01-14T00:19:20.694567459Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:19:20.694976 kubelet[2832]: E0114 00:19:20.694851 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:19:20.694976 kubelet[2832]: E0114 00:19:20.694895 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:19:20.695510 kubelet[2832]: E0114 00:19:20.695008 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-cr65d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-6f67969d8d-7bt2c_calico-apiserver(3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:19:20.696388 kubelet[2832]: E0114 00:19:20.696300 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:19:21.335059 kubelet[2832]: E0114 00:19:21.334964 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:19:22.337737 containerd[1590]: time="2026-01-14T00:19:22.337696381Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 14 00:19:22.690228 containerd[1590]: time="2026-01-14T00:19:22.690081658Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:19:22.691821 containerd[1590]: time="2026-01-14T00:19:22.691709536Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 14 00:19:22.691821 containerd[1590]: time="2026-01-14T00:19:22.691760217Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 14 00:19:22.693109 kubelet[2832]: E0114 00:19:22.692111 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:19:22.693109 kubelet[2832]: E0114 00:19:22.692163 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 14 00:19:22.693109 kubelet[2832]: E0114 00:19:22.692476 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:8c290d008f4c4d48b25c8570357599ee,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 14 00:19:22.696220 containerd[1590]: time="2026-01-14T00:19:22.695794189Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 14 00:19:23.042919 containerd[1590]: time="2026-01-14T00:19:23.042774627Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:19:23.044808 containerd[1590]: time="2026-01-14T00:19:23.044733791Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 14 00:19:23.044922 containerd[1590]: time="2026-01-14T00:19:23.044867834Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 14 00:19:23.045537 kubelet[2832]: E0114 00:19:23.045080 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:19:23.045537 kubelet[2832]: E0114 00:19:23.045135 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 14 00:19:23.045537 kubelet[2832]: E0114 00:19:23.045263 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z2vgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-684bfd8c46-zxdr6_calico-system(5ea780f2-7146-4be4-95de-faccba85fdbd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 14 00:19:23.046817 kubelet[2832]: E0114 00:19:23.046775 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:19:23.678000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-46.224.77.139:22-4.153.228.146:46564 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:23.679824 systemd[1]: Started sshd@103-46.224.77.139:22-4.153.228.146:46564.service - OpenSSH per-connection server daemon (4.153.228.146:46564). 
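Every pull above dies the same way: ghcr.io answers 404 Not Found for the v3.30.4 tag, containerd reports NotFound, and kubelet turns that into ErrImagePull. A quick way to confirm from any workstation that the tag is missing upstream, rather than being a node-local containerd or mirror problem, is to ask the registry's OCI distribution API directly. The following is a minimal Python sketch; the anonymous token endpoint and response shape are assumptions about ghcr.io following the standard Docker/OCI token flow, not something taken from this log, and check_ghcr_tag.py is a hypothetical helper name.

# check_ghcr_tag.py - hypothetical helper: probe ghcr.io for one image tag.
# Assumes ghcr.io issues anonymous pull tokens for public repositories via the
# standard Docker/OCI token endpoint; adjust if the registry behaves differently.
import json
import sys
import urllib.error
import urllib.request

def tag_exists(repo, tag):
    token_url = ("https://ghcr.io/token?service=ghcr.io"
                 f"&scope=repository:{repo}:pull")
    with urllib.request.urlopen(token_url) as resp:
        token = json.load(resp)["token"]
    req = urllib.request.Request(
        f"https://ghcr.io/v2/{repo}/manifests/{tag}",
        method="HEAD",
        headers={
            "Authorization": f"Bearer {token}",
            "Accept": ("application/vnd.oci.image.index.v1+json, "
                       "application/vnd.docker.distribution.manifest.list.v2+json"),
        },
    )
    try:
        urllib.request.urlopen(req)
        return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise

if __name__ == "__main__":
    repo, tag = sys.argv[1], sys.argv[2]
    print(f"{repo}:{tag}", "found" if tag_exists(repo, tag) else "not found")

Run as, for example, python3 check_ghcr_tag.py flatcar/calico/whisker v3.30.4; on the node itself, crictl pull ghcr.io/flatcar/calico/whisker:v3.30.4 should reproduce the same NotFound if the tag really is absent upstream.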
Jan 14 00:19:23.682540 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:19:23.682619 kernel: audit: type=1130 audit(1768349963.678:1595): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-46.224.77.139:22-4.153.228.146:46564 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:24.241000 audit[7413]: USER_ACCT pid=7413 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.243751 sshd[7413]: Accepted publickey for core from 4.153.228.146 port 46564 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:24.245000 audit[7413]: CRED_ACQ pid=7413 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.249027 sshd-session[7413]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:24.249466 kernel: audit: type=1101 audit(1768349964.241:1596): pid=7413 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.249512 kernel: audit: type=1103 audit(1768349964.245:1597): pid=7413 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.252326 kernel: audit: type=1006 audit(1768349964.246:1598): pid=7413 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=104 res=1 Jan 14 00:19:24.252387 kernel: audit: type=1300 audit(1768349964.246:1598): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffb58eca0 a2=3 a3=0 items=0 ppid=1 pid=7413 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=104 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:24.246000 audit[7413]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffb58eca0 a2=3 a3=0 items=0 ppid=1 pid=7413 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=104 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:24.246000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:24.256034 kernel: audit: type=1327 audit(1768349964.246:1598): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:24.260682 systemd-logind[1545]: New session 104 of user core. Jan 14 00:19:24.265811 systemd[1]: Started session-104.scope - Session 104 of User core. 
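The kauditd echoes above print raw record types (type=1130, type=1101, and so on) right next to the named records journald has already rendered, so the numeric-to-name mapping for this log can mostly be read off adjacent lines; only LOGIN (1006) and SERVICE_STOP (1131) are filled in from the kernel audit ABI rather than an explicit pairing here. A small Python lookup for reference:

# Audit record types seen in this journal, mapped to their names. All but two
# pairings appear verbatim in the surrounding lines; 1006 (LOGIN) and
# 1131 (SERVICE_STOP) come from the Linux audit ABI.
AUDIT_TYPES = {
    1006: "LOGIN",          # auid/ses assignment for the new SSH session
    1101: "USER_ACCT",      # PAM accounting check (pam_access, pam_faillock, ...)
    1103: "CRED_ACQ",       # PAM credentials acquired
    1104: "CRED_DISP",      # PAM credentials disposed at logout
    1105: "USER_START",     # PAM session opened
    1106: "USER_END",       # PAM session closed
    1130: "SERVICE_START",  # systemd started the per-connection sshd@... unit
    1131: "SERVICE_STOP",   # systemd stopped that unit again
    1300: "SYSCALL",        # syscall record attached to the login event
    1327: "PROCTITLE",      # hex-encoded argv, here "sshd-session: core [priv]"
}

def audit_type_name(type_id):
    return AUDIT_TYPES.get(type_id, f"unknown({type_id})")

print(audit_type_name(1130))  # SERVICE_START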
Jan 14 00:19:24.271000 audit[7413]: USER_START pid=7413 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.277543 kernel: audit: type=1105 audit(1768349964.271:1599): pid=7413 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.276000 audit[7417]: CRED_ACQ pid=7417 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.282562 kernel: audit: type=1103 audit(1768349964.276:1600): pid=7417 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.624429 sshd[7417]: Connection closed by 4.153.228.146 port 46564 Jan 14 00:19:24.625606 sshd-session[7413]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:24.625000 audit[7413]: USER_END pid=7413 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.625000 audit[7413]: CRED_DISP pid=7413 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.632651 kernel: audit: type=1106 audit(1768349964.625:1601): pid=7413 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.633350 systemd[1]: sshd@103-46.224.77.139:22-4.153.228.146:46564.service: Deactivated successfully. Jan 14 00:19:24.633000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@103-46.224.77.139:22-4.153.228.146:46564 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:24.635589 kernel: audit: type=1104 audit(1768349964.625:1602): pid=7413 uid=0 auid=500 ses=104 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:24.639580 systemd[1]: session-104.scope: Deactivated successfully. Jan 14 00:19:24.644564 systemd-logind[1545]: Session 104 logged out. Waiting for processes to exit. 
Jan 14 00:19:24.645918 systemd-logind[1545]: Removed session 104. Jan 14 00:19:29.740000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-46.224.77.139:22-4.153.228.146:54274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:29.741584 systemd[1]: Started sshd@104-46.224.77.139:22-4.153.228.146:54274.service - OpenSSH per-connection server daemon (4.153.228.146:54274). Jan 14 00:19:29.746547 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:19:29.746630 kernel: audit: type=1130 audit(1768349969.740:1604): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-46.224.77.139:22-4.153.228.146:54274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:30.304000 audit[7429]: USER_ACCT pid=7429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.308073 sshd[7429]: Accepted publickey for core from 4.153.228.146 port 54274 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:30.308551 kernel: audit: type=1101 audit(1768349970.304:1605): pid=7429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.308000 audit[7429]: CRED_ACQ pid=7429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.310675 sshd-session[7429]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:30.313935 kernel: audit: type=1103 audit(1768349970.308:1606): pid=7429 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.314000 kernel: audit: type=1006 audit(1768349970.308:1607): pid=7429 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=105 res=1 Jan 14 00:19:30.308000 audit[7429]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3e6a700 a2=3 a3=0 items=0 ppid=1 pid=7429 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=105 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:30.316590 kernel: audit: type=1300 audit(1768349970.308:1607): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3e6a700 a2=3 a3=0 items=0 ppid=1 pid=7429 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=105 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:30.308000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:30.318675 kernel: audit: type=1327 audit(1768349970.308:1607): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 
00:19:30.320651 systemd-logind[1545]: New session 105 of user core. Jan 14 00:19:30.325789 systemd[1]: Started session-105.scope - Session 105 of User core. Jan 14 00:19:30.329000 audit[7429]: USER_START pid=7429 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.335000 audit[7433]: CRED_ACQ pid=7433 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.340022 kernel: audit: type=1105 audit(1768349970.329:1608): pid=7429 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.340110 kernel: audit: type=1103 audit(1768349970.335:1609): pid=7433 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.341724 kubelet[2832]: E0114 00:19:30.340982 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:19:30.342811 kubelet[2832]: E0114 00:19:30.342764 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:19:30.690432 sshd[7433]: Connection closed by 4.153.228.146 port 54274 Jan 14 00:19:30.691034 sshd-session[7429]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:30.692000 audit[7429]: USER_END pid=7429 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail 
acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.698708 kernel: audit: type=1106 audit(1768349970.692:1610): pid=7429 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.699005 kernel: audit: type=1104 audit(1768349970.692:1611): pid=7429 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.692000 audit[7429]: CRED_DISP pid=7429 uid=0 auid=500 ses=105 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:30.699274 systemd[1]: sshd@104-46.224.77.139:22-4.153.228.146:54274.service: Deactivated successfully. Jan 14 00:19:30.702413 systemd[1]: session-105.scope: Deactivated successfully. Jan 14 00:19:30.698000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@104-46.224.77.139:22-4.153.228.146:54274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:30.704624 systemd-logind[1545]: Session 105 logged out. Waiting for processes to exit. Jan 14 00:19:30.708720 systemd-logind[1545]: Removed session 105. Jan 14 00:19:31.335649 kubelet[2832]: E0114 00:19:31.335603 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:19:33.333776 containerd[1590]: time="2026-01-14T00:19:33.333631184Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 14 00:19:33.684556 containerd[1590]: time="2026-01-14T00:19:33.683707759Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:19:33.686410 containerd[1590]: time="2026-01-14T00:19:33.686253660Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 14 00:19:33.686410 containerd[1590]: time="2026-01-14T00:19:33.686350742Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 14 00:19:33.686696 kubelet[2832]: E0114 00:19:33.686622 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:19:33.687263 
kubelet[2832]: E0114 00:19:33.686706 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 14 00:19:33.687294 containerd[1590]: time="2026-01-14T00:19:33.687167041Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 14 00:19:33.688686 kubelet[2832]: E0114 00:19:33.686990 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p28dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-hrn72_calico-system(1e53bd66-4746-482e-bb2b-bfd29a1ef20e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 14 
00:19:33.689869 kubelet[2832]: E0114 00:19:33.689822 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:19:34.034427 containerd[1590]: time="2026-01-14T00:19:34.034017422Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 14 00:19:34.035655 containerd[1590]: time="2026-01-14T00:19:34.035603620Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 14 00:19:34.035745 containerd[1590]: time="2026-01-14T00:19:34.035668661Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 14 00:19:34.036239 kubelet[2832]: E0114 00:19:34.036198 2832 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:19:34.036611 kubelet[2832]: E0114 00:19:34.036354 2832 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 14 00:19:34.036611 kubelet[2832]: E0114 00:19:34.036494 2832 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-md5mt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6f67969d8d-vdxqm_calico-apiserver(1e4bec8e-a684-46cb-852e-ae05ed7b56d7): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 14 00:19:34.037883 kubelet[2832]: E0114 00:19:34.037831 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:19:35.804030 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:19:35.804162 kernel: audit: type=1130 audit(1768349975.800:1613): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-46.224.77.139:22-4.153.228.146:35204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:35.800000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-46.224.77.139:22-4.153.228.146:35204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:35.802072 systemd[1]: Started sshd@105-46.224.77.139:22-4.153.228.146:35204.service - OpenSSH per-connection server daemon (4.153.228.146:35204). 
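The journal from here on alternates between fresh PullImage attempts (whisker at 00:19:22, goldmane and the second apiserver replica at 00:19:33) and "Back-off pulling image" skips in between: kubelet retries a failed image pull on a growing per-image interval instead of on every pod sync. Below is a toy model of that policy, assuming the commonly cited defaults of a 10-second initial delay that doubles up to a 5-minute cap; the constants are an assumption, not read out of this log.

# Toy model of kubelet's per-image pull backoff. The 10 s initial delay and
# 300 s cap are assumed defaults, shown for illustration only.
from typing import Optional

INITIAL_DELAY_S = 10
MAX_DELAY_S = 300

def next_delay(previous: Optional[float]) -> float:
    if previous is None:
        return INITIAL_DELAY_S
    return min(previous * 2, MAX_DELAY_S)

delay, elapsed, schedule = None, 0, []
for _ in range(6):
    delay = next_delay(delay)
    elapsed += delay
    schedule.append(elapsed)
print(schedule)  # [10, 30, 70, 150, 310, 610] seconds after the first failure

Until the backoff window for an image expires, pod_workers only logs the ImagePullBackOff skip seen above rather than touching the registry again.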
Jan 14 00:19:36.342539 kubelet[2832]: E0114 00:19:36.340724 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:19:36.362000 audit[7476]: USER_ACCT pid=7476 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.364973 sshd[7476]: Accepted publickey for core from 4.153.228.146 port 35204 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:36.366000 audit[7476]: CRED_ACQ pid=7476 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.369447 sshd-session[7476]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:36.370084 kernel: audit: type=1101 audit(1768349976.362:1614): pid=7476 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.370141 kernel: audit: type=1103 audit(1768349976.366:1615): pid=7476 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.370163 kernel: audit: type=1006 audit(1768349976.366:1616): pid=7476 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=106 res=1 Jan 14 00:19:36.366000 audit[7476]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe4f81840 a2=3 a3=0 items=0 ppid=1 pid=7476 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=106 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:36.374400 kernel: audit: type=1300 audit(1768349976.366:1616): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe4f81840 a2=3 a3=0 items=0 ppid=1 pid=7476 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=106 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:36.366000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:36.376546 kernel: audit: type=1327 audit(1768349976.366:1616): 
proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:36.381190 systemd-logind[1545]: New session 106 of user core. Jan 14 00:19:36.385839 systemd[1]: Started session-106.scope - Session 106 of User core. Jan 14 00:19:36.388000 audit[7476]: USER_START pid=7476 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.392000 audit[7480]: CRED_ACQ pid=7480 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.396412 kernel: audit: type=1105 audit(1768349976.388:1617): pid=7476 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.396487 kernel: audit: type=1103 audit(1768349976.392:1618): pid=7480 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.789375 sshd[7480]: Connection closed by 4.153.228.146 port 35204 Jan 14 00:19:36.790428 sshd-session[7476]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:36.791000 audit[7476]: USER_END pid=7476 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.791000 audit[7476]: CRED_DISP pid=7476 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.799899 kernel: audit: type=1106 audit(1768349976.791:1619): pid=7476 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.799976 kernel: audit: type=1104 audit(1768349976.791:1620): pid=7476 uid=0 auid=500 ses=106 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:36.798211 systemd[1]: sshd@105-46.224.77.139:22-4.153.228.146:35204.service: Deactivated successfully. Jan 14 00:19:36.799000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@105-46.224.77.139:22-4.153.228.146:35204 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:19:36.803432 systemd[1]: session-106.scope: Deactivated successfully. Jan 14 00:19:36.806061 systemd-logind[1545]: Session 106 logged out. Waiting for processes to exit. Jan 14 00:19:36.809683 systemd-logind[1545]: Removed session 106. Jan 14 00:19:41.334929 kubelet[2832]: E0114 00:19:41.334851 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:19:41.899318 systemd[1]: Started sshd@106-46.224.77.139:22-4.153.228.146:35218.service - OpenSSH per-connection server daemon (4.153.228.146:35218). Jan 14 00:19:41.903224 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:19:41.903258 kernel: audit: type=1130 audit(1768349981.898:1622): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-46.224.77.139:22-4.153.228.146:35218 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:41.898000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-46.224.77.139:22-4.153.228.146:35218 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:19:42.335234 kubelet[2832]: E0114 00:19:42.334841 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:19:42.453000 audit[7505]: USER_ACCT pid=7505 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.458743 sshd[7505]: Accepted publickey for core from 4.153.228.146 port 35218 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:42.459588 kernel: audit: type=1101 audit(1768349982.453:1623): pid=7505 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.459000 audit[7505]: CRED_ACQ pid=7505 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.462862 sshd-session[7505]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:42.465970 kernel: audit: type=1103 audit(1768349982.459:1624): pid=7505 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.466083 kernel: audit: type=1006 audit(1768349982.459:1625): pid=7505 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=107 res=1 Jan 14 00:19:42.466109 kernel: audit: type=1300 audit(1768349982.459:1625): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff304b800 a2=3 a3=0 items=0 ppid=1 pid=7505 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=107 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:42.459000 audit[7505]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff304b800 a2=3 a3=0 items=0 ppid=1 pid=7505 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=107 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:42.459000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:42.467920 kernel: audit: type=1327 audit(1768349982.459:1625): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:42.473629 systemd-logind[1545]: New session 107 of user core. Jan 14 00:19:42.483978 systemd[1]: Started session-107.scope - Session 107 of User core. 
Jan 14 00:19:42.490000 audit[7505]: USER_START pid=7505 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.495663 kernel: audit: type=1105 audit(1768349982.490:1626): pid=7505 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.497000 audit[7509]: CRED_ACQ pid=7509 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.499566 kernel: audit: type=1103 audit(1768349982.497:1627): pid=7509 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.850546 sshd[7509]: Connection closed by 4.153.228.146 port 35218 Jan 14 00:19:42.852639 sshd-session[7505]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:42.853000 audit[7505]: USER_END pid=7505 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.853000 audit[7505]: CRED_DISP pid=7505 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.863457 kernel: audit: type=1106 audit(1768349982.853:1628): pid=7505 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.863514 kernel: audit: type=1104 audit(1768349982.853:1629): pid=7505 uid=0 auid=500 ses=107 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:42.865006 systemd[1]: sshd@106-46.224.77.139:22-4.153.228.146:35218.service: Deactivated successfully. Jan 14 00:19:42.865000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@106-46.224.77.139:22-4.153.228.146:35218 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:42.868181 systemd[1]: session-107.scope: Deactivated successfully. Jan 14 00:19:42.869354 systemd-logind[1545]: Session 107 logged out. Waiting for processes to exit. 
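With the kube-controllers pod, both apiserver replicas, whisker, goldmane and the CSI node driver all stuck in the same loop, the blast radius is easier to read from a tally than from the raw journal. A short Python sketch that consumes kubelet journal lines on stdin and counts the "Error syncing pod, skipping" records per pod and per ghcr.io image; the unit name and regular expressions are keyed to the log shape shown above and are assumptions for any other setup.

# Tally kubelet "Error syncing pod, skipping" records per pod and per image.
# Usage (assumed unit name):  journalctl -u kubelet -o cat | python3 tally.py
import re
import sys
from collections import Counter

POD_RE = re.compile(r'"Error syncing pod, skipping".*?pod="([^"]+)"')
IMAGE_RE = re.compile(r'ghcr\.io/[\w./-]+:[\w.-]+')

pods, images = Counter(), Counter()
for line in sys.stdin:
    match = POD_RE.search(line)
    if not match:
        continue
    pods[match.group(1)] += 1
    # Count each image once per record even if the message repeats it.
    images.update(set(IMAGE_RE.findall(line)))

for pod, count in pods.most_common():
    print(f"{count:4d}  {pod}")
print("----")
for image, count in images.most_common():
    print(f"{count:4d}  {image}")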
Jan 14 00:19:42.871906 systemd-logind[1545]: Removed session 107. Jan 14 00:19:44.333556 kubelet[2832]: E0114 00:19:44.332979 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:19:47.334846 kubelet[2832]: E0114 00:19:47.334207 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:19:47.971000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-46.224.77.139:22-4.153.228.146:57686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:47.971845 systemd[1]: Started sshd@107-46.224.77.139:22-4.153.228.146:57686.service - OpenSSH per-connection server daemon (4.153.228.146:57686). Jan 14 00:19:47.972899 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:19:47.972924 kernel: audit: type=1130 audit(1768349987.971:1631): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-46.224.77.139:22-4.153.228.146:57686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:19:48.335157 kubelet[2832]: E0114 00:19:48.335058 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:19:48.337011 kubelet[2832]: E0114 00:19:48.336977 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:19:48.527000 audit[7522]: USER_ACCT pid=7522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.527989 sshd[7522]: Accepted publickey for core from 4.153.228.146 port 57686 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:48.530000 audit[7522]: CRED_ACQ pid=7522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.533035 kernel: audit: type=1101 audit(1768349988.527:1632): pid=7522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.533115 kernel: audit: type=1103 audit(1768349988.530:1633): pid=7522 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.532930 sshd-session[7522]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:48.536550 kernel: audit: type=1006 audit(1768349988.530:1634): pid=7522 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=108 res=1 Jan 14 00:19:48.537297 kernel: audit: type=1300 audit(1768349988.530:1634): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe549e470 a2=3 a3=0 items=0 ppid=1 pid=7522 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=108 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:48.530000 audit[7522]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe549e470 a2=3 a3=0 items=0 ppid=1 pid=7522 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=108 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:48.530000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:48.541666 kernel: audit: type=1327 audit(1768349988.530:1634): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:48.544746 systemd-logind[1545]: New session 108 of user core. Jan 14 00:19:48.549675 systemd[1]: Started session-108.scope - Session 108 of User core. Jan 14 00:19:48.553000 audit[7522]: USER_START pid=7522 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.559626 kernel: audit: type=1105 audit(1768349988.553:1635): pid=7522 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.559000 audit[7526]: CRED_ACQ pid=7526 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.563559 kernel: audit: type=1103 audit(1768349988.559:1636): pid=7526 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.938302 sshd[7526]: Connection closed by 4.153.228.146 port 57686 Jan 14 00:19:48.939128 sshd-session[7522]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:48.940000 audit[7522]: USER_END pid=7522 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.940000 audit[7522]: CRED_DISP pid=7522 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.948276 kernel: audit: type=1106 audit(1768349988.940:1637): pid=7522 uid=0 auid=500 ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.948409 kernel: audit: type=1104 audit(1768349988.940:1638): pid=7522 uid=0 auid=500 
ses=108 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:48.949232 systemd[1]: sshd@107-46.224.77.139:22-4.153.228.146:57686.service: Deactivated successfully. Jan 14 00:19:48.949000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@107-46.224.77.139:22-4.153.228.146:57686 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:48.951995 systemd[1]: session-108.scope: Deactivated successfully. Jan 14 00:19:48.953370 systemd-logind[1545]: Session 108 logged out. Waiting for processes to exit. Jan 14 00:19:48.957045 systemd-logind[1545]: Removed session 108. Jan 14 00:19:54.049000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-46.224.77.139:22-4.153.228.146:57692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:54.049815 systemd[1]: Started sshd@108-46.224.77.139:22-4.153.228.146:57692.service - OpenSSH per-connection server daemon (4.153.228.146:57692). Jan 14 00:19:54.052131 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:19:54.052224 kernel: audit: type=1130 audit(1768349994.049:1640): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-46.224.77.139:22-4.153.228.146:57692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:54.334279 kubelet[2832]: E0114 00:19:54.334011 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:19:54.336492 kubelet[2832]: E0114 00:19:54.336416 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:19:54.604000 audit[7540]: USER_ACCT pid=7540 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:54.607287 
sshd[7540]: Accepted publickey for core from 4.153.228.146 port 57692 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:19:54.607667 kernel: audit: type=1101 audit(1768349994.604:1641): pid=7540 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:54.609000 audit[7540]: CRED_ACQ pid=7540 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:54.611255 sshd-session[7540]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:19:54.614754 kernel: audit: type=1103 audit(1768349994.609:1642): pid=7540 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:54.614923 kernel: audit: type=1006 audit(1768349994.609:1643): pid=7540 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=109 res=1 Jan 14 00:19:54.609000 audit[7540]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffedc5f30 a2=3 a3=0 items=0 ppid=1 pid=7540 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=109 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:54.617698 kernel: audit: type=1300 audit(1768349994.609:1643): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffedc5f30 a2=3 a3=0 items=0 ppid=1 pid=7540 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=109 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:19:54.609000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:54.620546 kernel: audit: type=1327 audit(1768349994.609:1643): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:19:54.621882 systemd-logind[1545]: New session 109 of user core. Jan 14 00:19:54.627809 systemd[1]: Started session-109.scope - Session 109 of User core. 
Jan 14 00:19:54.632000 audit[7540]: USER_START pid=7540 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:54.637000 audit[7544]: CRED_ACQ pid=7544 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:54.640589 kernel: audit: type=1105 audit(1768349994.632:1644): pid=7540 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:54.640665 kernel: audit: type=1103 audit(1768349994.637:1645): pid=7544 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:55.002686 sshd[7544]: Connection closed by 4.153.228.146 port 57692 Jan 14 00:19:55.003924 sshd-session[7540]: pam_unix(sshd:session): session closed for user core Jan 14 00:19:55.006000 audit[7540]: USER_END pid=7540 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:55.007000 audit[7540]: CRED_DISP pid=7540 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:55.013363 kernel: audit: type=1106 audit(1768349995.006:1646): pid=7540 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:55.013674 kernel: audit: type=1104 audit(1768349995.007:1647): pid=7540 uid=0 auid=500 ses=109 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:19:55.014224 systemd[1]: sshd@108-46.224.77.139:22-4.153.228.146:57692.service: Deactivated successfully. Jan 14 00:19:55.015000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@108-46.224.77.139:22-4.153.228.146:57692 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:19:55.018477 systemd[1]: session-109.scope: Deactivated successfully. Jan 14 00:19:55.022430 systemd-logind[1545]: Session 109 logged out. Waiting for processes to exit. 
Jan 14 00:19:55.023986 systemd-logind[1545]: Removed session 109. Jan 14 00:19:56.333875 kubelet[2832]: E0114 00:19:56.333809 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:19:59.335051 kubelet[2832]: E0114 00:19:59.334949 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:20:00.113126 systemd[1]: Started sshd@109-46.224.77.139:22-4.153.228.146:60286.service - OpenSSH per-connection server daemon (4.153.228.146:60286). Jan 14 00:20:00.116016 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:00.116059 kernel: audit: type=1130 audit(1768350000.112:1649): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-46.224.77.139:22-4.153.228.146:60286 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:00.112000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-46.224.77.139:22-4.153.228.146:60286 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:20:00.650000 audit[7556]: USER_ACCT pid=7556 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:00.651144 sshd[7556]: Accepted publickey for core from 4.153.228.146 port 60286 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:00.654000 audit[7556]: CRED_ACQ pid=7556 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:00.655755 kernel: audit: type=1101 audit(1768350000.650:1650): pid=7556 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:00.656368 sshd-session[7556]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:00.661057 kernel: audit: type=1103 audit(1768350000.654:1651): pid=7556 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:00.661189 kernel: audit: type=1006 audit(1768350000.655:1652): pid=7556 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=110 res=1 Jan 14 00:20:00.655000 audit[7556]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc90a6880 a2=3 a3=0 items=0 ppid=1 pid=7556 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=110 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:00.655000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:00.666412 kernel: audit: type=1300 audit(1768350000.655:1652): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc90a6880 a2=3 a3=0 items=0 ppid=1 pid=7556 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=110 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:00.666503 kernel: audit: type=1327 audit(1768350000.655:1652): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:00.670795 systemd-logind[1545]: New session 110 of user core. Jan 14 00:20:00.676888 systemd[1]: Started session-110.scope - Session 110 of User core. 
Jan 14 00:20:00.681000 audit[7556]: USER_START pid=7556 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:00.683000 audit[7560]: CRED_ACQ pid=7560 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:00.687626 kernel: audit: type=1105 audit(1768350000.681:1653): pid=7556 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:00.687728 kernel: audit: type=1103 audit(1768350000.683:1654): pid=7560 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:01.036275 sshd[7560]: Connection closed by 4.153.228.146 port 60286 Jan 14 00:20:01.038448 sshd-session[7556]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:01.041000 audit[7556]: USER_END pid=7556 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:01.046487 systemd[1]: sshd@109-46.224.77.139:22-4.153.228.146:60286.service: Deactivated successfully. Jan 14 00:20:01.048479 kernel: audit: type=1106 audit(1768350001.041:1655): pid=7556 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:01.048559 kernel: audit: type=1104 audit(1768350001.041:1656): pid=7556 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:01.041000 audit[7556]: CRED_DISP pid=7556 uid=0 auid=500 ses=110 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:01.046000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@109-46.224.77.139:22-4.153.228.146:60286 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:01.054012 systemd[1]: session-110.scope: Deactivated successfully. Jan 14 00:20:01.057817 systemd-logind[1545]: Session 110 logged out. Waiting for processes to exit. 
Jan 14 00:20:01.059234 systemd-logind[1545]: Removed session 110. Jan 14 00:20:02.337905 kubelet[2832]: E0114 00:20:02.336569 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:20:02.342272 kubelet[2832]: E0114 00:20:02.341973 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:20:06.153096 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:06.153243 kernel: audit: type=1130 audit(1768350006.149:1658): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-46.224.77.139:22-4.153.228.146:53474 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:06.149000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-46.224.77.139:22-4.153.228.146:53474 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:06.148986 systemd[1]: Started sshd@110-46.224.77.139:22-4.153.228.146:53474.service - OpenSSH per-connection server daemon (4.153.228.146:53474). 
Jan 14 00:20:06.709000 audit[7594]: USER_ACCT pid=7594 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:06.712211 sshd[7594]: Accepted publickey for core from 4.153.228.146 port 53474 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:06.713546 kernel: audit: type=1101 audit(1768350006.709:1659): pid=7594 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:06.713626 kernel: audit: type=1103 audit(1768350006.711:1660): pid=7594 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:06.711000 audit[7594]: CRED_ACQ pid=7594 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:06.716213 sshd-session[7594]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:06.719376 kernel: audit: type=1006 audit(1768350006.711:1661): pid=7594 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=111 res=1 Jan 14 00:20:06.711000 audit[7594]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff14e3c70 a2=3 a3=0 items=0 ppid=1 pid=7594 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=111 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:06.722576 kernel: audit: type=1300 audit(1768350006.711:1661): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff14e3c70 a2=3 a3=0 items=0 ppid=1 pid=7594 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=111 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:06.711000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:06.729365 kernel: audit: type=1327 audit(1768350006.711:1661): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:06.730409 systemd-logind[1545]: New session 111 of user core. Jan 14 00:20:06.734795 systemd[1]: Started session-111.scope - Session 111 of User core. 
Jan 14 00:20:06.738000 audit[7594]: USER_START pid=7594 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:06.741000 audit[7598]: CRED_ACQ pid=7598 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:06.745535 kernel: audit: type=1105 audit(1768350006.738:1662): pid=7594 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:06.745630 kernel: audit: type=1103 audit(1768350006.741:1663): pid=7598 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:07.092243 sshd[7598]: Connection closed by 4.153.228.146 port 53474 Jan 14 00:20:07.093331 sshd-session[7594]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:07.093000 audit[7594]: USER_END pid=7594 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:07.098953 systemd[1]: session-111.scope: Deactivated successfully. Jan 14 00:20:07.102409 kernel: audit: type=1106 audit(1768350007.093:1664): pid=7594 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:07.102492 kernel: audit: type=1104 audit(1768350007.093:1665): pid=7594 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:07.093000 audit[7594]: CRED_DISP pid=7594 uid=0 auid=500 ses=111 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:07.101706 systemd[1]: sshd@110-46.224.77.139:22-4.153.228.146:53474.service: Deactivated successfully. Jan 14 00:20:07.100000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@110-46.224.77.139:22-4.153.228.146:53474 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:07.105970 systemd-logind[1545]: Session 111 logged out. Waiting for processes to exit. 
Jan 14 00:20:07.108379 systemd-logind[1545]: Removed session 111. Jan 14 00:20:07.334017 kubelet[2832]: E0114 00:20:07.333911 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:20:08.337262 kubelet[2832]: E0114 00:20:08.337096 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:20:11.333764 kubelet[2832]: E0114 00:20:11.333680 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:20:12.212072 systemd[1]: Started sshd@111-46.224.77.139:22-4.153.228.146:53486.service - OpenSSH per-connection server daemon (4.153.228.146:53486). Jan 14 00:20:12.215860 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:12.216064 kernel: audit: type=1130 audit(1768350012.211:1667): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-46.224.77.139:22-4.153.228.146:53486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:12.211000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-46.224.77.139:22-4.153.228.146:53486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:20:12.337340 kubelet[2832]: E0114 00:20:12.337279 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:20:12.771000 audit[7610]: USER_ACCT pid=7610 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:12.773512 sshd[7610]: Accepted publickey for core from 4.153.228.146 port 53486 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:12.776549 kernel: audit: type=1101 audit(1768350012.771:1668): pid=7610 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:12.775000 audit[7610]: CRED_ACQ pid=7610 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:12.778112 sshd-session[7610]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:12.781697 kernel: audit: type=1103 audit(1768350012.775:1669): pid=7610 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:12.781793 kernel: audit: type=1006 audit(1768350012.775:1670): pid=7610 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=112 res=1 Jan 14 00:20:12.775000 audit[7610]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe35dd4f0 a2=3 a3=0 items=0 ppid=1 pid=7610 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=112 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:12.785406 kernel: audit: type=1300 audit(1768350012.775:1670): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe35dd4f0 a2=3 a3=0 items=0 ppid=1 pid=7610 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=112 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:12.775000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:12.788295 kernel: audit: type=1327 
audit(1768350012.775:1670): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:12.789153 systemd-logind[1545]: New session 112 of user core. Jan 14 00:20:12.796148 systemd[1]: Started session-112.scope - Session 112 of User core. Jan 14 00:20:12.799000 audit[7610]: USER_START pid=7610 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:12.803592 kernel: audit: type=1105 audit(1768350012.799:1671): pid=7610 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:12.802000 audit[7614]: CRED_ACQ pid=7614 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:12.809576 kernel: audit: type=1103 audit(1768350012.802:1672): pid=7614 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:13.214946 sshd[7614]: Connection closed by 4.153.228.146 port 53486 Jan 14 00:20:13.215846 sshd-session[7610]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:13.217000 audit[7610]: USER_END pid=7610 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:13.218000 audit[7610]: CRED_DISP pid=7610 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:13.223642 kernel: audit: type=1106 audit(1768350013.217:1673): pid=7610 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:13.223725 kernel: audit: type=1104 audit(1768350013.218:1674): pid=7610 uid=0 auid=500 ses=112 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:13.224882 systemd[1]: sshd@111-46.224.77.139:22-4.153.228.146:53486.service: Deactivated successfully. Jan 14 00:20:13.225000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@111-46.224.77.139:22-4.153.228.146:53486 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:20:13.228572 systemd[1]: session-112.scope: Deactivated successfully. Jan 14 00:20:13.230288 systemd-logind[1545]: Session 112 logged out. Waiting for processes to exit. Jan 14 00:20:13.233032 systemd-logind[1545]: Removed session 112. Jan 14 00:20:14.334558 kubelet[2832]: E0114 00:20:14.333982 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:20:15.335120 kubelet[2832]: E0114 00:20:15.335054 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:20:18.338920 kubelet[2832]: E0114 00:20:18.338863 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:20:18.342347 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:18.342381 kernel: audit: type=1130 audit(1768350018.338:1676): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-46.224.77.139:22-4.153.228.146:56674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:18.338000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-46.224.77.139:22-4.153.228.146:56674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:18.339812 systemd[1]: Started sshd@112-46.224.77.139:22-4.153.228.146:56674.service - OpenSSH per-connection server daemon (4.153.228.146:56674). 
Jan 14 00:20:18.911000 audit[7626]: USER_ACCT pid=7626 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:18.915031 sshd[7626]: Accepted publickey for core from 4.153.228.146 port 56674 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:18.916611 kernel: audit: type=1101 audit(1768350018.911:1677): pid=7626 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:18.917482 sshd-session[7626]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:18.915000 audit[7626]: CRED_ACQ pid=7626 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:18.921746 kernel: audit: type=1103 audit(1768350018.915:1678): pid=7626 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:18.921874 kernel: audit: type=1006 audit(1768350018.915:1679): pid=7626 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=113 res=1 Jan 14 00:20:18.915000 audit[7626]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed278400 a2=3 a3=0 items=0 ppid=1 pid=7626 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=113 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:18.924610 kernel: audit: type=1300 audit(1768350018.915:1679): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed278400 a2=3 a3=0 items=0 ppid=1 pid=7626 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=113 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:18.915000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:18.926562 kernel: audit: type=1327 audit(1768350018.915:1679): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:18.934806 systemd-logind[1545]: New session 113 of user core. Jan 14 00:20:18.941936 systemd[1]: Started session-113.scope - Session 113 of User core. 
Jan 14 00:20:18.945000 audit[7626]: USER_START pid=7626 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:18.950566 kernel: audit: type=1105 audit(1768350018.945:1680): pid=7626 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:18.949000 audit[7630]: CRED_ACQ pid=7630 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:18.954552 kernel: audit: type=1103 audit(1768350018.949:1681): pid=7630 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:19.344696 sshd[7630]: Connection closed by 4.153.228.146 port 56674 Jan 14 00:20:19.345703 sshd-session[7626]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:19.346000 audit[7626]: USER_END pid=7626 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:19.354345 systemd[1]: sshd@112-46.224.77.139:22-4.153.228.146:56674.service: Deactivated successfully. Jan 14 00:20:19.362255 kernel: audit: type=1106 audit(1768350019.346:1682): pid=7626 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:19.362336 kernel: audit: type=1104 audit(1768350019.347:1683): pid=7626 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:19.347000 audit[7626]: CRED_DISP pid=7626 uid=0 auid=500 ses=113 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:19.359729 systemd[1]: session-113.scope: Deactivated successfully. Jan 14 00:20:19.353000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@112-46.224.77.139:22-4.153.228.146:56674 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:19.365505 systemd-logind[1545]: Session 113 logged out. Waiting for processes to exit. 
Jan 14 00:20:19.368188 systemd-logind[1545]: Removed session 113. Jan 14 00:20:22.335266 kubelet[2832]: E0114 00:20:22.335097 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:20:23.333470 kubelet[2832]: E0114 00:20:23.333419 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:20:24.463266 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:24.463396 kernel: audit: type=1130 audit(1768350024.460:1685): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-46.224.77.139:22-4.153.228.146:56684 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:24.460000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-46.224.77.139:22-4.153.228.146:56684 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:24.460813 systemd[1]: Started sshd@113-46.224.77.139:22-4.153.228.146:56684.service - OpenSSH per-connection server daemon (4.153.228.146:56684). 
Jan 14 00:20:25.014000 audit[7644]: USER_ACCT pid=7644 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.015410 sshd[7644]: Accepted publickey for core from 4.153.228.146 port 56684 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:25.019000 audit[7644]: CRED_ACQ pid=7644 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.022026 kernel: audit: type=1101 audit(1768350025.014:1686): pid=7644 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.022089 kernel: audit: type=1103 audit(1768350025.019:1687): pid=7644 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.020737 sshd-session[7644]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:25.024539 kernel: audit: type=1006 audit(1768350025.019:1688): pid=7644 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=114 res=1 Jan 14 00:20:25.019000 audit[7644]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd4080620 a2=3 a3=0 items=0 ppid=1 pid=7644 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=114 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:25.027734 kernel: audit: type=1300 audit(1768350025.019:1688): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd4080620 a2=3 a3=0 items=0 ppid=1 pid=7644 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=114 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:25.019000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:25.030870 kernel: audit: type=1327 audit(1768350025.019:1688): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:25.030970 systemd-logind[1545]: New session 114 of user core. Jan 14 00:20:25.037709 systemd[1]: Started session-114.scope - Session 114 of User core. 
Jan 14 00:20:25.044000 audit[7644]: USER_START pid=7644 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.047000 audit[7648]: CRED_ACQ pid=7648 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.052301 kernel: audit: type=1105 audit(1768350025.044:1689): pid=7644 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.052409 kernel: audit: type=1103 audit(1768350025.047:1690): pid=7648 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.333779 kubelet[2832]: E0114 00:20:25.333514 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:20:25.404777 sshd[7648]: Connection closed by 4.153.228.146 port 56684 Jan 14 00:20:25.405852 sshd-session[7644]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:25.409000 audit[7644]: USER_END pid=7644 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.413891 systemd[1]: sshd@113-46.224.77.139:22-4.153.228.146:56684.service: Deactivated successfully. Jan 14 00:20:25.409000 audit[7644]: CRED_DISP pid=7644 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.417235 systemd[1]: session-114.scope: Deactivated successfully. 
Jan 14 00:20:25.418753 kernel: audit: type=1106 audit(1768350025.409:1691): pid=7644 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.419504 kernel: audit: type=1104 audit(1768350025.409:1692): pid=7644 uid=0 auid=500 ses=114 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:25.413000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@113-46.224.77.139:22-4.153.228.146:56684 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:25.422913 systemd-logind[1545]: Session 114 logged out. Waiting for processes to exit. Jan 14 00:20:25.424286 systemd-logind[1545]: Removed session 114. Jan 14 00:20:26.337090 kubelet[2832]: E0114 00:20:26.336933 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:20:29.333643 kubelet[2832]: E0114 00:20:29.333502 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:20:30.519154 systemd[1]: Started sshd@114-46.224.77.139:22-4.153.228.146:54444.service - OpenSSH per-connection server daemon (4.153.228.146:54444). Jan 14 00:20:30.518000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-46.224.77.139:22-4.153.228.146:54444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:30.523777 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:30.523870 kernel: audit: type=1130 audit(1768350030.518:1694): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-46.224.77.139:22-4.153.228.146:54444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:20:31.081000 audit[7660]: USER_ACCT pid=7660 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.087043 kernel: audit: type=1101 audit(1768350031.081:1695): pid=7660 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.087152 sshd[7660]: Accepted publickey for core from 4.153.228.146 port 54444 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:31.087000 audit[7660]: CRED_ACQ pid=7660 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.090994 kernel: audit: type=1103 audit(1768350031.087:1696): pid=7660 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.091086 kernel: audit: type=1006 audit(1768350031.088:1697): pid=7660 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=115 res=1 Jan 14 00:20:31.091107 kernel: audit: type=1300 audit(1768350031.088:1697): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe042f310 a2=3 a3=0 items=0 ppid=1 pid=7660 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=115 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:31.088000 audit[7660]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe042f310 a2=3 a3=0 items=0 ppid=1 pid=7660 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=115 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:31.089414 sshd-session[7660]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:31.088000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:31.094584 kernel: audit: type=1327 audit(1768350031.088:1697): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:31.100224 systemd-logind[1545]: New session 115 of user core. Jan 14 00:20:31.107892 systemd[1]: Started session-115.scope - Session 115 of User core. 
Jan 14 00:20:31.112000 audit[7660]: USER_START pid=7660 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.117000 audit[7664]: CRED_ACQ pid=7664 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.120125 kernel: audit: type=1105 audit(1768350031.112:1698): pid=7660 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.120224 kernel: audit: type=1103 audit(1768350031.117:1699): pid=7664 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.517571 sshd[7664]: Connection closed by 4.153.228.146 port 54444 Jan 14 00:20:31.518242 sshd-session[7660]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:31.521000 audit[7660]: USER_END pid=7660 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.521000 audit[7660]: CRED_DISP pid=7660 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.526013 systemd[1]: sshd@114-46.224.77.139:22-4.153.228.146:54444.service: Deactivated successfully. Jan 14 00:20:31.527355 kernel: audit: type=1106 audit(1768350031.521:1700): pid=7660 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.527637 kernel: audit: type=1104 audit(1768350031.521:1701): pid=7660 uid=0 auid=500 ses=115 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:31.525000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@114-46.224.77.139:22-4.153.228.146:54444 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:31.529655 systemd[1]: session-115.scope: Deactivated successfully. Jan 14 00:20:31.533059 systemd-logind[1545]: Session 115 logged out. Waiting for processes to exit. 
Jan 14 00:20:31.536138 systemd-logind[1545]: Removed session 115. Jan 14 00:20:32.334723 kubelet[2832]: E0114 00:20:32.334488 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:20:36.335444 kubelet[2832]: E0114 00:20:36.335193 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:20:36.632814 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:36.632920 kernel: audit: type=1130 audit(1768350036.629:1703): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-46.224.77.139:22-4.153.228.146:60100 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:36.629000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-46.224.77.139:22-4.153.228.146:60100 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:36.630313 systemd[1]: Started sshd@115-46.224.77.139:22-4.153.228.146:60100.service - OpenSSH per-connection server daemon (4.153.228.146:60100). 
Jan 14 00:20:37.196000 audit[7701]: USER_ACCT pid=7701 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.199268 sshd[7701]: Accepted publickey for core from 4.153.228.146 port 60100 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:37.200582 kernel: audit: type=1101 audit(1768350037.196:1704): pid=7701 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.202000 audit[7701]: CRED_ACQ pid=7701 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.206270 sshd-session[7701]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:37.208456 kernel: audit: type=1103 audit(1768350037.202:1705): pid=7701 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.208553 kernel: audit: type=1006 audit(1768350037.204:1706): pid=7701 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=116 res=1 Jan 14 00:20:37.204000 audit[7701]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffacac4f0 a2=3 a3=0 items=0 ppid=1 pid=7701 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=116 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:37.211538 kernel: audit: type=1300 audit(1768350037.204:1706): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffacac4f0 a2=3 a3=0 items=0 ppid=1 pid=7701 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=116 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:37.204000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:37.213603 kernel: audit: type=1327 audit(1768350037.204:1706): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:37.216897 systemd-logind[1545]: New session 116 of user core. Jan 14 00:20:37.225235 systemd[1]: Started session-116.scope - Session 116 of User core. 
Jan 14 00:20:37.229000 audit[7701]: USER_START pid=7701 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.232000 audit[7705]: CRED_ACQ pid=7705 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.235769 kernel: audit: type=1105 audit(1768350037.229:1707): pid=7701 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.235973 kernel: audit: type=1103 audit(1768350037.232:1708): pid=7705 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.334481 kubelet[2832]: E0114 00:20:37.334382 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:20:37.603548 sshd[7705]: Connection closed by 4.153.228.146 port 60100 Jan 14 00:20:37.604387 sshd-session[7701]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:37.605000 audit[7701]: USER_END pid=7701 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.605000 audit[7701]: CRED_DISP pid=7701 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.610882 systemd[1]: sshd@115-46.224.77.139:22-4.153.228.146:60100.service: Deactivated successfully. 
Jan 14 00:20:37.612103 kernel: audit: type=1106 audit(1768350037.605:1709): pid=7701 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.612190 kernel: audit: type=1104 audit(1768350037.605:1710): pid=7701 uid=0 auid=500 ses=116 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:37.611000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@115-46.224.77.139:22-4.153.228.146:60100 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:37.616390 systemd[1]: session-116.scope: Deactivated successfully. Jan 14 00:20:37.620673 systemd-logind[1545]: Session 116 logged out. Waiting for processes to exit. Jan 14 00:20:37.622297 systemd-logind[1545]: Removed session 116. Jan 14 00:20:38.337079 kubelet[2832]: E0114 00:20:38.337007 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:20:41.337501 kubelet[2832]: E0114 00:20:41.337289 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:20:42.334707 kubelet[2832]: E0114 00:20:42.334419 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:20:42.718000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-46.224.77.139:22-4.153.228.146:60108 comm="systemd" 
exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:42.719625 systemd[1]: Started sshd@116-46.224.77.139:22-4.153.228.146:60108.service - OpenSSH per-connection server daemon (4.153.228.146:60108). Jan 14 00:20:42.721775 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:42.722779 kernel: audit: type=1130 audit(1768350042.718:1712): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-46.224.77.139:22-4.153.228.146:60108 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:43.294000 audit[7716]: USER_ACCT pid=7716 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.299553 sshd[7716]: Accepted publickey for core from 4.153.228.146 port 60108 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:43.299000 audit[7716]: CRED_ACQ pid=7716 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.303118 kernel: audit: type=1101 audit(1768350043.294:1713): pid=7716 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.303217 kernel: audit: type=1103 audit(1768350043.299:1714): pid=7716 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.304198 sshd-session[7716]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:43.307200 kernel: audit: type=1006 audit(1768350043.299:1715): pid=7716 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=117 res=1 Jan 14 00:20:43.299000 audit[7716]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd37572d0 a2=3 a3=0 items=0 ppid=1 pid=7716 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=117 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:43.310321 kernel: audit: type=1300 audit(1768350043.299:1715): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd37572d0 a2=3 a3=0 items=0 ppid=1 pid=7716 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=117 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:43.311088 kernel: audit: type=1327 audit(1768350043.299:1715): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:43.299000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:43.317601 systemd-logind[1545]: New session 117 of user core. Jan 14 00:20:43.321921 systemd[1]: Started session-117.scope - Session 117 of User core. 
Jan 14 00:20:43.325000 audit[7716]: USER_START pid=7716 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.333555 kernel: audit: type=1105 audit(1768350043.325:1716): pid=7716 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.333000 audit[7720]: CRED_ACQ pid=7720 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.338560 kernel: audit: type=1103 audit(1768350043.333:1717): pid=7720 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.735809 sshd[7720]: Connection closed by 4.153.228.146 port 60108 Jan 14 00:20:43.737083 sshd-session[7716]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:43.737000 audit[7716]: USER_END pid=7716 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.737000 audit[7716]: CRED_DISP pid=7716 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.745438 kernel: audit: type=1106 audit(1768350043.737:1718): pid=7716 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.745752 kernel: audit: type=1104 audit(1768350043.737:1719): pid=7716 uid=0 auid=500 ses=117 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:43.746934 systemd[1]: sshd@116-46.224.77.139:22-4.153.228.146:60108.service: Deactivated successfully. Jan 14 00:20:43.746000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@116-46.224.77.139:22-4.153.228.146:60108 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:43.751770 systemd[1]: session-117.scope: Deactivated successfully. Jan 14 00:20:43.757848 systemd-logind[1545]: Session 117 logged out. Waiting for processes to exit. 
Jan 14 00:20:43.759066 systemd-logind[1545]: Removed session 117. Jan 14 00:20:45.334646 kubelet[2832]: E0114 00:20:45.334577 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:20:48.843781 systemd[1]: Started sshd@117-46.224.77.139:22-4.153.228.146:32832.service - OpenSSH per-connection server daemon (4.153.228.146:32832). Jan 14 00:20:48.847215 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:48.847249 kernel: audit: type=1130 audit(1768350048.842:1721): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-46.224.77.139:22-4.153.228.146:32832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:48.842000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-46.224.77.139:22-4.153.228.146:32832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:49.390000 audit[7734]: USER_ACCT pid=7734 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.397260 sshd[7734]: Accepted publickey for core from 4.153.228.146 port 32832 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:49.398889 sshd-session[7734]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:49.399545 kernel: audit: type=1101 audit(1768350049.390:1722): pid=7734 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.396000 audit[7734]: CRED_ACQ pid=7734 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.405539 kernel: audit: type=1103 audit(1768350049.396:1723): pid=7734 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.405616 kernel: audit: type=1006 audit(1768350049.396:1724): pid=7734 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=118 res=1 Jan 14 00:20:49.405639 kernel: audit: type=1300 audit(1768350049.396:1724): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd36c30f0 a2=3 a3=0 items=0 ppid=1 pid=7734 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=118 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:49.396000 audit[7734]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd36c30f0 a2=3 a3=0 items=0 ppid=1 pid=7734 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=118 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:49.396000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:49.406831 kernel: audit: type=1327 audit(1768350049.396:1724): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:49.413829 systemd-logind[1545]: New session 118 of user core. Jan 14 00:20:49.420744 systemd[1]: Started session-118.scope - Session 118 of User core. Jan 14 00:20:49.423000 audit[7734]: USER_START pid=7734 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.428000 audit[7738]: CRED_ACQ pid=7738 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.432394 kernel: audit: type=1105 audit(1768350049.423:1725): pid=7734 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.432454 kernel: audit: type=1103 audit(1768350049.428:1726): pid=7738 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.789729 sshd[7738]: Connection closed by 4.153.228.146 port 32832 Jan 14 00:20:49.790124 sshd-session[7734]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:49.792000 audit[7734]: USER_END pid=7734 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.798010 systemd[1]: sshd@117-46.224.77.139:22-4.153.228.146:32832.service: Deactivated successfully. 
Jan 14 00:20:49.792000 audit[7734]: CRED_DISP pid=7734 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.801554 kernel: audit: type=1106 audit(1768350049.792:1727): pid=7734 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.801625 kernel: audit: type=1104 audit(1768350049.792:1728): pid=7734 uid=0 auid=500 ses=118 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:49.801875 systemd[1]: session-118.scope: Deactivated successfully. Jan 14 00:20:49.796000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@117-46.224.77.139:22-4.153.228.146:32832 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:49.804186 systemd-logind[1545]: Session 118 logged out. Waiting for processes to exit. Jan 14 00:20:49.808379 systemd-logind[1545]: Removed session 118. Jan 14 00:20:51.336568 kubelet[2832]: E0114 00:20:51.335991 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:20:51.339334 kubelet[2832]: E0114 00:20:51.339281 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:20:53.336001 kubelet[2832]: E0114 00:20:53.335743 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:20:53.337166 kubelet[2832]: E0114 00:20:53.337040 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:20:54.335560 kubelet[2832]: E0114 00:20:54.335248 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:20:54.907791 systemd[1]: Started sshd@118-46.224.77.139:22-4.153.228.146:50854.service - OpenSSH per-connection server daemon (4.153.228.146:50854). Jan 14 00:20:54.912838 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:20:54.912914 kernel: audit: type=1130 audit(1768350054.906:1730): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-46.224.77.139:22-4.153.228.146:50854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:54.906000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-46.224.77.139:22-4.153.228.146:50854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:20:55.460557 sshd[7753]: Accepted publickey for core from 4.153.228.146 port 50854 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:20:55.457000 audit[7753]: USER_ACCT pid=7753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.464848 sshd-session[7753]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:20:55.467756 kernel: audit: type=1101 audit(1768350055.457:1731): pid=7753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.467819 kernel: audit: type=1103 audit(1768350055.462:1732): pid=7753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.462000 audit[7753]: CRED_ACQ pid=7753 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.471138 kernel: audit: type=1006 audit(1768350055.462:1733): pid=7753 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=119 res=1 Jan 14 00:20:55.475296 kernel: audit: type=1300 audit(1768350055.462:1733): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff3836230 a2=3 a3=0 items=0 ppid=1 pid=7753 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=119 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:55.462000 audit[7753]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff3836230 a2=3 a3=0 items=0 ppid=1 pid=7753 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=119 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:20:55.479607 kernel: audit: type=1327 audit(1768350055.462:1733): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:55.462000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:20:55.484564 systemd-logind[1545]: New session 119 of user core. Jan 14 00:20:55.487788 systemd[1]: Started session-119.scope - Session 119 of User core. 
Jan 14 00:20:55.491000 audit[7753]: USER_START pid=7753 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.500167 kernel: audit: type=1105 audit(1768350055.491:1734): pid=7753 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.500305 kernel: audit: type=1103 audit(1768350055.495:1735): pid=7758 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.495000 audit[7758]: CRED_ACQ pid=7758 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.856712 sshd[7758]: Connection closed by 4.153.228.146 port 50854 Jan 14 00:20:55.857037 sshd-session[7753]: pam_unix(sshd:session): session closed for user core Jan 14 00:20:55.858000 audit[7753]: USER_END pid=7753 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.858000 audit[7753]: CRED_DISP pid=7753 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.865179 kernel: audit: type=1106 audit(1768350055.858:1736): pid=7753 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.865259 kernel: audit: type=1104 audit(1768350055.858:1737): pid=7753 uid=0 auid=500 ses=119 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:20:55.867733 systemd-logind[1545]: Session 119 logged out. Waiting for processes to exit. Jan 14 00:20:55.869674 systemd[1]: sshd@118-46.224.77.139:22-4.153.228.146:50854.service: Deactivated successfully. Jan 14 00:20:55.868000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@118-46.224.77.139:22-4.153.228.146:50854 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:20:55.874396 systemd[1]: session-119.scope: Deactivated successfully. 
Jan 14 00:20:55.880084 systemd-logind[1545]: Removed session 119. Jan 14 00:20:56.342567 kubelet[2832]: E0114 00:20:56.342514 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:21:00.964000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-46.224.77.139:22-4.153.228.146:50864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:21:00.964727 systemd[1]: Started sshd@119-46.224.77.139:22-4.153.228.146:50864.service - OpenSSH per-connection server daemon (4.153.228.146:50864). Jan 14 00:21:00.967103 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:21:00.967221 kernel: audit: type=1130 audit(1768350060.964:1739): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-46.224.77.139:22-4.153.228.146:50864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:21:01.511000 audit[7769]: USER_ACCT pid=7769 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.514063 sshd[7769]: Accepted publickey for core from 4.153.228.146 port 50864 ssh2: RSA SHA256:glLkFe67bcWfTsvhOKopnTNWDCglO1/VUkYWlAT9GVU Jan 14 00:21:01.515865 kernel: audit: type=1101 audit(1768350061.511:1740): pid=7769 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.516000 audit[7769]: CRED_ACQ pid=7769 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.520428 sshd-session[7769]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 14 00:21:01.522694 kernel: audit: type=1103 audit(1768350061.516:1741): pid=7769 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.522771 kernel: audit: type=1006 audit(1768350061.519:1742): pid=7769 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=120 res=1 Jan 14 00:21:01.522799 kernel: audit: type=1300 audit(1768350061.519:1742): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd3b9b5f0 a2=3 a3=0 items=0 ppid=1 pid=7769 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=120 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:01.519000 audit[7769]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd3b9b5f0 a2=3 a3=0 items=0 ppid=1 pid=7769 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=120 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:01.519000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:21:01.526846 kernel: audit: type=1327 audit(1768350061.519:1742): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 14 00:21:01.532389 systemd-logind[1545]: New session 120 of user core. Jan 14 00:21:01.536820 systemd[1]: Started session-120.scope - Session 120 of User core. Jan 14 00:21:01.543000 audit[7769]: USER_START pid=7769 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.551188 kernel: audit: type=1105 audit(1768350061.543:1743): pid=7769 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.551283 kernel: audit: type=1103 audit(1768350061.548:1744): pid=7773 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.548000 audit[7773]: CRED_ACQ pid=7773 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.899317 sshd[7773]: Connection closed by 4.153.228.146 port 50864 Jan 14 00:21:01.902772 sshd-session[7769]: pam_unix(sshd:session): session closed for user core Jan 14 00:21:01.905000 audit[7769]: USER_END pid=7769 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.911190 systemd-logind[1545]: Session 120 logged out. Waiting for processes to exit. 
Jan 14 00:21:01.905000 audit[7769]: CRED_DISP pid=7769 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.913971 kernel: audit: type=1106 audit(1768350061.905:1745): pid=7769 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.914055 kernel: audit: type=1104 audit(1768350061.905:1746): pid=7769 uid=0 auid=500 ses=120 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=4.153.228.146 addr=4.153.228.146 terminal=ssh res=success' Jan 14 00:21:01.915966 systemd[1]: sshd@119-46.224.77.139:22-4.153.228.146:50864.service: Deactivated successfully. Jan 14 00:21:01.915000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@119-46.224.77.139:22-4.153.228.146:50864 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:21:01.921658 systemd[1]: session-120.scope: Deactivated successfully. Jan 14 00:21:01.926026 systemd-logind[1545]: Removed session 120. Jan 14 00:21:03.333771 kubelet[2832]: E0114 00:21:03.333721 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:21:03.336105 kubelet[2832]: E0114 00:21:03.336058 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:21:06.336580 kubelet[2832]: E0114 00:21:06.335863 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:21:07.335421 kubelet[2832]: E0114 00:21:07.335045 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:21:09.334876 kubelet[2832]: E0114 00:21:09.334452 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:21:09.336015 kubelet[2832]: E0114 00:21:09.335956 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:21:15.334638 kubelet[2832]: E0114 00:21:15.334481 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:21:18.336623 kubelet[2832]: E0114 00:21:18.336424 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:21:18.338598 kubelet[2832]: E0114 00:21:18.338485 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for 
\"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:21:18.339774 kubelet[2832]: E0114 00:21:18.339725 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:21:22.335784 kubelet[2832]: E0114 00:21:22.335737 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:21:22.336806 kubelet[2832]: E0114 00:21:22.336578 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:21:26.334703 kubelet[2832]: E0114 00:21:26.334604 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:21:30.336608 kubelet[2832]: E0114 
00:21:30.336534 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-8jmff" podUID="6c288445-910a-4d1d-9b62-12f5155b11be" Jan 14 00:21:31.334028 kubelet[2832]: E0114 00:21:31.333943 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-684bfd8c46-zxdr6" podUID="5ea780f2-7146-4be4-95de-faccba85fdbd" Jan 14 00:21:33.333743 kubelet[2832]: E0114 00:21:33.333689 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-7bt2c" podUID="3e7b6341-e94e-4a0e-be63-0d1b1ff1d4c4" Jan 14 00:21:33.774231 systemd[1]: cri-containerd-f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7.scope: Deactivated successfully. Jan 14 00:21:33.777750 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 14 00:21:33.777973 kernel: audit: type=1334 audit(1768350093.775:1748): prog-id=254 op=LOAD Jan 14 00:21:33.775000 audit: BPF prog-id=254 op=LOAD Jan 14 00:21:33.779210 kernel: audit: type=1334 audit(1768350093.777:1749): prog-id=91 op=UNLOAD Jan 14 00:21:33.777000 audit: BPF prog-id=91 op=UNLOAD Jan 14 00:21:33.775374 systemd[1]: cri-containerd-f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7.scope: Consumed 10.095s CPU time, 62.6M memory peak, 3.7M read from disk. 
Jan 14 00:21:33.779395 containerd[1590]: time="2026-01-14T00:21:33.779078453Z" level=info msg="received container exit event container_id:\"f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7\" id:\"f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7\" pid:2690 exit_status:1 exited_at:{seconds:1768350093 nanos:778417273}" Jan 14 00:21:33.778000 audit: BPF prog-id=106 op=UNLOAD Jan 14 00:21:33.781941 kernel: audit: type=1334 audit(1768350093.778:1750): prog-id=106 op=UNLOAD Jan 14 00:21:33.782261 kernel: audit: type=1334 audit(1768350093.778:1751): prog-id=110 op=UNLOAD Jan 14 00:21:33.778000 audit: BPF prog-id=110 op=UNLOAD Jan 14 00:21:33.783729 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories... Jan 14 00:21:33.815815 systemd-tmpfiles[7856]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 14 00:21:33.815835 systemd-tmpfiles[7856]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 14 00:21:33.816532 systemd-tmpfiles[7856]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 14 00:21:33.819211 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7-rootfs.mount: Deactivated successfully. Jan 14 00:21:33.821009 systemd-tmpfiles[7856]: ACLs are not supported, ignoring. Jan 14 00:21:33.821065 systemd-tmpfiles[7856]: ACLs are not supported, ignoring. Jan 14 00:21:33.828957 systemd-tmpfiles[7856]: Detected autofs mount point /boot during canonicalization of boot. Jan 14 00:21:33.828973 systemd-tmpfiles[7856]: Skipping /boot Jan 14 00:21:33.838230 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully. Jan 14 00:21:33.838772 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories. Jan 14 00:21:33.838000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:21:33.841000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:21:33.846756 kernel: audit: type=1130 audit(1768350093.838:1752): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 14 00:21:33.846874 kernel: audit: type=1131 audit(1768350093.841:1753): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-clean comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 14 00:21:33.866605 kubelet[2832]: I0114 00:21:33.865972 2832 scope.go:117] "RemoveContainer" containerID="f2ae447358501c1fec2af67d4616cba7b7560190fddd13901f036ca9cb2386d7" Jan 14 00:21:33.873462 containerd[1590]: time="2026-01-14T00:21:33.873353895Z" level=info msg="CreateContainer within sandbox \"7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 14 00:21:33.886049 containerd[1590]: time="2026-01-14T00:21:33.885991442Z" level=info msg="Container b507d0b00572cd8ec338942642e233122c3f49e11b5cab8b1fd14f99ebd09d8e: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:21:33.888562 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1161809575.mount: Deactivated successfully. Jan 14 00:21:33.900504 containerd[1590]: time="2026-01-14T00:21:33.900389882Z" level=info msg="CreateContainer within sandbox \"7861594a7d84cb26262b927db2ad31405bcd849c42db799e56ffd3d142b5ec41\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"b507d0b00572cd8ec338942642e233122c3f49e11b5cab8b1fd14f99ebd09d8e\"" Jan 14 00:21:33.901433 containerd[1590]: time="2026-01-14T00:21:33.901368072Z" level=info msg="StartContainer for \"b507d0b00572cd8ec338942642e233122c3f49e11b5cab8b1fd14f99ebd09d8e\"" Jan 14 00:21:33.903107 containerd[1590]: time="2026-01-14T00:21:33.903045283Z" level=info msg="connecting to shim b507d0b00572cd8ec338942642e233122c3f49e11b5cab8b1fd14f99ebd09d8e" address="unix:///run/containerd/s/e2e8f688f643db3c26029dc7cb440cdf2959ea46d3db5dc1f392128db2a0374e" protocol=ttrpc version=3 Jan 14 00:21:33.925800 systemd[1]: Started cri-containerd-b507d0b00572cd8ec338942642e233122c3f49e11b5cab8b1fd14f99ebd09d8e.scope - libcontainer container b507d0b00572cd8ec338942642e233122c3f49e11b5cab8b1fd14f99ebd09d8e. 
Jan 14 00:21:33.943839 kernel: audit: type=1334 audit(1768350093.941:1754): prog-id=255 op=LOAD Jan 14 00:21:33.941000 audit: BPF prog-id=255 op=LOAD Jan 14 00:21:33.943000 audit: BPF prog-id=256 op=LOAD Jan 14 00:21:33.943000 audit[7872]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2555 pid=7872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:33.948045 kernel: audit: type=1334 audit(1768350093.943:1755): prog-id=256 op=LOAD Jan 14 00:21:33.948708 kernel: audit: type=1300 audit(1768350093.943:1755): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2555 pid=7872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:33.949048 kernel: audit: type=1327 audit(1768350093.943:1755): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235303764306230303537326364386563333338393432363432653233 Jan 14 00:21:33.943000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235303764306230303537326364386563333338393432363432653233 Jan 14 00:21:33.944000 audit: BPF prog-id=256 op=UNLOAD Jan 14 00:21:33.944000 audit[7872]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2555 pid=7872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:33.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235303764306230303537326364386563333338393432363432653233 Jan 14 00:21:33.944000 audit: BPF prog-id=257 op=LOAD Jan 14 00:21:33.944000 audit[7872]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2555 pid=7872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:33.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235303764306230303537326364386563333338393432363432653233 Jan 14 00:21:33.944000 audit: BPF prog-id=258 op=LOAD Jan 14 00:21:33.944000 audit[7872]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2555 pid=7872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:33.944000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235303764306230303537326364386563333338393432363432653233 Jan 14 00:21:33.944000 audit: BPF prog-id=258 op=UNLOAD Jan 14 00:21:33.944000 audit[7872]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2555 pid=7872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:33.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235303764306230303537326364386563333338393432363432653233 Jan 14 00:21:33.944000 audit: BPF prog-id=257 op=UNLOAD Jan 14 00:21:33.944000 audit[7872]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2555 pid=7872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:33.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235303764306230303537326364386563333338393432363432653233 Jan 14 00:21:33.944000 audit: BPF prog-id=259 op=LOAD Jan 14 00:21:33.944000 audit[7872]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2555 pid=7872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:33.944000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235303764306230303537326364386563333338393432363432653233 Jan 14 00:21:33.987592 containerd[1590]: time="2026-01-14T00:21:33.987548827Z" level=info msg="StartContainer for \"b507d0b00572cd8ec338942642e233122c3f49e11b5cab8b1fd14f99ebd09d8e\" returns successfully" Jan 14 00:21:34.205677 kubelet[2832]: E0114 00:21:34.205601 2832 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:32890->10.0.0.2:2379: read: connection timed out" Jan 14 00:21:34.219393 systemd[1]: cri-containerd-400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b.scope: Deactivated successfully. Jan 14 00:21:34.219948 systemd[1]: cri-containerd-400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b.scope: Consumed 2min 30.381s CPU time, 120.1M memory peak. 
Jan 14 00:21:34.222000 audit: BPF prog-id=144 op=UNLOAD Jan 14 00:21:34.222000 audit: BPF prog-id=148 op=UNLOAD Jan 14 00:21:34.225613 containerd[1590]: time="2026-01-14T00:21:34.225513990Z" level=info msg="received container exit event container_id:\"400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b\" id:\"400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b\" pid:3159 exit_status:1 exited_at:{seconds:1768350094 nanos:224497479}" Jan 14 00:21:34.803323 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b-rootfs.mount: Deactivated successfully. Jan 14 00:21:34.871064 kubelet[2832]: I0114 00:21:34.870846 2832 scope.go:117] "RemoveContainer" containerID="400821a6f1a8f3fecea025a94ff69575cb3f9e567358d505ffad646fa0df313b" Jan 14 00:21:34.886609 containerd[1590]: time="2026-01-14T00:21:34.885401708Z" level=info msg="CreateContainer within sandbox \"e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Jan 14 00:21:34.897655 containerd[1590]: time="2026-01-14T00:21:34.897614682Z" level=info msg="Container b35aacfc87364a280ac670689b497fa8f9cca9300870e2bc7236baca13788997: CDI devices from CRI Config.CDIDevices: []" Jan 14 00:21:34.905921 containerd[1590]: time="2026-01-14T00:21:34.905871734Z" level=info msg="CreateContainer within sandbox \"e31b70c6d7e37036e049ea3561581dad05912c288fbd74ebbc5f3b3364a43495\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"b35aacfc87364a280ac670689b497fa8f9cca9300870e2bc7236baca13788997\"" Jan 14 00:21:34.906699 containerd[1590]: time="2026-01-14T00:21:34.906667919Z" level=info msg="StartContainer for \"b35aacfc87364a280ac670689b497fa8f9cca9300870e2bc7236baca13788997\"" Jan 14 00:21:34.907638 containerd[1590]: time="2026-01-14T00:21:34.907577386Z" level=info msg="connecting to shim b35aacfc87364a280ac670689b497fa8f9cca9300870e2bc7236baca13788997" address="unix:///run/containerd/s/efa8e772016b0eb7b52b1c2214aef417240b316ade27f4c8e890824fd30a31ce" protocol=ttrpc version=3 Jan 14 00:21:34.934026 systemd[1]: Started cri-containerd-b35aacfc87364a280ac670689b497fa8f9cca9300870e2bc7236baca13788997.scope - libcontainer container b35aacfc87364a280ac670689b497fa8f9cca9300870e2bc7236baca13788997. 
Jan 14 00:21:34.992000 audit: BPF prog-id=260 op=LOAD Jan 14 00:21:34.993000 audit: BPF prog-id=261 op=LOAD Jan 14 00:21:34.993000 audit[7915]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2956 pid=7915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:34.993000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233356161636663383733363461323830616336373036383962343937 Jan 14 00:21:34.993000 audit: BPF prog-id=261 op=UNLOAD Jan 14 00:21:34.993000 audit[7915]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2956 pid=7915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:34.993000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233356161636663383733363461323830616336373036383962343937 Jan 14 00:21:34.993000 audit: BPF prog-id=262 op=LOAD Jan 14 00:21:34.993000 audit[7915]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2956 pid=7915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:34.993000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233356161636663383733363461323830616336373036383962343937 Jan 14 00:21:34.993000 audit: BPF prog-id=263 op=LOAD Jan 14 00:21:34.993000 audit[7915]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2956 pid=7915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:34.993000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233356161636663383733363461323830616336373036383962343937 Jan 14 00:21:34.993000 audit: BPF prog-id=263 op=UNLOAD Jan 14 00:21:34.993000 audit[7915]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2956 pid=7915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:34.993000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233356161636663383733363461323830616336373036383962343937 Jan 14 00:21:34.993000 audit: BPF prog-id=262 op=UNLOAD Jan 14 00:21:34.993000 audit[7915]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2956 pid=7915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:34.993000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233356161636663383733363461323830616336373036383962343937 Jan 14 00:21:34.993000 audit: BPF prog-id=264 op=LOAD Jan 14 00:21:34.993000 audit[7915]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2956 pid=7915 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 14 00:21:34.993000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233356161636663383733363461323830616336373036383962343937 Jan 14 00:21:35.022367 containerd[1590]: time="2026-01-14T00:21:35.022305739Z" level=info msg="StartContainer for \"b35aacfc87364a280ac670689b497fa8f9cca9300870e2bc7236baca13788997\" returns successfully" Jan 14 00:21:35.333154 kubelet[2832]: E0114 00:21:35.333094 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-b44cc6f4-gxl6c" podUID="5ee70bb0-55b7-4a80-b5cb-3133091615ae" Jan 14 00:21:36.337004 kubelet[2832]: E0114 00:21:36.336962 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6f67969d8d-vdxqm" podUID="1e4bec8e-a684-46cb-852e-ae05ed7b56d7" Jan 14 00:21:38.334375 kubelet[2832]: E0114 00:21:38.334191 2832 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-hrn72" podUID="1e53bd66-4746-482e-bb2b-bfd29a1ef20e" Jan 14 00:21:38.813250 kubelet[2832]: E0114 00:21:38.813041 2832 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:60960->10.0.0.2:2379: read: connection timed out" 
event="&Event{ObjectMeta:{kube-apiserver-ci-4547-0-0-n-fb1a601aa4.188a7109478435ab kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4547-0-0-n-fb1a601aa4,UID:08b119e21354abecec3569c2fc59abfa,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-n-fb1a601aa4,},FirstTimestamp:2026-01-14 00:21:28.364832171 +0000 UTC m=+822.167243518,LastTimestamp:2026-01-14 00:21:28.364832171 +0000 UTC m=+822.167243518,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-n-fb1a601aa4,}"