Jul 15 23:11:49.786499 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Jul 15 23:11:49.786520 kernel: Linux version 6.12.36-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT Tue Jul 15 22:00:45 -00 2025
Jul 15 23:11:49.786530 kernel: KASLR enabled
Jul 15 23:11:49.786535 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Jul 15 23:11:49.786541 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Jul 15 23:11:49.786546 kernel: random: crng init done
Jul 15 23:11:49.786553 kernel: secureboot: Secure boot disabled
Jul 15 23:11:49.786558 kernel: ACPI: Early table checksum verification disabled
Jul 15 23:11:49.786564 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Jul 15 23:11:49.786570 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Jul 15 23:11:49.786577 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:11:49.786583 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:11:49.786588 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:11:49.786594 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:11:49.786601 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:11:49.786608 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:11:49.786614 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:11:49.786620 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:11:49.786626 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Jul 15 23:11:49.786632 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Jul 15 23:11:49.786638 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Jul 15 23:11:49.786644 kernel: ACPI: Use ACPI SPCR as default console: Yes
Jul 15 23:11:49.786650 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Jul 15 23:11:49.786656 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Jul 15 23:11:49.786662 kernel: Zone ranges:
Jul 15 23:11:49.786669 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Jul 15 23:11:49.786675 kernel: DMA32 empty
Jul 15 23:11:49.786681 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Jul 15 23:11:49.786687 kernel: Device empty
Jul 15 23:11:49.786693 kernel: Movable zone start for each node
Jul 15 23:11:49.786698 kernel: Early memory node ranges
Jul 15 23:11:49.786704 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Jul 15 23:11:49.786710 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Jul 15 23:11:49.786716 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Jul 15 23:11:49.786722 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Jul 15 23:11:49.786728 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Jul 15 23:11:49.786734 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Jul 15 23:11:49.786740 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Jul 15 23:11:49.786748 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Jul 15 23:11:49.786754 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Jul 15 23:11:49.786762 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Jul 15 23:11:49.786768 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Jul 15 23:11:49.786775 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Jul 15 23:11:49.786782 kernel: psci: probing for conduit method from ACPI.
Jul 15 23:11:49.786789 kernel: psci: PSCIv1.1 detected in firmware.
Jul 15 23:11:49.786795 kernel: psci: Using standard PSCI v0.2 function IDs
Jul 15 23:11:49.786801 kernel: psci: Trusted OS migration not required
Jul 15 23:11:49.786808 kernel: psci: SMC Calling Convention v1.1
Jul 15 23:11:49.786831 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Jul 15 23:11:49.786838 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Jul 15 23:11:49.786862 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Jul 15 23:11:49.786870 kernel: pcpu-alloc: [0] 0 [0] 1
Jul 15 23:11:49.786876 kernel: Detected PIPT I-cache on CPU0
Jul 15 23:11:49.786882 kernel: CPU features: detected: GIC system register CPU interface
Jul 15 23:11:49.786891 kernel: CPU features: detected: Spectre-v4
Jul 15 23:11:49.786898 kernel: CPU features: detected: Spectre-BHB
Jul 15 23:11:49.786904 kernel: CPU features: kernel page table isolation forced ON by KASLR
Jul 15 23:11:49.786910 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Jul 15 23:11:49.786917 kernel: CPU features: detected: ARM erratum 1418040
Jul 15 23:11:49.786923 kernel: CPU features: detected: SSBS not fully self-synchronizing
Jul 15 23:11:49.786929 kernel: alternatives: applying boot alternatives
Jul 15 23:11:49.786937 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=6efbcbd16e8e41b645be9f8e34b328753e37d282675200dab08e504f8e58a578
Jul 15 23:11:49.786944 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Jul 15 23:11:49.786950 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Jul 15 23:11:49.786958 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Jul 15 23:11:49.786964 kernel: Fallback order for Node 0: 0
Jul 15 23:11:49.786971 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Jul 15 23:11:49.786977 kernel: Policy zone: Normal
Jul 15 23:11:49.786983 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Jul 15 23:11:49.786989 kernel: software IO TLB: area num 2.
Jul 15 23:11:49.786996 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Jul 15 23:11:49.787002 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Jul 15 23:11:49.787013 kernel: rcu: Preemptible hierarchical RCU implementation.
Jul 15 23:11:49.787024 kernel: rcu: RCU event tracing is enabled.
Jul 15 23:11:49.787032 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Jul 15 23:11:49.787041 kernel: Trampoline variant of Tasks RCU enabled.
Jul 15 23:11:49.787052 kernel: Tracing variant of Tasks RCU enabled.
Jul 15 23:11:49.787059 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Jul 15 23:11:49.787066 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Jul 15 23:11:49.787072 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 23:11:49.787079 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Jul 15 23:11:49.787085 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Jul 15 23:11:49.787092 kernel: GICv3: 256 SPIs implemented
Jul 15 23:11:49.787098 kernel: GICv3: 0 Extended SPIs implemented
Jul 15 23:11:49.787104 kernel: Root IRQ handler: gic_handle_irq
Jul 15 23:11:49.787110 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Jul 15 23:11:49.787117 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Jul 15 23:11:49.787123 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Jul 15 23:11:49.787131 kernel: ITS [mem 0x08080000-0x0809ffff]
Jul 15 23:11:49.787137 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Jul 15 23:11:49.787144 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Jul 15 23:11:49.787150 kernel: GICv3: using LPI property table @0x0000000100120000
Jul 15 23:11:49.787157 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Jul 15 23:11:49.787163 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Jul 15 23:11:49.787169 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jul 15 23:11:49.787176 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Jul 15 23:11:49.787182 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Jul 15 23:11:49.787189 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Jul 15 23:11:49.787195 kernel: Console: colour dummy device 80x25
Jul 15 23:11:49.787203 kernel: ACPI: Core revision 20240827
Jul 15 23:11:49.787210 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Jul 15 23:11:49.787216 kernel: pid_max: default: 32768 minimum: 301
Jul 15 23:11:49.787223 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Jul 15 23:11:49.787230 kernel: landlock: Up and running.
Jul 15 23:11:49.787236 kernel: SELinux: Initializing.
Jul 15 23:11:49.787242 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jul 15 23:11:49.787249 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Jul 15 23:11:49.787256 kernel: rcu: Hierarchical SRCU implementation.
Jul 15 23:11:49.787264 kernel: rcu: Max phase no-delay instances is 400.
Jul 15 23:11:49.787271 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Jul 15 23:11:49.787277 kernel: Remapping and enabling EFI services.
Jul 15 23:11:49.787284 kernel: smp: Bringing up secondary CPUs ...
Jul 15 23:11:49.787290 kernel: Detected PIPT I-cache on CPU1
Jul 15 23:11:49.787297 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Jul 15 23:11:49.787304 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Jul 15 23:11:49.787310 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Jul 15 23:11:49.787317 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Jul 15 23:11:49.787325 kernel: smp: Brought up 1 node, 2 CPUs
Jul 15 23:11:49.787336 kernel: SMP: Total of 2 processors activated.
Jul 15 23:11:49.787343 kernel: CPU: All CPU(s) started at EL1
Jul 15 23:11:49.787351 kernel: CPU features: detected: 32-bit EL0 Support
Jul 15 23:11:49.787358 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Jul 15 23:11:49.787365 kernel: CPU features: detected: Common not Private translations
Jul 15 23:11:49.787372 kernel: CPU features: detected: CRC32 instructions
Jul 15 23:11:49.787379 kernel: CPU features: detected: Enhanced Virtualization Traps
Jul 15 23:11:49.787388 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Jul 15 23:11:49.787395 kernel: CPU features: detected: LSE atomic instructions
Jul 15 23:11:49.787402 kernel: CPU features: detected: Privileged Access Never
Jul 15 23:11:49.787409 kernel: CPU features: detected: RAS Extension Support
Jul 15 23:11:49.787416 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Jul 15 23:11:49.787425 kernel: alternatives: applying system-wide alternatives
Jul 15 23:11:49.787432 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Jul 15 23:11:49.787440 kernel: Memory: 3859044K/4096000K available (11136K kernel code, 2436K rwdata, 9076K rodata, 39488K init, 1038K bss, 215476K reserved, 16384K cma-reserved)
Jul 15 23:11:49.787450 kernel: devtmpfs: initialized
Jul 15 23:11:49.787458 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Jul 15 23:11:49.787466 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Jul 15 23:11:49.787472 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Jul 15 23:11:49.787479 kernel: 0 pages in range for non-PLT usage
Jul 15 23:11:49.787486 kernel: 508432 pages in range for PLT usage
Jul 15 23:11:49.787493 kernel: pinctrl core: initialized pinctrl subsystem
Jul 15 23:11:49.787499 kernel: SMBIOS 3.0.0 present.
Jul 15 23:11:49.787506 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Jul 15 23:11:49.787513 kernel: DMI: Memory slots populated: 1/1
Jul 15 23:11:49.787521 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Jul 15 23:11:49.787528 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Jul 15 23:11:49.787535 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Jul 15 23:11:49.787542 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Jul 15 23:11:49.787549 kernel: audit: initializing netlink subsys (disabled)
Jul 15 23:11:49.787556 kernel: audit: type=2000 audit(0.015:1): state=initialized audit_enabled=0 res=1
Jul 15 23:11:49.787563 kernel: thermal_sys: Registered thermal governor 'step_wise'
Jul 15 23:11:49.787570 kernel: cpuidle: using governor menu
Jul 15 23:11:49.787577 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Jul 15 23:11:49.787585 kernel: ASID allocator initialised with 32768 entries
Jul 15 23:11:49.787592 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Jul 15 23:11:49.787599 kernel: Serial: AMBA PL011 UART driver
Jul 15 23:11:49.787606 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Jul 15 23:11:49.787613 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Jul 15 23:11:49.787619 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Jul 15 23:11:49.787626 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Jul 15 23:11:49.787633 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Jul 15 23:11:49.787640 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Jul 15 23:11:49.787648 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Jul 15 23:11:49.787655 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Jul 15 23:11:49.787662 kernel: ACPI: Added _OSI(Module Device)
Jul 15 23:11:49.787669 kernel: ACPI: Added _OSI(Processor Device)
Jul 15 23:11:49.787676 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Jul 15 23:11:49.787682 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Jul 15 23:11:49.787690 kernel: ACPI: Interpreter enabled
Jul 15 23:11:49.787696 kernel: ACPI: Using GIC for interrupt routing
Jul 15 23:11:49.787703 kernel: ACPI: MCFG table detected, 1 entries
Jul 15 23:11:49.787711 kernel: ACPI: CPU0 has been hot-added
Jul 15 23:11:49.787718 kernel: ACPI: CPU1 has been hot-added
Jul 15 23:11:49.787725 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Jul 15 23:11:49.787732 kernel: printk: legacy console [ttyAMA0] enabled
Jul 15 23:11:49.787739 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Jul 15 23:11:49.789006 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Jul 15 23:11:49.789093 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Jul 15 23:11:49.789152 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Jul 15 23:11:49.789215 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Jul 15 23:11:49.789272 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Jul 15 23:11:49.789281 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Jul 15 23:11:49.789288 kernel: PCI host bridge to bus 0000:00
Jul 15 23:11:49.789358 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Jul 15 23:11:49.789412 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Jul 15 23:11:49.789466 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Jul 15 23:11:49.789526 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Jul 15 23:11:49.789599 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Jul 15 23:11:49.789670 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint
Jul 15 23:11:49.789730 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff]
Jul 15 23:11:49.789789 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]
Jul 15 23:11:49.789922 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:11:49.789994 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff]
Jul 15 23:11:49.790054 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 23:11:49.790112 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]
Jul 15 23:11:49.790170 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Jul 15 23:11:49.790239 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:11:49.790299 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff]
Jul 15 23:11:49.790357 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 23:11:49.790418 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff]
Jul 15 23:11:49.790484 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:11:49.790545 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff]
Jul 15 23:11:49.790609 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 23:11:49.790669 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff]
Jul 15 23:11:49.790727 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Jul 15 23:11:49.790790 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:11:49.792030 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff]
Jul 15 23:11:49.792112 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 23:11:49.792172 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff]
Jul 15 23:11:49.792230 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Jul 15 23:11:49.792300 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:11:49.792359 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff]
Jul 15 23:11:49.792416 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 23:11:49.792480 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Jul 15 23:11:49.792537 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Jul 15 23:11:49.792604 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:11:49.792662 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff]
Jul 15 23:11:49.792720 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 23:11:49.792777 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff]
Jul 15 23:11:49.793480 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Jul 15 23:11:49.793566 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:11:49.793626 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff]
Jul 15 23:11:49.793685 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 23:11:49.793743 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff]
Jul 15 23:11:49.793801 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref]
Jul 15 23:11:49.794364 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:11:49.794439 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff]
Jul 15 23:11:49.794506 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 23:11:49.794565 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff]
Jul 15 23:11:49.794630 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Jul 15 23:11:49.794691 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff]
Jul 15 23:11:49.794749 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 23:11:49.794807 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff]
Jul 15 23:11:49.794948 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 0x070002 conventional PCI endpoint
Jul 15 23:11:49.795013 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007]
Jul 15 23:11:49.795085 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 15 23:11:49.795147 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff]
Jul 15 23:11:49.795207 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Jul 15 23:11:49.795267 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jul 15 23:11:49.795334 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Jul 15 23:11:49.795399 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit]
Jul 15 23:11:49.795467 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint
Jul 15 23:11:49.795528 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff]
Jul 15 23:11:49.795588 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Jul 15 23:11:49.795655 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Jul 15 23:11:49.795715 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Jul 15 23:11:49.795787 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Jul 15 23:11:49.795918 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Jul 15 23:11:49.795995 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint
Jul 15 23:11:49.796058 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff]
Jul 15 23:11:49.796118 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Jul 15 23:11:49.796187 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Jul 15 23:11:49.796248 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff]
Jul 15 23:11:49.796311 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref]
Jul 15 23:11:49.796371 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Jul 15 23:11:49.796432 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Jul 15 23:11:49.796490 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Jul 15 23:11:49.796548 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Jul 15 23:11:49.796609 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Jul 15 23:11:49.796668 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Jul 15 23:11:49.796728 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Jul 15 23:11:49.796789 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Jul 15 23:11:49.796881 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Jul 15 23:11:49.796945 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Jul 15 23:11:49.797005 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Jul 15 23:11:49.797063 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Jul 15 23:11:49.797127 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Jul 15 23:11:49.797189 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Jul 15 23:11:49.797248 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Jul 15 23:11:49.797306 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000
Jul 15 23:11:49.797366 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Jul 15 23:11:49.797424 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Jul 15 23:11:49.797481 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Jul 15 23:11:49.797545 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Jul 15 23:11:49.797605 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000
Jul 15 23:11:49.797663 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000
Jul 15 23:11:49.797723 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Jul 15 23:11:49.797781 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Jul 15 23:11:49.797895 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Jul 15 23:11:49.797961 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Jul 15 23:11:49.798024 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Jul 15 23:11:49.798082 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Jul 15 23:11:49.798141 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned
Jul 15 23:11:49.798199 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned
Jul 15 23:11:49.798258 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned
Jul 15 23:11:49.798316 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned
Jul 15 23:11:49.798374 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned
Jul 15 23:11:49.798433 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned
Jul 15 23:11:49.798491 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned
Jul 15 23:11:49.798548 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned
Jul 15 23:11:49.798607 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned
Jul 15 23:11:49.798665 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned
Jul 15 23:11:49.798722 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned
Jul 15 23:11:49.798779 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned
Jul 15 23:11:49.799188 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned
Jul 15 23:11:49.799280 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned
Jul 15 23:11:49.799343 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned
Jul 15 23:11:49.799402 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned
Jul 15 23:11:49.799462 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned
Jul 15 23:11:49.799520 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned
Jul 15 23:11:49.799584 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned
Jul 15 23:11:49.799682 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned
Jul 15 23:11:49.799746 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned
Jul 15 23:11:49.799811 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned
Jul 15 23:11:49.799941 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned
Jul 15 23:11:49.800002 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned
Jul 15 23:11:49.800061 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned
Jul 15 23:11:49.800126 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned
Jul 15 23:11:49.800226 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned
Jul 15 23:11:49.800291 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned
Jul 15 23:11:49.800351 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned
Jul 15 23:11:49.800409 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned
Jul 15 23:11:49.800468 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned
Jul 15 23:11:49.800526 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned
Jul 15 23:11:49.800584 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned
Jul 15 23:11:49.800645 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned
Jul 15 23:11:49.800703 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned
Jul 15 23:11:49.800761 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned
Jul 15 23:11:49.800836 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned
Jul 15 23:11:49.802939 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned
Jul 15 23:11:49.803029 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned
Jul 15 23:11:49.803099 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned
Jul 15 23:11:49.803162 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned
Jul 15 23:11:49.803230 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned
Jul 15 23:11:49.803291 kernel: pci 0000:00:02.0: PCI bridge to [bus 01]
Jul 15 23:11:49.803349 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]
Jul 15 23:11:49.803408 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]
Jul 15 23:11:49.803466 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]
Jul 15 23:11:49.803541 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned
Jul 15 23:11:49.803603 kernel: pci 0000:00:02.1: PCI bridge to [bus 02]
Jul 15 23:11:49.803664 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]
Jul 15 23:11:49.803722 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]
Jul 15 23:11:49.803781 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]
Jul 15 23:11:49.803918 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned
Jul 15 23:11:49.803988 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned
Jul 15 23:11:49.804050 kernel: pci 0000:00:02.2: PCI bridge to [bus 03]
Jul 15 23:11:49.804109 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]
Jul 15 23:11:49.804171 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]
Jul 15 23:11:49.804228 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]
Jul 15 23:11:49.804295 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned
Jul 15 23:11:49.804356 kernel: pci 0000:00:02.3: PCI bridge to [bus 04]
Jul 15 23:11:49.804416 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]
Jul 15 23:11:49.804474 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]
Jul 15 23:11:49.804532 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]
Jul 15 23:11:49.804598 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned
Jul 15 23:11:49.804658 kernel: pci 0000:00:02.4: PCI bridge to [bus 05]
Jul 15 23:11:49.804716 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]
Jul 15 23:11:49.804773 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]
Jul 15 23:11:49.806929 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]
Jul 15 23:11:49.807019 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned
Jul 15 23:11:49.807083 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned
Jul 15 23:11:49.807149 kernel: pci 0000:00:02.5: PCI bridge to [bus 06]
Jul 15 23:11:49.807210 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]
Jul 15 23:11:49.807280 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]
Jul 15 23:11:49.807337 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]
Jul 15 23:11:49.807403 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned
Jul 15 23:11:49.807464 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned
Jul 15 23:11:49.807524 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned
Jul 15 23:11:49.807585 kernel: pci 0000:00:02.6: PCI bridge to [bus 07]
Jul 15 23:11:49.807643 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]
Jul 15 23:11:49.807702 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]
Jul 15 23:11:49.807760 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]
Jul 15 23:11:49.807862 kernel: pci 0000:00:02.7: PCI bridge to [bus 08]
Jul 15 23:11:49.807937 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]
Jul 15 23:11:49.807997 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]
Jul 15 23:11:49.808056 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]
Jul 15 23:11:49.808116 kernel: pci 0000:00:03.0: PCI bridge to [bus 09]
Jul 15 23:11:49.808174 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]
Jul 15 23:11:49.808235 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]
Jul 15 23:11:49.808302 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]
Jul 15 23:11:49.808363 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window]
Jul 15 23:11:49.808415 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window]
Jul 15 23:11:49.808467 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window]
Jul 15 23:11:49.808531 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff]
Jul 15 23:11:49.808586 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff]
Jul 15 23:11:49.808641 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref]
Jul 15 23:11:49.808707 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff]
Jul 15 23:11:49.808761 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff]
Jul 15 23:11:49.808827 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref]
Jul 15 23:11:49.809224 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff]
Jul 15 23:11:49.809290 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff]
Jul 15 23:11:49.809345 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref]
Jul 15 23:11:49.809413 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff]
Jul 15 23:11:49.809470 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff]
Jul 15 23:11:49.809524 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref]
Jul 15 23:11:49.809586 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff]
Jul 15 23:11:49.809642 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff]
Jul 15 23:11:49.809696 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref]
Jul 15 23:11:49.809760 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff]
Jul 15 23:11:49.809910 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff]
Jul 15 23:11:49.809983 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref]
Jul 15 23:11:49.810050 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff]
Jul
15 23:11:49.810105 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jul 15 23:11:49.810159 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jul 15 23:11:49.810219 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Jul 15 23:11:49.810282 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jul 15 23:11:49.810336 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jul 15 23:11:49.810396 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Jul 15 23:11:49.810450 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jul 15 23:11:49.810503 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jul 15 23:11:49.810513 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jul 15 23:11:49.810521 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jul 15 23:11:49.810528 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jul 15 23:11:49.810538 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jul 15 23:11:49.810545 kernel: iommu: Default domain type: Translated Jul 15 23:11:49.810552 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jul 15 23:11:49.810560 kernel: efivars: Registered efivars operations Jul 15 23:11:49.810567 kernel: vgaarb: loaded Jul 15 23:11:49.810575 kernel: clocksource: Switched to clocksource arch_sys_counter Jul 15 23:11:49.810582 kernel: VFS: Disk quotas dquot_6.6.0 Jul 15 23:11:49.810589 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jul 15 23:11:49.810597 kernel: pnp: PnP ACPI init Jul 15 23:11:49.810910 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jul 15 23:11:49.810927 kernel: pnp: PnP ACPI: found 1 devices Jul 15 23:11:49.810935 kernel: NET: Registered PF_INET protocol family Jul 15 23:11:49.810942 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jul 15 23:11:49.810950 kernel: 
tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jul 15 23:11:49.810957 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jul 15 23:11:49.810965 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jul 15 23:11:49.810972 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jul 15 23:11:49.810984 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jul 15 23:11:49.810991 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 15 23:11:49.810999 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jul 15 23:11:49.811006 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jul 15 23:11:49.811077 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jul 15 23:11:49.811088 kernel: PCI: CLS 0 bytes, default 64 Jul 15 23:11:49.811095 kernel: kvm [1]: HYP mode not available Jul 15 23:11:49.811102 kernel: Initialise system trusted keyrings Jul 15 23:11:49.811110 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jul 15 23:11:49.811119 kernel: Key type asymmetric registered Jul 15 23:11:49.811126 kernel: Asymmetric key parser 'x509' registered Jul 15 23:11:49.811133 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jul 15 23:11:49.811141 kernel: io scheduler mq-deadline registered Jul 15 23:11:49.811148 kernel: io scheduler kyber registered Jul 15 23:11:49.811156 kernel: io scheduler bfq registered Jul 15 23:11:49.811163 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jul 15 23:11:49.811224 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Jul 15 23:11:49.811284 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Jul 15 23:11:49.811345 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:11:49.811405 kernel: pcieport 0000:00:02.1: PME: Signaling 
with IRQ 51 Jul 15 23:11:49.811464 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Jul 15 23:11:49.811522 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:11:49.811583 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Jul 15 23:11:49.811641 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Jul 15 23:11:49.811699 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:11:49.811758 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Jul 15 23:11:49.811831 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Jul 15 23:11:49.811916 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:11:49.811980 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Jul 15 23:11:49.812038 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Jul 15 23:11:49.812098 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:11:49.812168 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Jul 15 23:11:49.812228 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Jul 15 23:11:49.812292 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:11:49.812354 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Jul 15 23:11:49.812413 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Jul 15 23:11:49.812471 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:11:49.812530 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Jul 15 
23:11:49.812588 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Jul 15 23:11:49.812646 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:11:49.812656 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jul 15 23:11:49.812716 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Jul 15 23:11:49.812774 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Jul 15 23:11:49.812915 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jul 15 23:11:49.812929 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jul 15 23:11:49.812937 kernel: ACPI: button: Power Button [PWRB] Jul 15 23:11:49.812945 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jul 15 23:11:49.813014 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jul 15 23:11:49.813080 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Jul 15 23:11:49.813093 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jul 15 23:11:49.813101 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jul 15 23:11:49.813164 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Jul 15 23:11:49.813175 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Jul 15 23:11:49.813183 kernel: thunder_xcv, ver 1.0 Jul 15 23:11:49.813190 kernel: thunder_bgx, ver 1.0 Jul 15 23:11:49.813197 kernel: nicpf, ver 1.0 Jul 15 23:11:49.813204 kernel: nicvf, ver 1.0 Jul 15 23:11:49.813273 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jul 15 23:11:49.813338 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-07-15T23:11:49 UTC (1752621109) Jul 15 23:11:49.813348 kernel: hid: raw HID events driver (C) Jiri Kosina Jul 15 23:11:49.813356 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jul 15 
23:11:49.813364 kernel: watchdog: NMI not fully supported Jul 15 23:11:49.813371 kernel: watchdog: Hard watchdog permanently disabled Jul 15 23:11:49.813379 kernel: NET: Registered PF_INET6 protocol family Jul 15 23:11:49.813386 kernel: Segment Routing with IPv6 Jul 15 23:11:49.813393 kernel: In-situ OAM (IOAM) with IPv6 Jul 15 23:11:49.813400 kernel: NET: Registered PF_PACKET protocol family Jul 15 23:11:49.813409 kernel: Key type dns_resolver registered Jul 15 23:11:49.813417 kernel: registered taskstats version 1 Jul 15 23:11:49.813424 kernel: Loading compiled-in X.509 certificates Jul 15 23:11:49.813431 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.36-flatcar: 2e049b1166d7080a2074348abe7e86e115624bdd' Jul 15 23:11:49.813438 kernel: Demotion targets for Node 0: null Jul 15 23:11:49.813446 kernel: Key type .fscrypt registered Jul 15 23:11:49.813453 kernel: Key type fscrypt-provisioning registered Jul 15 23:11:49.813460 kernel: ima: No TPM chip found, activating TPM-bypass! Jul 15 23:11:49.813469 kernel: ima: Allocated hash algorithm: sha1 Jul 15 23:11:49.813477 kernel: ima: No architecture policies found Jul 15 23:11:49.813484 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jul 15 23:11:49.813491 kernel: clk: Disabling unused clocks Jul 15 23:11:49.813499 kernel: PM: genpd: Disabling unused power domains Jul 15 23:11:49.813506 kernel: Warning: unable to open an initial console. Jul 15 23:11:49.813514 kernel: Freeing unused kernel memory: 39488K Jul 15 23:11:49.813522 kernel: Run /init as init process Jul 15 23:11:49.813529 kernel: with arguments: Jul 15 23:11:49.813538 kernel: /init Jul 15 23:11:49.813545 kernel: with environment: Jul 15 23:11:49.813552 kernel: HOME=/ Jul 15 23:11:49.813560 kernel: TERM=linux Jul 15 23:11:49.813567 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a Jul 15 23:11:49.813574 systemd[1]: Successfully made /usr/ read-only. 
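The handoff above shows the kernel invoking `/init` with a short argument list and an environment block. As a rough sketch (an assumption about the mechanism, not the kernel's actual parser): leftover command-line words that the kernel itself does not consume are passed to PID 1, words containing `=` as environment entries and bare words as arguments — consistent with `BOOT_IMAGE=/flatcar/vmlinuz-a` landing under "with environment" above.

```python
# Rough sketch (assumption): leftover kernel command-line words go to PID 1;
# words with '=' become environment entries, bare words become arguments.
def split_for_init(cmdline: str):
    args, env = [], {}
    for word in cmdline.split():
        if "=" in word:
            key, value = word.split("=", 1)
            env[key] = value
        else:
            args.append(word)
    return args, env

args, env = split_for_init("quiet BOOT_IMAGE=/flatcar/vmlinuz-a")
# args == ["quiet"]; env["BOOT_IMAGE"] == "/flatcar/vmlinuz-a"
```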
Jul 15 23:11:49.813584 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jul 15 23:11:49.813593 systemd[1]: Detected virtualization kvm. Jul 15 23:11:49.813602 systemd[1]: Detected architecture arm64. Jul 15 23:11:49.813609 systemd[1]: Running in initrd. Jul 15 23:11:49.813616 systemd[1]: No hostname configured, using default hostname. Jul 15 23:11:49.813624 systemd[1]: Hostname set to . Jul 15 23:11:49.813632 systemd[1]: Initializing machine ID from VM UUID. Jul 15 23:11:49.813640 systemd[1]: Queued start job for default target initrd.target. Jul 15 23:11:49.813647 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jul 15 23:11:49.813655 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jul 15 23:11:49.813666 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jul 15 23:11:49.813674 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jul 15 23:11:49.813682 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jul 15 23:11:49.813690 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jul 15 23:11:49.813699 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... Jul 15 23:11:49.813707 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... Jul 15 23:11:49.813715 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). 
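The device unit names above (e.g. `dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device`) come from systemd's path escaping, which maps `/` to `-` and escapes `-` and other special characters as `\xNN` hex codes. A minimal sketch, ignoring edge cases (such as a leading `.`) that the real `systemd-escape` also handles:

```python
def escape_path(path: str) -> str:
    # Minimal sketch of systemd unit-name escaping for device paths:
    # drop the leading '/', map '/' -> '-', escape other specials as \xNN,
    # then append the .device unit suffix.
    out = []
    for ch in path.strip("/"):
        if ch == "/":
            out.append("-")
        elif ch.isalnum() or ch in ":_.":
            out.append(ch)
        else:
            out.append(f"\\x{ord(ch):02x}")
    return "".join(out) + ".device"

escape_path("/dev/disk/by-label/EFI-SYSTEM")
# the '-' in EFI-SYSTEM becomes \x2d, matching the unit name in the log
```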
Jul 15 23:11:49.813724 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jul 15 23:11:49.813732 systemd[1]: Reached target paths.target - Path Units. Jul 15 23:11:49.813740 systemd[1]: Reached target slices.target - Slice Units. Jul 15 23:11:49.813747 systemd[1]: Reached target swap.target - Swaps. Jul 15 23:11:49.813755 systemd[1]: Reached target timers.target - Timer Units. Jul 15 23:11:49.813763 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jul 15 23:11:49.813771 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jul 15 23:11:49.813779 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jul 15 23:11:49.813788 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jul 15 23:11:49.813796 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jul 15 23:11:49.813803 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jul 15 23:11:49.813811 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jul 15 23:11:49.813831 systemd[1]: Reached target sockets.target - Socket Units. Jul 15 23:11:49.813839 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jul 15 23:11:49.814073 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jul 15 23:11:49.814082 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jul 15 23:11:49.814090 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jul 15 23:11:49.814102 systemd[1]: Starting systemd-fsck-usr.service... Jul 15 23:11:49.814110 systemd[1]: Starting systemd-journald.service - Journal Service... Jul 15 23:11:49.814118 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... 
Jul 15 23:11:49.814126 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 23:11:49.814133 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jul 15 23:11:49.814430 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jul 15 23:11:49.814443 systemd[1]: Finished systemd-fsck-usr.service. Jul 15 23:11:49.814451 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jul 15 23:11:49.814489 systemd-journald[244]: Collecting audit messages is disabled. Jul 15 23:11:49.814513 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jul 15 23:11:49.814521 kernel: Bridge firewalling registered Jul 15 23:11:49.814532 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jul 15 23:11:49.814541 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jul 15 23:11:49.814549 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 23:11:49.814557 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jul 15 23:11:49.814565 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jul 15 23:11:49.814576 systemd-journald[244]: Journal started Jul 15 23:11:49.814594 systemd-journald[244]: Runtime Journal (/run/log/journal/b6f55687f07f4ac9878af85274ea6e3d) is 8M, max 76.5M, 68.5M free. Jul 15 23:11:49.771360 systemd-modules-load[245]: Inserted module 'overlay' Jul 15 23:11:49.787398 systemd-modules-load[245]: Inserted module 'br_netfilter' Jul 15 23:11:49.818868 systemd[1]: Started systemd-journald.service - Journal Service. Jul 15 23:11:49.821971 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
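The journald line above sizes the runtime journal kept in `/run`; its three figures are self-consistent (free space is simply the cap minus current usage):

```python
# Figures from the journald message: 8M current, 76.5M max, 68.5M free.
current_mib, max_mib = 8.0, 76.5
free_mib = max_mib - current_mib
print(free_mib)  # 68.5
```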
Jul 15 23:11:49.827308 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jul 15 23:11:49.828116 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jul 15 23:11:49.840919 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jul 15 23:11:49.845096 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jul 15 23:11:49.846157 systemd-tmpfiles[269]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jul 15 23:11:49.848068 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jul 15 23:11:49.855069 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jul 15 23:11:49.863499 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jul 15 23:11:49.873013 dracut-cmdline[282]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=6efbcbd16e8e41b645be9f8e34b328753e37d282675200dab08e504f8e58a578 Jul 15 23:11:49.905940 systemd-resolved[288]: Positive Trust Anchors: Jul 15 23:11:49.906521 systemd-resolved[288]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jul 15 23:11:49.906554 systemd-resolved[288]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jul 15 23:11:49.915993 systemd-resolved[288]: Defaulting to hostname 'linux'. Jul 15 23:11:49.917528 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jul 15 23:11:49.919079 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jul 15 23:11:49.963881 kernel: SCSI subsystem initialized Jul 15 23:11:49.968876 kernel: Loading iSCSI transport class v2.0-870. Jul 15 23:11:49.976876 kernel: iscsi: registered transport (tcp) Jul 15 23:11:49.990875 kernel: iscsi: registered transport (qla4xxx) Jul 15 23:11:49.990980 kernel: QLogic iSCSI HBA Driver Jul 15 23:11:50.013513 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jul 15 23:11:50.042951 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jul 15 23:11:50.048064 systemd[1]: Reached target network-pre.target - Preparation for Network. Jul 15 23:11:50.102959 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jul 15 23:11:50.105025 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... 
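The negative trust anchor list that systemd-resolved prints above is largely mechanical: the sixteen `*.172.in-addr.arpa` entries are just the reverse zones for the RFC 1918 range 172.16.0.0/12, enumerated per /16:

```python
# The 172.16.0.0/12 private range spans second octets 16 through 31,
# so resolved lists one reverse zone per /16 within it.
rfc1918_172 = [f"{i}.172.in-addr.arpa" for i in range(16, 32)]
# sixteen zones, 16.172.in-addr.arpa ... 31.172.in-addr.arpa, as in the log
```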
Jul 15 23:11:50.169897 kernel: raid6: neonx8 gen() 15662 MB/s Jul 15 23:11:50.186886 kernel: raid6: neonx4 gen() 15723 MB/s Jul 15 23:11:50.203892 kernel: raid6: neonx2 gen() 13179 MB/s Jul 15 23:11:50.220955 kernel: raid6: neonx1 gen() 10391 MB/s Jul 15 23:11:50.237906 kernel: raid6: int64x8 gen() 6871 MB/s Jul 15 23:11:50.254891 kernel: raid6: int64x4 gen() 7311 MB/s Jul 15 23:11:50.271911 kernel: raid6: int64x2 gen() 6074 MB/s Jul 15 23:11:50.288907 kernel: raid6: int64x1 gen() 5031 MB/s Jul 15 23:11:50.288996 kernel: raid6: using algorithm neonx4 gen() 15723 MB/s Jul 15 23:11:50.305927 kernel: raid6: .... xor() 12294 MB/s, rmw enabled Jul 15 23:11:50.306015 kernel: raid6: using neon recovery algorithm Jul 15 23:11:50.311065 kernel: xor: measuring software checksum speed Jul 15 23:11:50.311139 kernel: 8regs : 21647 MB/sec Jul 15 23:11:50.311160 kernel: 32regs : 21710 MB/sec Jul 15 23:11:50.311880 kernel: arm64_neon : 28032 MB/sec Jul 15 23:11:50.311915 kernel: xor: using function: arm64_neon (28032 MB/sec) Jul 15 23:11:50.364886 kernel: Btrfs loaded, zoned=no, fsverity=no Jul 15 23:11:50.374100 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jul 15 23:11:50.377489 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jul 15 23:11:50.410734 systemd-udevd[493]: Using default interface naming scheme 'v255'. Jul 15 23:11:50.416110 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jul 15 23:11:50.421136 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jul 15 23:11:50.456723 dracut-pre-trigger[502]: rd.md=0: removing MD RAID activation Jul 15 23:11:50.485260 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jul 15 23:11:50.488166 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jul 15 23:11:50.556885 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. 
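The raid6 benchmark above reduces to an argmax: the kernel times each `gen()` implementation and keeps the fastest, here neonx4 at 15723 MB/s (narrowly beating neonx8):

```python
# gen() throughputs in MB/s, copied from the benchmark lines above
gen_results = {
    "neonx8": 15662, "neonx4": 15723, "neonx2": 13179, "neonx1": 10391,
    "int64x8": 6871, "int64x4": 7311, "int64x2": 6074, "int64x1": 5031,
}
best = max(gen_results, key=gen_results.get)
print(best, gen_results[best])  # neonx4 15723, matching "using algorithm neonx4"
```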
Jul 15 23:11:50.560262 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jul 15 23:11:50.643868 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jul 15 23:11:50.645777 kernel: scsi host0: Virtio SCSI HBA Jul 15 23:11:50.655866 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Jul 15 23:11:50.657953 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jul 15 23:11:50.693241 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jul 15 23:11:50.693351 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 23:11:50.696971 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 23:11:50.700112 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 23:11:50.711005 kernel: ACPI: bus type USB registered Jul 15 23:11:50.711048 kernel: sd 0:0:0:1: Power-on or device reset occurred Jul 15 23:11:50.711196 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jul 15 23:11:50.713538 kernel: sd 0:0:0:1: [sda] Write Protect is off Jul 15 23:11:50.713899 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Jul 15 23:11:50.714022 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jul 15 23:11:50.714664 kernel: usbcore: registered new interface driver usbfs Jul 15 23:11:50.714679 kernel: usbcore: registered new interface driver hub Jul 15 23:11:50.714688 kernel: usbcore: registered new device driver usb Jul 15 23:11:50.724508 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jul 15 23:11:50.724550 kernel: GPT:17805311 != 80003071 Jul 15 23:11:50.724559 kernel: GPT:Alternate GPT header not at the end of the disk. Jul 15 23:11:50.724569 kernel: GPT:17805311 != 80003071 Jul 15 23:11:50.725069 kernel: GPT: Use GNU Parted to correct GPT errors. 
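The GPT complaints above decode to a grown disk: the backup GPT header belongs on the last LBA, but it sits at LBA 17805311 — the last sector of the original, smaller disk image — while the enlarged 80003072-sector disk ends at LBA 80003071. That is expected on first boot of an image copied onto a larger disk, and the first-boot tooling later rewrites the headers. The arithmetic behind "17805311 != 80003071":

```python
disk_sectors = 80003072          # from the sd 0:0:0:1 capacity line
backup_header_lba = 17805311     # where the backup GPT header was found
expected_lba = disk_sectors - 1  # the backup header belongs on the last LBA
print(expected_lba)              # 80003071, hence "GPT:17805311 != 80003071"
original_image_sectors = backup_header_lba + 1
print(original_image_sectors * 512 / 2**30)  # original image size in GiB
```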
Jul 15 23:11:50.725867 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 23:11:50.725897 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Jul 15 23:11:50.730906 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 23:11:50.733941 kernel: sr 0:0:0:0: Power-on or device reset occurred Jul 15 23:11:50.734967 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Jul 15 23:11:50.735105 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jul 15 23:11:50.736872 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jul 15 23:11:50.745530 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 15 23:11:50.745702 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jul 15 23:11:50.747902 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jul 15 23:11:50.751516 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jul 15 23:11:50.751679 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jul 15 23:11:50.751762 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jul 15 23:11:50.753094 kernel: hub 1-0:1.0: USB hub found Jul 15 23:11:50.753247 kernel: hub 1-0:1.0: 4 ports detected Jul 15 23:11:50.754929 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jul 15 23:11:50.756888 kernel: hub 2-0:1.0: USB hub found Jul 15 23:11:50.757020 kernel: hub 2-0:1.0: 4 ports detected Jul 15 23:11:50.804617 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jul 15 23:11:50.825694 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jul 15 23:11:50.837020 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jul 15 23:11:50.843708 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. 
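The capacity pair in the `[sda]` line above (41.0 GB/38.1 GiB) is the same byte count expressed in decimal and binary units:

```python
blocks, block_size = 80003072, 512   # from the [sda] capacity line
size_bytes = blocks * block_size
print(f"{size_bytes / 10**9:.1f} GB / {size_bytes / 2**30:.1f} GiB")
# decimal GB (10^9) vs binary GiB (2^30), as the kernel prints both
```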
Jul 15 23:11:50.844939 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Jul 15 23:11:50.851237 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jul 15 23:11:50.855882 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jul 15 23:11:50.857019 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Jul 15 23:11:50.858518 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jul 15 23:11:50.859268 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jul 15 23:11:50.863346 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jul 15 23:11:50.869058 disk-uuid[599]: Primary Header is updated. Jul 15 23:11:50.869058 disk-uuid[599]: Secondary Entries is updated. Jul 15 23:11:50.869058 disk-uuid[599]: Secondary Header is updated. Jul 15 23:11:50.880888 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 23:11:50.887338 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Jul 15 23:11:50.989935 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jul 15 23:11:51.122870 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jul 15 23:11:51.122931 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jul 15 23:11:51.124065 kernel: usbcore: registered new interface driver usbhid Jul 15 23:11:51.124102 kernel: usbhid: USB HID core driver Jul 15 23:11:51.227996 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jul 15 23:11:51.354891 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jul 15 23:11:51.407922 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jul 15 23:11:51.899917 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jul 15 23:11:51.901273 disk-uuid[600]: The operation has completed successfully. Jul 15 23:11:51.955765 systemd[1]: disk-uuid.service: Deactivated successfully. Jul 15 23:11:51.955932 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jul 15 23:11:51.985442 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Jul 15 23:11:52.000900 sh[625]: Success Jul 15 23:11:52.018960 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jul 15 23:11:52.019019 kernel: device-mapper: uevent: version 1.0.3 Jul 15 23:11:52.020157 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jul 15 23:11:52.030917 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jul 15 23:11:52.089344 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. 
Jul 15 23:11:52.094052 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Jul 15 23:11:52.105957 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Jul 15 23:11:52.119896 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' Jul 15 23:11:52.119960 kernel: BTRFS: device fsid e70e9257-c19d-4e0a-b2ee-631da7d0eb2b devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (637) Jul 15 23:11:52.123132 kernel: BTRFS info (device dm-0): first mount of filesystem e70e9257-c19d-4e0a-b2ee-631da7d0eb2b Jul 15 23:11:52.123263 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jul 15 23:11:52.123310 kernel: BTRFS info (device dm-0): using free-space-tree Jul 15 23:11:52.134152 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Jul 15 23:11:52.134872 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jul 15 23:11:52.136129 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jul 15 23:11:52.136973 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jul 15 23:11:52.140828 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jul 15 23:11:52.174879 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (667) Jul 15 23:11:52.178359 kernel: BTRFS info (device sda6): first mount of filesystem b155db48-94d7-40af-bc6d-97d496102c15 Jul 15 23:11:52.178417 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jul 15 23:11:52.178436 kernel: BTRFS info (device sda6): using free-space-tree Jul 15 23:11:52.187896 kernel: BTRFS info (device sda6): last unmount of filesystem b155db48-94d7-40af-bc6d-97d496102c15 Jul 15 23:11:52.189483 systemd[1]: Finished ignition-setup.service - Ignition (setup). 
Jul 15 23:11:52.192268 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Jul 15 23:11:52.288634 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 23:11:52.293880 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 23:11:52.335403 systemd-networkd[811]: lo: Link UP
Jul 15 23:11:52.335416 systemd-networkd[811]: lo: Gained carrier
Jul 15 23:11:52.337082 systemd-networkd[811]: Enumeration completed
Jul 15 23:11:52.337408 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 15 23:11:52.337503 systemd-networkd[811]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:11:52.337507 systemd-networkd[811]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 23:11:52.338650 systemd[1]: Reached target network.target - Network.
Jul 15 23:11:52.341332 systemd-networkd[811]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:11:52.341335 systemd-networkd[811]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 23:11:52.341683 systemd-networkd[811]: eth0: Link UP
Jul 15 23:11:52.341685 systemd-networkd[811]: eth0: Gained carrier
Jul 15 23:11:52.341693 systemd-networkd[811]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:11:52.347154 systemd-networkd[811]: eth1: Link UP
Jul 15 23:11:52.346026 ignition[719]: Ignition 2.21.0
Jul 15 23:11:52.347157 systemd-networkd[811]: eth1: Gained carrier
Jul 15 23:11:52.346038 ignition[719]: Stage: fetch-offline
Jul 15 23:11:52.347166 systemd-networkd[811]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:11:52.346110 ignition[719]: no configs at "/usr/lib/ignition/base.d"
Jul 15 23:11:52.348498 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 15 23:11:52.346120 ignition[719]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:11:52.350663 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Jul 15 23:11:52.346359 ignition[719]: parsed url from cmdline: ""
Jul 15 23:11:52.346363 ignition[719]: no config URL provided
Jul 15 23:11:52.346369 ignition[719]: reading system config file "/usr/lib/ignition/user.ign"
Jul 15 23:11:52.346376 ignition[719]: no config at "/usr/lib/ignition/user.ign"
Jul 15 23:11:52.346382 ignition[719]: failed to fetch config: resource requires networking
Jul 15 23:11:52.346605 ignition[719]: Ignition finished successfully
Jul 15 23:11:52.367940 systemd-networkd[811]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Jul 15 23:11:52.379946 ignition[816]: Ignition 2.21.0
Jul 15 23:11:52.379961 ignition[816]: Stage: fetch
Jul 15 23:11:52.380139 ignition[816]: no configs at "/usr/lib/ignition/base.d"
Jul 15 23:11:52.380149 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:11:52.380248 ignition[816]: parsed url from cmdline: ""
Jul 15 23:11:52.380252 ignition[816]: no config URL provided
Jul 15 23:11:52.380256 ignition[816]: reading system config file "/usr/lib/ignition/user.ign"
Jul 15 23:11:52.380263 ignition[816]: no config at "/usr/lib/ignition/user.ign"
Jul 15 23:11:52.380364 ignition[816]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1
Jul 15 23:11:52.381383 ignition[816]: GET error: Get "http://169.254.169.254/hetzner/v1/userdata": dial tcp 169.254.169.254:80: connect: network is unreachable
Jul 15 23:11:52.414944 systemd-networkd[811]: eth0: DHCPv4 address 91.99.216.80/32, gateway 172.31.1.1 acquired from 172.31.1.1
Jul 15 23:11:52.582217 ignition[816]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #2
Jul 15 23:11:52.589575 ignition[816]: GET result: OK
Jul 15 23:11:52.589815 ignition[816]: parsing config with SHA512: 2d37ed48942789366a17666dd420a42ff48105a54b28f60b88fa0e8eb8e982ba6fc308ada53099bebef7389804bd60fdb0c5fc2f118d21ba3fadc359490598cd
Jul 15 23:11:52.598978 unknown[816]: fetched base config from "system"
Jul 15 23:11:52.599000 unknown[816]: fetched base config from "system"
Jul 15 23:11:52.599806 ignition[816]: fetch: fetch complete
Jul 15 23:11:52.599013 unknown[816]: fetched user config from "hetzner"
Jul 15 23:11:52.599819 ignition[816]: fetch: fetch passed
Jul 15 23:11:52.599930 ignition[816]: Ignition finished successfully
Jul 15 23:11:52.605317 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Jul 15 23:11:52.608568 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Jul 15 23:11:52.640093 ignition[824]: Ignition 2.21.0
Jul 15 23:11:52.640699 ignition[824]: Stage: kargs
Jul 15 23:11:52.641008 ignition[824]: no configs at "/usr/lib/ignition/base.d"
Jul 15 23:11:52.641047 ignition[824]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:11:52.643472 ignition[824]: kargs: kargs passed
Jul 15 23:11:52.643558 ignition[824]: Ignition finished successfully
Jul 15 23:11:52.644886 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Jul 15 23:11:52.646964 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Jul 15 23:11:52.671863 ignition[830]: Ignition 2.21.0
Jul 15 23:11:52.671872 ignition[830]: Stage: disks
Jul 15 23:11:52.672014 ignition[830]: no configs at "/usr/lib/ignition/base.d"
Jul 15 23:11:52.674067 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Jul 15 23:11:52.672024 ignition[830]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:11:52.675826 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Jul 15 23:11:52.672938 ignition[830]: disks: disks passed
Jul 15 23:11:52.676750 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Jul 15 23:11:52.672987 ignition[830]: Ignition finished successfully
Jul 15 23:11:52.677759 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 23:11:52.678697 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 23:11:52.679394 systemd[1]: Reached target basic.target - Basic System.
Jul 15 23:11:52.681175 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Jul 15 23:11:52.712528 systemd-fsck[838]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks
Jul 15 23:11:52.717488 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Jul 15 23:11:52.723492 systemd[1]: Mounting sysroot.mount - /sysroot...
Jul 15 23:11:52.806882 kernel: EXT4-fs (sda9): mounted filesystem db08fdf6-07fd-45a1-bb3b-a7d0399d70fd r/w with ordered data mode. Quota mode: none.
Jul 15 23:11:52.808408 systemd[1]: Mounted sysroot.mount - /sysroot.
Jul 15 23:11:52.809622 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Jul 15 23:11:52.812163 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 23:11:52.814300 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Jul 15 23:11:52.830158 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent...
Jul 15 23:11:52.832969 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Jul 15 23:11:52.833013 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 23:11:52.838212 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Jul 15 23:11:52.842881 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (846)
Jul 15 23:11:52.844309 kernel: BTRFS info (device sda6): first mount of filesystem b155db48-94d7-40af-bc6d-97d496102c15
Jul 15 23:11:52.844331 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jul 15 23:11:52.844341 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 23:11:52.847007 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Jul 15 23:11:52.854377 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 23:11:52.901594 initrd-setup-root[873]: cut: /sysroot/etc/passwd: No such file or directory
Jul 15 23:11:52.906907 initrd-setup-root[880]: cut: /sysroot/etc/group: No such file or directory
Jul 15 23:11:52.908257 coreos-metadata[848]: Jul 15 23:11:52.908 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1
Jul 15 23:11:52.910341 coreos-metadata[848]: Jul 15 23:11:52.910 INFO Fetch successful
Jul 15 23:11:52.910341 coreos-metadata[848]: Jul 15 23:11:52.910 INFO wrote hostname ci-4372-0-1-n-91aeaf5bee to /sysroot/etc/hostname
Jul 15 23:11:52.914386 initrd-setup-root[888]: cut: /sysroot/etc/shadow: No such file or directory
Jul 15 23:11:52.914657 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 15 23:11:52.920478 initrd-setup-root[895]: cut: /sysroot/etc/gshadow: No such file or directory
Jul 15 23:11:53.024937 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Jul 15 23:11:53.029366 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Jul 15 23:11:53.031973 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Jul 15 23:11:53.046011 kernel: BTRFS info (device sda6): last unmount of filesystem b155db48-94d7-40af-bc6d-97d496102c15
Jul 15 23:11:53.069560 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Jul 15 23:11:53.084656 ignition[963]: INFO : Ignition 2.21.0
Jul 15 23:11:53.084656 ignition[963]: INFO : Stage: mount
Jul 15 23:11:53.087702 ignition[963]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 23:11:53.087702 ignition[963]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:11:53.087702 ignition[963]: INFO : mount: mount passed
Jul 15 23:11:53.087702 ignition[963]: INFO : Ignition finished successfully
Jul 15 23:11:53.089213 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Jul 15 23:11:53.091354 systemd[1]: Starting ignition-files.service - Ignition (files)...
Jul 15 23:11:53.122336 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Jul 15 23:11:53.124638 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Jul 15 23:11:53.154976 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (975)
Jul 15 23:11:53.156912 kernel: BTRFS info (device sda6): first mount of filesystem b155db48-94d7-40af-bc6d-97d496102c15
Jul 15 23:11:53.156964 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm
Jul 15 23:11:53.156982 kernel: BTRFS info (device sda6): using free-space-tree
Jul 15 23:11:53.164400 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Jul 15 23:11:53.197351 ignition[992]: INFO : Ignition 2.21.0
Jul 15 23:11:53.197351 ignition[992]: INFO : Stage: files
Jul 15 23:11:53.199219 ignition[992]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 23:11:53.199219 ignition[992]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:11:53.199219 ignition[992]: DEBUG : files: compiled without relabeling support, skipping
Jul 15 23:11:53.202291 ignition[992]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Jul 15 23:11:53.202291 ignition[992]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Jul 15 23:11:53.202291 ignition[992]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Jul 15 23:11:53.205353 ignition[992]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Jul 15 23:11:53.205353 ignition[992]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Jul 15 23:11:53.203363 unknown[992]: wrote ssh authorized keys file for user: core
Jul 15 23:11:53.207954 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Jul 15 23:11:53.207954 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.13.2-linux-arm64.tar.gz: attempt #1
Jul 15 23:11:53.323165 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Jul 15 23:11:53.550149 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.13.2-linux-arm64.tar.gz"
Jul 15 23:11:53.551668 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Jul 15 23:11:53.551668 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Jul 15 23:11:53.551668 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 23:11:53.551668 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Jul 15 23:11:53.551668 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 23:11:53.551668 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Jul 15 23:11:53.551668 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 23:11:53.551668 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Jul 15 23:11:53.562962 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 23:11:53.562962 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Jul 15 23:11:53.562962 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Jul 15 23:11:53.565969 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Jul 15 23:11:53.565969 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Jul 15 23:11:53.565969 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.31.8-arm64.raw: attempt #1
Jul 15 23:11:53.581081 systemd-networkd[811]: eth1: Gained IPv6LL
Jul 15 23:11:53.965132 systemd-networkd[811]: eth0: Gained IPv6LL
Jul 15 23:11:53.968427 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Jul 15 23:11:55.358751 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.31.8-arm64.raw"
Jul 15 23:11:55.358751 ignition[992]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: op(d): [started] processing unit "coreos-metadata.service"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json"
Jul 15 23:11:55.365033 ignition[992]: INFO : files: files passed
Jul 15 23:11:55.365033 ignition[992]: INFO : Ignition finished successfully
Jul 15 23:11:55.368439 systemd[1]: Finished ignition-files.service - Ignition (files).
Jul 15 23:11:55.371066 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Jul 15 23:11:55.376339 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Jul 15 23:11:55.388735 systemd[1]: ignition-quench.service: Deactivated successfully.
Jul 15 23:11:55.388937 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Jul 15 23:11:55.395735 initrd-setup-root-after-ignition[1022]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 23:11:55.395735 initrd-setup-root-after-ignition[1022]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 23:11:55.398337 initrd-setup-root-after-ignition[1026]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Jul 15 23:11:55.400801 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 23:11:55.403127 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Jul 15 23:11:55.404360 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Jul 15 23:11:55.454633 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Jul 15 23:11:55.455926 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Jul 15 23:11:55.457946 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Jul 15 23:11:55.459838 systemd[1]: Reached target initrd.target - Initrd Default Target.
Jul 15 23:11:55.460573 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Jul 15 23:11:55.461399 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Jul 15 23:11:55.482035 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 23:11:55.485116 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Jul 15 23:11:55.515116 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Jul 15 23:11:55.516477 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 23:11:55.517379 systemd[1]: Stopped target timers.target - Timer Units.
Jul 15 23:11:55.518497 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Jul 15 23:11:55.518639 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Jul 15 23:11:55.520005 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Jul 15 23:11:55.520562 systemd[1]: Stopped target basic.target - Basic System.
Jul 15 23:11:55.521538 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Jul 15 23:11:55.522531 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Jul 15 23:11:55.523472 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Jul 15 23:11:55.524508 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Jul 15 23:11:55.526076 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Jul 15 23:11:55.527086 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Jul 15 23:11:55.528136 systemd[1]: Stopped target sysinit.target - System Initialization.
Jul 15 23:11:55.529250 systemd[1]: Stopped target local-fs.target - Local File Systems.
Jul 15 23:11:55.530158 systemd[1]: Stopped target swap.target - Swaps.
Jul 15 23:11:55.531017 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Jul 15 23:11:55.531139 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Jul 15 23:11:55.532375 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Jul 15 23:11:55.533005 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 23:11:55.533581 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Jul 15 23:11:55.533665 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 23:11:55.534700 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Jul 15 23:11:55.534830 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Jul 15 23:11:55.536284 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Jul 15 23:11:55.536403 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Jul 15 23:11:55.537600 systemd[1]: ignition-files.service: Deactivated successfully.
Jul 15 23:11:55.537697 systemd[1]: Stopped ignition-files.service - Ignition (files).
Jul 15 23:11:55.538820 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
Jul 15 23:11:55.538932 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
Jul 15 23:11:55.540598 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Jul 15 23:11:55.545035 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Jul 15 23:11:55.545509 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Jul 15 23:11:55.545629 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 23:11:55.547059 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Jul 15 23:11:55.547152 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Jul 15 23:11:55.552182 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Jul 15 23:11:55.554145 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Jul 15 23:11:55.567995 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Jul 15 23:11:55.573113 systemd[1]: sysroot-boot.service: Deactivated successfully.
Jul 15 23:11:55.574925 ignition[1046]: INFO : Ignition 2.21.0
Jul 15 23:11:55.574925 ignition[1046]: INFO : Stage: umount
Jul 15 23:11:55.574925 ignition[1046]: INFO : no configs at "/usr/lib/ignition/base.d"
Jul 15 23:11:55.574925 ignition[1046]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner"
Jul 15 23:11:55.574925 ignition[1046]: INFO : umount: umount passed
Jul 15 23:11:55.578326 ignition[1046]: INFO : Ignition finished successfully
Jul 15 23:11:55.575837 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Jul 15 23:11:55.577231 systemd[1]: ignition-mount.service: Deactivated successfully.
Jul 15 23:11:55.577324 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Jul 15 23:11:55.578504 systemd[1]: ignition-disks.service: Deactivated successfully.
Jul 15 23:11:55.578597 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Jul 15 23:11:55.579438 systemd[1]: ignition-kargs.service: Deactivated successfully.
Jul 15 23:11:55.579492 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Jul 15 23:11:55.580433 systemd[1]: ignition-fetch.service: Deactivated successfully.
Jul 15 23:11:55.580474 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Jul 15 23:11:55.581366 systemd[1]: Stopped target network.target - Network.
Jul 15 23:11:55.582308 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Jul 15 23:11:55.582362 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Jul 15 23:11:55.583358 systemd[1]: Stopped target paths.target - Path Units.
Jul 15 23:11:55.584146 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Jul 15 23:11:55.587936 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 23:11:55.589882 systemd[1]: Stopped target slices.target - Slice Units.
Jul 15 23:11:55.591560 systemd[1]: Stopped target sockets.target - Socket Units.
Jul 15 23:11:55.592599 systemd[1]: iscsid.socket: Deactivated successfully.
Jul 15 23:11:55.592652 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Jul 15 23:11:55.593520 systemd[1]: iscsiuio.socket: Deactivated successfully.
Jul 15 23:11:55.593559 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Jul 15 23:11:55.594456 systemd[1]: ignition-setup.service: Deactivated successfully.
Jul 15 23:11:55.594518 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Jul 15 23:11:55.595339 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Jul 15 23:11:55.595378 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Jul 15 23:11:55.596192 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Jul 15 23:11:55.596238 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Jul 15 23:11:55.597247 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Jul 15 23:11:55.598157 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Jul 15 23:11:55.607006 systemd[1]: systemd-resolved.service: Deactivated successfully.
Jul 15 23:11:55.607221 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Jul 15 23:11:55.613509 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
Jul 15 23:11:55.614931 systemd[1]: systemd-networkd.service: Deactivated successfully.
Jul 15 23:11:55.615659 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Jul 15 23:11:55.618163 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
Jul 15 23:11:55.619162 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Jul 15 23:11:55.620408 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Jul 15 23:11:55.620449 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 23:11:55.621895 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Jul 15 23:11:55.624940 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Jul 15 23:11:55.625004 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Jul 15 23:11:55.627058 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Jul 15 23:11:55.627129 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Jul 15 23:11:55.628960 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Jul 15 23:11:55.629003 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Jul 15 23:11:55.629989 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Jul 15 23:11:55.630032 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 23:11:55.632124 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 23:11:55.638481 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Jul 15 23:11:55.638552 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Jul 15 23:11:55.646833 systemd[1]: systemd-udevd.service: Deactivated successfully.
Jul 15 23:11:55.656255 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 23:11:55.658704 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Jul 15 23:11:55.658830 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Jul 15 23:11:55.661119 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Jul 15 23:11:55.661177 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 23:11:55.662606 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Jul 15 23:11:55.662688 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Jul 15 23:11:55.664872 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Jul 15 23:11:55.664988 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Jul 15 23:11:55.666195 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Jul 15 23:11:55.666244 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Jul 15 23:11:55.668173 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Jul 15 23:11:55.668860 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Jul 15 23:11:55.668909 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 23:11:55.672340 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Jul 15 23:11:55.672404 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 23:11:55.675135 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Jul 15 23:11:55.675188 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 15 23:11:55.677610 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Jul 15 23:11:55.677662 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 23:11:55.679032 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Jul 15 23:11:55.679135 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Jul 15 23:11:55.683150 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
Jul 15 23:11:55.683211 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev\x2dearly.service.mount: Deactivated successfully.
Jul 15 23:11:55.683241 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Jul 15 23:11:55.683275 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
Jul 15 23:11:55.683608 systemd[1]: network-cleanup.service: Deactivated successfully.
Jul 15 23:11:55.685924 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Jul 15 23:11:55.690216 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Jul 15 23:11:55.691024 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Jul 15 23:11:55.692300 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Jul 15 23:11:55.695352 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Jul 15 23:11:55.718015 systemd[1]: Switching root.
Jul 15 23:11:55.756169 systemd-journald[244]: Journal stopped
Jul 15 23:11:56.681270 systemd-journald[244]: Received SIGTERM from PID 1 (systemd).
Jul 15 23:11:56.681325 kernel: SELinux: policy capability network_peer_controls=1
Jul 15 23:11:56.681337 kernel: SELinux: policy capability open_perms=1
Jul 15 23:11:56.681346 kernel: SELinux: policy capability extended_socket_class=1
Jul 15 23:11:56.681355 kernel: SELinux: policy capability always_check_network=0
Jul 15 23:11:56.681363 kernel: SELinux: policy capability cgroup_seclabel=1
Jul 15 23:11:56.681372 kernel: SELinux: policy capability nnp_nosuid_transition=1
Jul 15 23:11:56.681381 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Jul 15 23:11:56.681389 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Jul 15 23:11:56.681400 kernel: SELinux: policy capability userspace_initial_context=0
Jul 15 23:11:56.681409 kernel: audit: type=1403 audit(1752621115.896:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
Jul 15 23:11:56.681418 systemd[1]: Successfully loaded SELinux policy in 35.709ms.
Jul 15 23:11:56.681437 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 10.610ms.
Jul 15 23:11:56.681448 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Jul 15 23:11:56.681460 systemd[1]: Detected virtualization kvm.
Jul 15 23:11:56.681470 systemd[1]: Detected architecture arm64.
Jul 15 23:11:56.681480 systemd[1]: Detected first boot.
Jul 15 23:11:56.681491 systemd[1]: Hostname set to .
Jul 15 23:11:56.681500 systemd[1]: Initializing machine ID from VM UUID.
Jul 15 23:11:56.681510 zram_generator::config[1090]: No configuration found.
Jul 15 23:11:56.681521 kernel: NET: Registered PF_VSOCK protocol family
Jul 15 23:11:56.681530 systemd[1]: Populated /etc with preset unit settings.
Jul 15 23:11:56.681541 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
Jul 15 23:11:56.681551 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Jul 15 23:11:56.681560 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Jul 15 23:11:56.681570 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Jul 15 23:11:56.681582 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Jul 15 23:11:56.681594 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Jul 15 23:11:56.681606 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Jul 15 23:11:56.681616 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Jul 15 23:11:56.681626 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Jul 15 23:11:56.681639 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Jul 15 23:11:56.681649 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Jul 15 23:11:56.681659 systemd[1]: Created slice user.slice - User and Session Slice.
Jul 15 23:11:56.681669 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Jul 15 23:11:56.681679 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Jul 15 23:11:56.681700 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Jul 15 23:11:56.681712 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Jul 15 23:11:56.681722 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Jul 15 23:11:56.681732 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Jul 15 23:11:56.681748 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Jul 15 23:11:56.681758 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Jul 15 23:11:56.681768 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Jul 15 23:11:56.681777 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Jul 15 23:11:56.681787 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Jul 15 23:11:56.681797 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Jul 15 23:11:56.681808 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Jul 15 23:11:56.681818 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Jul 15 23:11:56.681830 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Jul 15 23:11:56.681856 systemd[1]: Reached target slices.target - Slice Units.
Jul 15 23:11:56.681868 systemd[1]: Reached target swap.target - Swaps.
Jul 15 23:11:56.681878 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Jul 15 23:11:56.681888 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Jul 15 23:11:56.681898 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Jul 15 23:11:56.681907 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Jul 15 23:11:56.681919 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Jul 15 23:11:56.681930 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Jul 15 23:11:56.681940 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Jul 15 23:11:56.681950 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Jul 15 23:11:56.681959 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Jul 15 23:11:56.681973 systemd[1]: Mounting media.mount - External Media Directory...
Jul 15 23:11:56.681982 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Jul 15 23:11:56.681992 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Jul 15 23:11:56.682002 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Jul 15 23:11:56.682014 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Jul 15 23:11:56.682024 systemd[1]: Reached target machines.target - Containers.
Jul 15 23:11:56.682034 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Jul 15 23:11:56.682044 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 23:11:56.682054 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Jul 15 23:11:56.682064 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Jul 15 23:11:56.682073 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 23:11:56.682083 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 23:11:56.682093 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 23:11:56.682104 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Jul 15 23:11:56.682114 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 23:11:56.682124 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Jul 15 23:11:56.682133 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Jul 15 23:11:56.682143 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Jul 15 23:11:56.682153 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Jul 15 23:11:56.682164 systemd[1]: Stopped systemd-fsck-usr.service.
Jul 15 23:11:56.682174 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 23:11:56.682186 systemd[1]: Starting systemd-journald.service - Journal Service...
Jul 15 23:11:56.682197 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Jul 15 23:11:56.682207 kernel: loop: module loaded
Jul 15 23:11:56.682217 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Jul 15 23:11:56.682230 kernel: fuse: init (API version 7.41)
Jul 15 23:11:56.682240 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Jul 15 23:11:56.682251 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Jul 15 23:11:56.682261 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Jul 15 23:11:56.682272 systemd[1]: verity-setup.service: Deactivated successfully.
Jul 15 23:11:56.682283 systemd[1]: Stopped verity-setup.service.
Jul 15 23:11:56.682294 kernel: ACPI: bus type drm_connector registered
Jul 15 23:11:56.682303 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Jul 15 23:11:56.682313 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Jul 15 23:11:56.682323 systemd[1]: Mounted media.mount - External Media Directory.
Jul 15 23:11:56.682333 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Jul 15 23:11:56.682343 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Jul 15 23:11:56.682353 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Jul 15 23:11:56.682363 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Jul 15 23:11:56.682374 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Jul 15 23:11:56.682404 systemd-journald[1162]: Collecting audit messages is disabled.
Jul 15 23:11:56.682431 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Jul 15 23:11:56.682442 systemd-journald[1162]: Journal started
Jul 15 23:11:56.682463 systemd-journald[1162]: Runtime Journal (/run/log/journal/b6f55687f07f4ac9878af85274ea6e3d) is 8M, max 76.5M, 68.5M free.
Jul 15 23:11:56.426497 systemd[1]: Queued start job for default target multi-user.target.
Jul 15 23:11:56.449958 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6.
Jul 15 23:11:56.450611 systemd[1]: systemd-journald.service: Deactivated successfully.
Jul 15 23:11:56.685075 systemd[1]: Started systemd-journald.service - Journal Service.
Jul 15 23:11:56.687029 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 23:11:56.687216 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 23:11:56.688123 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 23:11:56.689865 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 23:11:56.690788 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 23:11:56.692109 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 23:11:56.693385 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Jul 15 23:11:56.693542 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Jul 15 23:11:56.695233 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 23:11:56.695550 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 23:11:56.696722 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Jul 15 23:11:56.697704 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Jul 15 23:11:56.698553 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Jul 15 23:11:56.699503 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Jul 15 23:11:56.708878 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Jul 15 23:11:56.714022 systemd[1]: Reached target network-pre.target - Preparation for Network.
Jul 15 23:11:56.717962 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Jul 15 23:11:56.722988 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Jul 15 23:11:56.723562 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Jul 15 23:11:56.723599 systemd[1]: Reached target local-fs.target - Local File Systems.
Jul 15 23:11:56.727484 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Jul 15 23:11:56.736028 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Jul 15 23:11:56.737048 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 23:11:56.740237 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Jul 15 23:11:56.742029 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Jul 15 23:11:56.743941 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 23:11:56.745960 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Jul 15 23:11:56.746647 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 23:11:56.750344 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Jul 15 23:11:56.761050 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Jul 15 23:11:56.764366 systemd-journald[1162]: Time spent on flushing to /var/log/journal/b6f55687f07f4ac9878af85274ea6e3d is 61.625ms for 1165 entries.
Jul 15 23:11:56.764366 systemd-journald[1162]: System Journal (/var/log/journal/b6f55687f07f4ac9878af85274ea6e3d) is 8M, max 584.8M, 576.8M free.
Jul 15 23:11:56.833564 systemd-journald[1162]: Received client request to flush runtime journal.
Jul 15 23:11:56.833608 kernel: loop0: detected capacity change from 0 to 138376
Jul 15 23:11:56.765701 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Jul 15 23:11:56.768982 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Jul 15 23:11:56.769722 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Jul 15 23:11:56.788939 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Jul 15 23:11:56.789636 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Jul 15 23:11:56.795201 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Jul 15 23:11:56.806398 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Jul 15 23:11:56.833738 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Jul 15 23:11:56.839988 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Jul 15 23:11:56.851065 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Jul 15 23:11:56.855075 systemd-tmpfiles[1210]: ACLs are not supported, ignoring.
Jul 15 23:11:56.855096 systemd-tmpfiles[1210]: ACLs are not supported, ignoring.
Jul 15 23:11:56.865502 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Jul 15 23:11:56.866707 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Jul 15 23:11:56.870279 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Jul 15 23:11:56.870922 kernel: loop1: detected capacity change from 0 to 8
Jul 15 23:11:56.891101 kernel: loop2: detected capacity change from 0 to 203944
Jul 15 23:11:56.914483 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Jul 15 23:11:56.918008 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Jul 15 23:11:56.939173 kernel: loop3: detected capacity change from 0 to 107312
Jul 15 23:11:56.952668 systemd-tmpfiles[1231]: ACLs are not supported, ignoring.
Jul 15 23:11:56.952695 systemd-tmpfiles[1231]: ACLs are not supported, ignoring.
Jul 15 23:11:56.963082 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Jul 15 23:11:56.987882 kernel: loop4: detected capacity change from 0 to 138376
Jul 15 23:11:57.022953 kernel: loop5: detected capacity change from 0 to 8
Jul 15 23:11:57.029024 kernel: loop6: detected capacity change from 0 to 203944
Jul 15 23:11:57.048961 kernel: loop7: detected capacity change from 0 to 107312
Jul 15 23:11:57.073017 (sd-merge)[1236]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'.
Jul 15 23:11:57.073478 (sd-merge)[1236]: Merged extensions into '/usr'.
Jul 15 23:11:57.081014 systemd[1]: Reload requested from client PID 1209 ('systemd-sysext') (unit systemd-sysext.service)...
Jul 15 23:11:57.081035 systemd[1]: Reloading...
Jul 15 23:11:57.198213 zram_generator::config[1277]: No configuration found.
Jul 15 23:11:57.282277 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 23:11:57.316549 ldconfig[1204]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Jul 15 23:11:57.373105 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Jul 15 23:11:57.373175 systemd[1]: Reloading finished in 291 ms.
Jul 15 23:11:57.398234 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Jul 15 23:11:57.400884 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Jul 15 23:11:57.410004 systemd[1]: Starting ensure-sysext.service...
Jul 15 23:11:57.414083 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Jul 15 23:11:57.434050 systemd[1]: Reload requested from client PID 1299 ('systemctl') (unit ensure-sysext.service)...
Jul 15 23:11:57.434072 systemd[1]: Reloading...
Jul 15 23:11:57.448077 systemd-tmpfiles[1300]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Jul 15 23:11:57.448603 systemd-tmpfiles[1300]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Jul 15 23:11:57.449004 systemd-tmpfiles[1300]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Jul 15 23:11:57.450129 systemd-tmpfiles[1300]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
Jul 15 23:11:57.450877 systemd-tmpfiles[1300]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
Jul 15 23:11:57.451184 systemd-tmpfiles[1300]: ACLs are not supported, ignoring.
Jul 15 23:11:57.451298 systemd-tmpfiles[1300]: ACLs are not supported, ignoring.
Jul 15 23:11:57.459308 systemd-tmpfiles[1300]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 23:11:57.459317 systemd-tmpfiles[1300]: Skipping /boot
Jul 15 23:11:57.473764 systemd-tmpfiles[1300]: Detected autofs mount point /boot during canonicalization of boot.
Jul 15 23:11:57.475000 systemd-tmpfiles[1300]: Skipping /boot
Jul 15 23:11:57.528963 zram_generator::config[1326]: No configuration found.
Jul 15 23:11:57.607564 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
Jul 15 23:11:57.681482 systemd[1]: Reloading finished in 247 ms.
Jul 15 23:11:57.710316 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Jul 15 23:11:57.715584 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Jul 15 23:11:57.724189 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Jul 15 23:11:57.729067 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Jul 15 23:11:57.731593 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Jul 15 23:11:57.741179 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Jul 15 23:11:57.747981 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Jul 15 23:11:57.750142 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Jul 15 23:11:57.758917 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Jul 15 23:11:57.763108 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 23:11:57.766550 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Jul 15 23:11:57.770037 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Jul 15 23:11:57.778152 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Jul 15 23:11:57.778933 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 23:11:57.779061 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 23:11:57.783523 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 23:11:57.783730 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 23:11:57.784618 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 23:11:57.792947 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Jul 15 23:11:57.795645 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Jul 15 23:11:57.797358 systemd[1]: Finished ensure-sysext.service.
Jul 15 23:11:57.798329 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Jul 15 23:11:57.798505 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Jul 15 23:11:57.805003 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Jul 15 23:11:57.805336 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Jul 15 23:11:57.807898 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Jul 15 23:11:57.810987 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Jul 15 23:11:57.811634 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Jul 15 23:11:57.811705 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Jul 15 23:11:57.811750 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Jul 15 23:11:57.816705 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
Jul 15 23:11:57.820560 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Jul 15 23:11:57.826426 systemd[1]: modprobe@loop.service: Deactivated successfully.
Jul 15 23:11:57.828071 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Jul 15 23:11:57.830481 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Jul 15 23:11:57.836397 systemd[1]: modprobe@drm.service: Deactivated successfully.
Jul 15 23:11:57.840340 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Jul 15 23:11:57.852298 systemd-udevd[1370]: Using default interface naming scheme 'v255'.
Jul 15 23:11:57.858982 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Jul 15 23:11:57.874654 augenrules[1406]: No rules
Jul 15 23:11:57.875933 systemd[1]: audit-rules.service: Deactivated successfully.
Jul 15 23:11:57.876140 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Jul 15 23:11:57.880434 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Jul 15 23:11:57.891656 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Jul 15 23:11:57.901476 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Jul 15 23:11:57.902473 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Jul 15 23:11:57.904136 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Jul 15 23:11:58.002333 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Jul 15 23:11:58.147932 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization.
Jul 15 23:11:58.148693 systemd[1]: Reached target time-set.target - System Time Set.
Jul 15 23:11:58.161920 systemd-networkd[1424]: lo: Link UP
Jul 15 23:11:58.161928 systemd-networkd[1424]: lo: Gained carrier
Jul 15 23:11:58.164435 systemd-networkd[1424]: Enumeration completed
Jul 15 23:11:58.165910 systemd[1]: Started systemd-networkd.service - Network Configuration.
Jul 15 23:11:58.166759 systemd-networkd[1424]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:11:58.167978 systemd-networkd[1424]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 23:11:58.168549 systemd-networkd[1424]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:11:58.168734 systemd-networkd[1424]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network.
Jul 15 23:11:58.168839 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Jul 15 23:11:58.170916 systemd-networkd[1424]: eth0: Link UP
Jul 15 23:11:58.171969 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Jul 15 23:11:58.172966 systemd-networkd[1424]: eth0: Gained carrier
Jul 15 23:11:58.173050 systemd-networkd[1424]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:11:58.175152 systemd-networkd[1424]: eth1: Link UP
Jul 15 23:11:58.177047 systemd-networkd[1424]: eth1: Gained carrier
Jul 15 23:11:58.177066 systemd-networkd[1424]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
Jul 15 23:11:58.200479 kernel: mousedev: PS/2 mouse device common for all mice
Jul 15 23:11:58.200242 systemd-networkd[1424]: eth1: DHCPv4 address 10.0.0.3/32, gateway 10.0.0.1 acquired from 10.0.0.1
Jul 15 23:11:58.200802 systemd-timesyncd[1389]: Network configuration changed, trying to establish connection.
Jul 15 23:11:58.204833 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Jul 15 23:11:58.208112 systemd-resolved[1369]: Positive Trust Anchors:
Jul 15 23:11:58.208125 systemd-resolved[1369]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Jul 15 23:11:58.208157 systemd-resolved[1369]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Jul 15 23:11:58.215255 systemd-resolved[1369]: Using system hostname 'ci-4372-0-1-n-91aeaf5bee'.
Jul 15 23:11:58.218535 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Jul 15 23:11:58.219297 systemd[1]: Reached target network.target - Network.
Jul 15 23:11:58.219771 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Jul 15 23:11:58.220395 systemd[1]: Reached target sysinit.target - System Initialization.
Jul 15 23:11:58.221062 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Jul 15 23:11:58.221722 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Jul 15 23:11:58.222527 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Jul 15 23:11:58.223188 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Jul 15 23:11:58.223799 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Jul 15 23:11:58.224466 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Jul 15 23:11:58.224498 systemd[1]: Reached target paths.target - Path Units.
Jul 15 23:11:58.224985 systemd[1]: Reached target timers.target - Timer Units.
Jul 15 23:11:58.227124 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Jul 15 23:11:58.229928 systemd-networkd[1424]: eth0: DHCPv4 address 91.99.216.80/32, gateway 172.31.1.1 acquired from 172.31.1.1
Jul 15 23:11:58.230268 systemd[1]: Starting docker.socket - Docker Socket for the API...
Jul 15 23:11:58.230808 systemd-timesyncd[1389]: Network configuration changed, trying to establish connection.
Jul 15 23:11:58.233275 systemd-timesyncd[1389]: Network configuration changed, trying to establish connection.
Jul 15 23:11:58.234353 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Jul 15 23:11:58.237103 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Jul 15 23:11:58.237734 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Jul 15 23:11:58.245098 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Jul 15 23:11:58.246674 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Jul 15 23:11:58.249148 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Jul 15 23:11:58.250512 systemd[1]: Reached target sockets.target - Socket Units.
Jul 15 23:11:58.251368 systemd[1]: Reached target basic.target - Basic System.
Jul 15 23:11:58.251928 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Jul 15 23:11:58.251956 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Jul 15 23:11:58.254956 systemd[1]: Starting containerd.service - containerd container runtime...
Jul 15 23:11:58.256827 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Jul 15 23:11:58.260655 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Jul 15 23:11:58.265014 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Jul 15 23:11:58.269392 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Jul 15 23:11:58.271192 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Jul 15 23:11:58.271722 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Jul 15 23:11:58.273744 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Jul 15 23:11:58.277612 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Jul 15 23:11:58.283017 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Jul 15 23:11:58.289311 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Jul 15 23:11:58.306749 systemd[1]: Starting systemd-logind.service - User Login Management...
Jul 15 23:11:58.309903 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Jul 15 23:11:58.310355 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Jul 15 23:11:58.312060 systemd[1]: Starting update-engine.service - Update Engine...
Jul 15 23:11:58.315220 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Jul 15 23:11:58.319078 jq[1480]: false
Jul 15 23:11:58.320881 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Jul 15 23:11:58.326464 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM.
Jul 15 23:11:58.339696 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Jul 15 23:11:58.342296 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Jul 15 23:11:58.342490 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Jul 15 23:11:58.344580 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Jul 15 23:11:58.344787 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Jul 15 23:11:58.355497 coreos-metadata[1477]: Jul 15 23:11:58.354 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jul 15 23:11:58.359231 extend-filesystems[1481]: Found /dev/sda6 Jul 15 23:11:58.360539 coreos-metadata[1477]: Jul 15 23:11:58.359 INFO Fetch successful Jul 15 23:11:58.360539 coreos-metadata[1477]: Jul 15 23:11:58.360 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jul 15 23:11:58.360139 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jul 15 23:11:58.361731 jq[1490]: true Jul 15 23:11:58.364785 coreos-metadata[1477]: Jul 15 23:11:58.363 INFO Fetch successful Jul 15 23:11:58.365814 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jul 15 23:11:58.394347 tar[1494]: linux-arm64/helm Jul 15 23:11:58.401305 extend-filesystems[1481]: Found /dev/sda9 Jul 15 23:11:58.411761 extend-filesystems[1481]: Checking size of /dev/sda9 Jul 15 23:11:58.411140 (ntainerd)[1512]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Jul 15 23:11:58.428438 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jul 15 23:11:58.430888 dbus-daemon[1478]: [system] SELinux support is enabled Jul 15 23:11:58.431129 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jul 15 23:11:58.436999 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jul 15 23:11:58.437036 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. 
Jul 15 23:11:58.437706 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jul 15 23:11:58.437724 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Jul 15 23:11:58.442578 systemd[1]: motdgen.service: Deactivated successfully. Jul 15 23:11:58.444018 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jul 15 23:11:58.447036 jq[1510]: true Jul 15 23:11:58.452322 update_engine[1489]: I20250715 23:11:58.451678 1489 main.cc:92] Flatcar Update Engine starting Jul 15 23:11:58.458896 systemd[1]: Started update-engine.service - Update Engine. Jul 15 23:11:58.460776 update_engine[1489]: I20250715 23:11:58.457983 1489 update_check_scheduler.cc:74] Next update check in 2m18s Jul 15 23:11:58.460804 extend-filesystems[1481]: Resized partition /dev/sda9 Jul 15 23:11:58.468894 extend-filesystems[1534]: resize2fs 1.47.2 (1-Jan-2025) Jul 15 23:11:58.480885 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Jul 15 23:11:58.495037 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jul 15 23:11:58.549129 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Jul 15 23:11:58.549217 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jul 15 23:11:58.549237 kernel: [drm] features: -context_init Jul 15 23:11:58.550245 kernel: [drm] number of scanouts: 1 Jul 15 23:11:58.555988 kernel: [drm] number of cap sets: 0 Jul 15 23:11:58.556043 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jul 15 23:11:58.580869 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jul 15 23:11:58.581884 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jul 15 23:11:58.591126 systemd-logind[1488]: New seat seat0. 
Jul 15 23:11:58.596362 systemd[1]: Started systemd-logind.service - User Login Management. Jul 15 23:11:58.613108 bash[1556]: Updated "/home/core/.ssh/authorized_keys" Jul 15 23:11:58.620898 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jul 15 23:11:58.628443 systemd[1]: Starting sshkeys.service... Jul 15 23:11:58.652867 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Jul 15 23:11:58.671866 extend-filesystems[1534]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jul 15 23:11:58.671866 extend-filesystems[1534]: old_desc_blocks = 1, new_desc_blocks = 5 Jul 15 23:11:58.671866 extend-filesystems[1534]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Jul 15 23:11:58.680866 extend-filesystems[1481]: Resized filesystem in /dev/sda9 Jul 15 23:11:58.674143 systemd[1]: extend-filesystems.service: Deactivated successfully. Jul 15 23:11:58.676972 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jul 15 23:11:58.704452 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jul 15 23:11:58.712955 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Jul 15 23:11:58.733953 locksmithd[1533]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jul 15 23:11:58.823951 containerd[1512]: time="2025-07-15T23:11:58Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jul 15 23:11:58.827858 containerd[1512]: time="2025-07-15T23:11:58.824829200Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4 Jul 15 23:11:58.846509 containerd[1512]: time="2025-07-15T23:11:58.846465360Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.44µs" Jul 15 23:11:58.846620 containerd[1512]: time="2025-07-15T23:11:58.846604880Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jul 15 23:11:58.846690 containerd[1512]: time="2025-07-15T23:11:58.846675400Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jul 15 23:11:58.846916 containerd[1512]: time="2025-07-15T23:11:58.846894920Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jul 15 23:11:58.847306 containerd[1512]: time="2025-07-15T23:11:58.847286040Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jul 15 23:11:58.847386 containerd[1512]: time="2025-07-15T23:11:58.847372880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 23:11:58.847504 containerd[1512]: time="2025-07-15T23:11:58.847486160Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jul 15 23:11:58.847557 containerd[1512]: time="2025-07-15T23:11:58.847543320Z" 
level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 23:11:58.847930 containerd[1512]: time="2025-07-15T23:11:58.847904360Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jul 15 23:11:58.848265 containerd[1512]: time="2025-07-15T23:11:58.848250080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 23:11:58.848340 containerd[1512]: time="2025-07-15T23:11:58.848324760Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jul 15 23:11:58.848384 containerd[1512]: time="2025-07-15T23:11:58.848373320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jul 15 23:11:58.848521 containerd[1512]: time="2025-07-15T23:11:58.848503240Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jul 15 23:11:58.849082 containerd[1512]: time="2025-07-15T23:11:58.849057600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 15 23:11:58.849475 coreos-metadata[1579]: Jul 15 23:11:58.849 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jul 15 23:11:58.849746 containerd[1512]: time="2025-07-15T23:11:58.849721280Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jul 15 23:11:58.849800 containerd[1512]: time="2025-07-15T23:11:58.849786560Z" level=info msg="loading plugin" 
id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jul 15 23:11:58.849991 containerd[1512]: time="2025-07-15T23:11:58.849972040Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jul 15 23:11:58.850839 containerd[1512]: time="2025-07-15T23:11:58.850817120Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jul 15 23:11:58.850986 containerd[1512]: time="2025-07-15T23:11:58.850969600Z" level=info msg="metadata content store policy set" policy=shared Jul 15 23:11:58.854352 coreos-metadata[1579]: Jul 15 23:11:58.853 INFO Fetch successful Jul 15 23:11:58.855296 unknown[1579]: wrote ssh authorized keys file for user: core Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859159840Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859227120Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859241640Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859253680Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859265840Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859311880Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859325520Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 
Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859337440Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859350240Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859362440Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859372720Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859385600Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859504240Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jul 15 23:11:58.860366 containerd[1512]: time="2025-07-15T23:11:58.859524120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859538880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859549720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859559920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859570560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jul 15 23:11:58.860690 containerd[1512]: 
time="2025-07-15T23:11:58.859582320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859591560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859603400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859615680Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859625560Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859866760Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859887360Z" level=info msg="Start snapshots syncer" Jul 15 23:11:58.860690 containerd[1512]: time="2025-07-15T23:11:58.859925760Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jul 15 23:11:58.861036 containerd[1512]: time="2025-07-15T23:11:58.860263480Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jul 15 23:11:58.861036 containerd[1512]: time="2025-07-15T23:11:58.860316160Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jul 15 23:11:58.861235 containerd[1512]: time="2025-07-15T23:11:58.860940120Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jul 15 23:11:58.861382 containerd[1512]: time="2025-07-15T23:11:58.861362880Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862489480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862514200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862524960Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862537600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862550080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862562760Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862591440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862610160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862620880Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862701160Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862719760Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862729040Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862738200Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jul 15 23:11:58.863202 containerd[1512]: time="2025-07-15T23:11:58.862792120Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jul 15 23:11:58.863468 containerd[1512]: time="2025-07-15T23:11:58.862805080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jul 15 23:11:58.863468 containerd[1512]: time="2025-07-15T23:11:58.862815840Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jul 15 23:11:58.863468 containerd[1512]: time="2025-07-15T23:11:58.862899720Z" level=info msg="runtime interface created" Jul 15 23:11:58.863468 containerd[1512]: time="2025-07-15T23:11:58.862905960Z" level=info msg="created NRI interface" Jul 15 23:11:58.863468 containerd[1512]: time="2025-07-15T23:11:58.862914360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jul 15 23:11:58.863468 containerd[1512]: time="2025-07-15T23:11:58.862927840Z" level=info msg="Connect containerd service" Jul 15 23:11:58.863468 containerd[1512]: time="2025-07-15T23:11:58.862956880Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jul 15 23:11:58.864339 
containerd[1512]: time="2025-07-15T23:11:58.864312920Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 15 23:11:58.890122 systemd-logind[1488]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jul 15 23:11:58.922363 systemd-logind[1488]: Watching system buttons on /dev/input/event0 (Power Button) Jul 15 23:11:58.927834 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jul 15 23:11:58.936808 update-ssh-keys[1588]: Updated "/home/core/.ssh/authorized_keys" Jul 15 23:11:58.938333 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jul 15 23:11:58.942584 systemd[1]: Finished sshkeys.service. Jul 15 23:11:59.092081 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jul 15 23:11:59.150117 containerd[1512]: time="2025-07-15T23:11:59.150076680Z" level=info msg="Start subscribing containerd event" Jul 15 23:11:59.150944 containerd[1512]: time="2025-07-15T23:11:59.150912240Z" level=info msg="Start recovering state" Jul 15 23:11:59.151039 containerd[1512]: time="2025-07-15T23:11:59.151018960Z" level=info msg="Start event monitor" Jul 15 23:11:59.151039 containerd[1512]: time="2025-07-15T23:11:59.151038840Z" level=info msg="Start cni network conf syncer for default" Jul 15 23:11:59.151109 containerd[1512]: time="2025-07-15T23:11:59.151046960Z" level=info msg="Start streaming server" Jul 15 23:11:59.151109 containerd[1512]: time="2025-07-15T23:11:59.151055640Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jul 15 23:11:59.151109 containerd[1512]: time="2025-07-15T23:11:59.151062440Z" level=info msg="runtime interface starting up..." Jul 15 23:11:59.151109 containerd[1512]: time="2025-07-15T23:11:59.151067880Z" level=info msg="starting plugins..." 
Jul 15 23:11:59.151109 containerd[1512]: time="2025-07-15T23:11:59.151080280Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jul 15 23:11:59.151190 containerd[1512]: time="2025-07-15T23:11:59.150322320Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jul 15 23:11:59.151221 containerd[1512]: time="2025-07-15T23:11:59.151204880Z" level=info msg=serving... address=/run/containerd/containerd.sock Jul 15 23:11:59.151347 systemd[1]: Started containerd.service - containerd container runtime. Jul 15 23:11:59.151961 containerd[1512]: time="2025-07-15T23:11:59.151926600Z" level=info msg="containerd successfully booted in 0.328349s" Jul 15 23:11:59.305739 tar[1494]: linux-arm64/LICENSE Jul 15 23:11:59.306099 tar[1494]: linux-arm64/README.md Jul 15 23:11:59.325547 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jul 15 23:11:59.561156 sshd_keygen[1504]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jul 15 23:11:59.589228 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jul 15 23:11:59.593678 systemd[1]: Starting issuegen.service - Generate /run/issue... Jul 15 23:11:59.633171 systemd[1]: issuegen.service: Deactivated successfully. Jul 15 23:11:59.633436 systemd[1]: Finished issuegen.service - Generate /run/issue. Jul 15 23:11:59.636827 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jul 15 23:11:59.664401 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jul 15 23:11:59.669243 systemd[1]: Started getty@tty1.service - Getty on tty1. Jul 15 23:11:59.672352 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jul 15 23:11:59.673148 systemd[1]: Reached target getty.target - Login Prompts. Jul 15 23:11:59.917066 systemd-networkd[1424]: eth1: Gained IPv6LL Jul 15 23:11:59.917701 systemd-timesyncd[1389]: Network configuration changed, trying to establish connection. 
Jul 15 23:11:59.920957 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jul 15 23:11:59.922238 systemd[1]: Reached target network-online.target - Network is Online. Jul 15 23:11:59.926337 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:11:59.930066 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jul 15 23:11:59.958906 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jul 15 23:12:00.173448 systemd-networkd[1424]: eth0: Gained IPv6LL Jul 15 23:12:00.175391 systemd-timesyncd[1389]: Network configuration changed, trying to establish connection. Jul 15 23:12:00.764284 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:12:00.765678 systemd[1]: Reached target multi-user.target - Multi-User System. Jul 15 23:12:00.771095 systemd[1]: Startup finished in 2.280s (kernel) + 6.303s (initrd) + 4.909s (userspace) = 13.493s. Jul 15 23:12:00.771810 (kubelet)[1649]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:12:01.317764 kubelet[1649]: E0715 23:12:01.317714 1649 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:12:01.321161 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:12:01.321428 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:12:01.322396 systemd[1]: kubelet.service: Consumed 953ms CPU time, 258.1M memory peak. Jul 15 23:12:03.591870 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
Jul 15 23:12:03.594294 systemd[1]: Started sshd@0-91.99.216.80:22-139.178.68.195:40004.service - OpenSSH per-connection server daemon (139.178.68.195:40004). Jul 15 23:12:04.632225 sshd[1661]: Accepted publickey for core from 139.178.68.195 port 40004 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:12:04.635913 sshd-session[1661]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:12:04.650971 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jul 15 23:12:04.653375 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jul 15 23:12:04.657528 systemd-logind[1488]: New session 1 of user core. Jul 15 23:12:04.688177 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jul 15 23:12:04.692510 systemd[1]: Starting user@500.service - User Manager for UID 500... Jul 15 23:12:04.705834 (systemd)[1665]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Jul 15 23:12:04.709765 systemd-logind[1488]: New session c1 of user core. Jul 15 23:12:04.850187 systemd[1665]: Queued start job for default target default.target. Jul 15 23:12:04.858881 systemd[1665]: Created slice app.slice - User Application Slice. Jul 15 23:12:04.858933 systemd[1665]: Reached target paths.target - Paths. Jul 15 23:12:04.859009 systemd[1665]: Reached target timers.target - Timers. Jul 15 23:12:04.861430 systemd[1665]: Starting dbus.socket - D-Bus User Message Bus Socket... Jul 15 23:12:04.898013 systemd[1665]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jul 15 23:12:04.898234 systemd[1665]: Reached target sockets.target - Sockets. Jul 15 23:12:04.898318 systemd[1665]: Reached target basic.target - Basic System. Jul 15 23:12:04.898385 systemd[1665]: Reached target default.target - Main User Target. Jul 15 23:12:04.898437 systemd[1665]: Startup finished in 180ms. 
Jul 15 23:12:04.898464 systemd[1]: Started user@500.service - User Manager for UID 500. Jul 15 23:12:04.906161 systemd[1]: Started session-1.scope - Session 1 of User core. Jul 15 23:12:05.614946 systemd[1]: Started sshd@1-91.99.216.80:22-139.178.68.195:40010.service - OpenSSH per-connection server daemon (139.178.68.195:40010). Jul 15 23:12:06.621234 sshd[1676]: Accepted publickey for core from 139.178.68.195 port 40010 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:12:06.623385 sshd-session[1676]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:12:06.630147 systemd-logind[1488]: New session 2 of user core. Jul 15 23:12:06.638175 systemd[1]: Started session-2.scope - Session 2 of User core. Jul 15 23:12:07.301680 sshd[1678]: Connection closed by 139.178.68.195 port 40010 Jul 15 23:12:07.302565 sshd-session[1676]: pam_unix(sshd:session): session closed for user core Jul 15 23:12:07.307631 systemd[1]: sshd@1-91.99.216.80:22-139.178.68.195:40010.service: Deactivated successfully. Jul 15 23:12:07.310756 systemd[1]: session-2.scope: Deactivated successfully. Jul 15 23:12:07.312712 systemd-logind[1488]: Session 2 logged out. Waiting for processes to exit. Jul 15 23:12:07.314816 systemd-logind[1488]: Removed session 2. Jul 15 23:12:07.478602 systemd[1]: Started sshd@2-91.99.216.80:22-139.178.68.195:40018.service - OpenSSH per-connection server daemon (139.178.68.195:40018). Jul 15 23:12:08.490055 sshd[1684]: Accepted publickey for core from 139.178.68.195 port 40018 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:12:08.492441 sshd-session[1684]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:12:08.498201 systemd-logind[1488]: New session 3 of user core. Jul 15 23:12:08.503056 systemd[1]: Started session-3.scope - Session 3 of User core. 
Jul 15 23:12:09.174925 sshd[1686]: Connection closed by 139.178.68.195 port 40018 Jul 15 23:12:09.175730 sshd-session[1684]: pam_unix(sshd:session): session closed for user core Jul 15 23:12:09.180882 systemd[1]: sshd@2-91.99.216.80:22-139.178.68.195:40018.service: Deactivated successfully. Jul 15 23:12:09.182669 systemd[1]: session-3.scope: Deactivated successfully. Jul 15 23:12:09.185131 systemd-logind[1488]: Session 3 logged out. Waiting for processes to exit. Jul 15 23:12:09.187148 systemd-logind[1488]: Removed session 3. Jul 15 23:12:09.349214 systemd[1]: Started sshd@3-91.99.216.80:22-139.178.68.195:40026.service - OpenSSH per-connection server daemon (139.178.68.195:40026). Jul 15 23:12:10.371798 sshd[1692]: Accepted publickey for core from 139.178.68.195 port 40026 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:12:10.374119 sshd-session[1692]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:12:10.382579 systemd-logind[1488]: New session 4 of user core. Jul 15 23:12:10.391176 systemd[1]: Started session-4.scope - Session 4 of User core. Jul 15 23:12:11.062083 sshd[1694]: Connection closed by 139.178.68.195 port 40026 Jul 15 23:12:11.063035 sshd-session[1692]: pam_unix(sshd:session): session closed for user core Jul 15 23:12:11.068026 systemd[1]: sshd@3-91.99.216.80:22-139.178.68.195:40026.service: Deactivated successfully. Jul 15 23:12:11.069746 systemd[1]: session-4.scope: Deactivated successfully. Jul 15 23:12:11.070610 systemd-logind[1488]: Session 4 logged out. Waiting for processes to exit. Jul 15 23:12:11.072557 systemd-logind[1488]: Removed session 4. Jul 15 23:12:11.237807 systemd[1]: Started sshd@4-91.99.216.80:22-139.178.68.195:47772.service - OpenSSH per-connection server daemon (139.178.68.195:47772). Jul 15 23:12:11.571991 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. 
Jul 15 23:12:11.575090 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:12:11.730260 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:12:11.743412 (kubelet)[1710]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:12:11.795836 kubelet[1710]: E0715 23:12:11.795753 1710 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:12:11.802244 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:12:11.802588 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:12:11.803308 systemd[1]: kubelet.service: Consumed 176ms CPU time, 105.6M memory peak. Jul 15 23:12:12.258155 sshd[1700]: Accepted publickey for core from 139.178.68.195 port 47772 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:12:12.260221 sshd-session[1700]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:12:12.267916 systemd-logind[1488]: New session 5 of user core. Jul 15 23:12:12.271030 systemd[1]: Started session-5.scope - Session 5 of User core. 
Jul 15 23:12:12.796143 sudo[1719]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jul 15 23:12:12.796434 sudo[1719]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 23:12:12.812145 sudo[1719]: pam_unix(sudo:session): session closed for user root Jul 15 23:12:12.974830 sshd[1718]: Connection closed by 139.178.68.195 port 47772 Jul 15 23:12:12.975912 sshd-session[1700]: pam_unix(sshd:session): session closed for user core Jul 15 23:12:12.981108 systemd-logind[1488]: Session 5 logged out. Waiting for processes to exit. Jul 15 23:12:12.982013 systemd[1]: sshd@4-91.99.216.80:22-139.178.68.195:47772.service: Deactivated successfully. Jul 15 23:12:12.984415 systemd[1]: session-5.scope: Deactivated successfully. Jul 15 23:12:12.987536 systemd-logind[1488]: Removed session 5. Jul 15 23:12:13.149520 systemd[1]: Started sshd@5-91.99.216.80:22-139.178.68.195:47782.service - OpenSSH per-connection server daemon (139.178.68.195:47782). Jul 15 23:12:14.148816 sshd[1725]: Accepted publickey for core from 139.178.68.195 port 47782 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:12:14.151120 sshd-session[1725]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:12:14.156585 systemd-logind[1488]: New session 6 of user core. Jul 15 23:12:14.164177 systemd[1]: Started session-6.scope - Session 6 of User core. 
Jul 15 23:12:14.671512 sudo[1729]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jul 15 23:12:14.671789 sudo[1729]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 23:12:14.678880 sudo[1729]: pam_unix(sudo:session): session closed for user root Jul 15 23:12:14.686163 sudo[1728]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jul 15 23:12:14.686444 sudo[1728]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 23:12:14.699326 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jul 15 23:12:14.748030 augenrules[1751]: No rules Jul 15 23:12:14.749606 systemd[1]: audit-rules.service: Deactivated successfully. Jul 15 23:12:14.749824 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jul 15 23:12:14.751388 sudo[1728]: pam_unix(sudo:session): session closed for user root Jul 15 23:12:14.910701 sshd[1727]: Connection closed by 139.178.68.195 port 47782 Jul 15 23:12:14.911774 sshd-session[1725]: pam_unix(sshd:session): session closed for user core Jul 15 23:12:14.917448 systemd-logind[1488]: Session 6 logged out. Waiting for processes to exit. Jul 15 23:12:14.918213 systemd[1]: sshd@5-91.99.216.80:22-139.178.68.195:47782.service: Deactivated successfully. Jul 15 23:12:14.920185 systemd[1]: session-6.scope: Deactivated successfully. Jul 15 23:12:14.923879 systemd-logind[1488]: Removed session 6. Jul 15 23:12:15.085574 systemd[1]: Started sshd@6-91.99.216.80:22-139.178.68.195:47792.service - OpenSSH per-connection server daemon (139.178.68.195:47792). 
Jul 15 23:12:16.089078 sshd[1760]: Accepted publickey for core from 139.178.68.195 port 47792 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:12:16.090972 sshd-session[1760]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:12:16.097196 systemd-logind[1488]: New session 7 of user core. Jul 15 23:12:16.104208 systemd[1]: Started session-7.scope - Session 7 of User core. Jul 15 23:12:16.616622 sudo[1763]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jul 15 23:12:16.616927 sudo[1763]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jul 15 23:12:16.966773 systemd[1]: Starting docker.service - Docker Application Container Engine... Jul 15 23:12:16.980411 (dockerd)[1780]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jul 15 23:12:17.231552 dockerd[1780]: time="2025-07-15T23:12:17.231115440Z" level=info msg="Starting up" Jul 15 23:12:17.234269 dockerd[1780]: time="2025-07-15T23:12:17.234151160Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jul 15 23:12:17.279358 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3582254563-merged.mount: Deactivated successfully. Jul 15 23:12:17.291251 systemd[1]: var-lib-docker-metacopy\x2dcheck2116964001-merged.mount: Deactivated successfully. Jul 15 23:12:17.300811 dockerd[1780]: time="2025-07-15T23:12:17.300750760Z" level=info msg="Loading containers: start." Jul 15 23:12:17.313886 kernel: Initializing XFRM netlink socket Jul 15 23:12:17.518063 systemd-timesyncd[1389]: Network configuration changed, trying to establish connection. Jul 15 23:12:17.532260 systemd-timesyncd[1389]: Contacted time server 213.206.165.21:123 (2.flatcar.pool.ntp.org). Jul 15 23:12:17.532334 systemd-timesyncd[1389]: Initial clock synchronization to Tue 2025-07-15 23:12:17.257897 UTC. 
Jul 15 23:12:17.571486 systemd-networkd[1424]: docker0: Link UP Jul 15 23:12:17.577720 dockerd[1780]: time="2025-07-15T23:12:17.577633880Z" level=info msg="Loading containers: done." Jul 15 23:12:17.598130 dockerd[1780]: time="2025-07-15T23:12:17.598046920Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jul 15 23:12:17.598377 dockerd[1780]: time="2025-07-15T23:12:17.598163880Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 Jul 15 23:12:17.598377 dockerd[1780]: time="2025-07-15T23:12:17.598296800Z" level=info msg="Initializing buildkit" Jul 15 23:12:17.625055 dockerd[1780]: time="2025-07-15T23:12:17.624996560Z" level=info msg="Completed buildkit initialization" Jul 15 23:12:17.634673 dockerd[1780]: time="2025-07-15T23:12:17.634617680Z" level=info msg="Daemon has completed initialization" Jul 15 23:12:17.634811 dockerd[1780]: time="2025-07-15T23:12:17.634685680Z" level=info msg="API listen on /run/docker.sock" Jul 15 23:12:17.635076 systemd[1]: Started docker.service - Docker Application Container Engine. Jul 15 23:12:18.648402 containerd[1512]: time="2025-07-15T23:12:18.648305736Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.11\"" Jul 15 23:12:19.349534 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount401311987.mount: Deactivated successfully. 
Jul 15 23:12:20.811395 containerd[1512]: time="2025-07-15T23:12:20.811317756Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:20.813089 containerd[1512]: time="2025-07-15T23:12:20.813029668Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.31.11: active requests=0, bytes read=25651905" Jul 15 23:12:20.814917 containerd[1512]: time="2025-07-15T23:12:20.814818274Z" level=info msg="ImageCreate event name:\"sha256:00a68b619a4bfa14c989a2181a7aa0726a5cb1272a7f65394e6a594ad6eade27\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:20.819906 containerd[1512]: time="2025-07-15T23:12:20.819794275Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:a3d1c4440817725a1b503a7ccce94f3dce2b208ebf257b405dc2d97817df3dde\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:20.821109 containerd[1512]: time="2025-07-15T23:12:20.820920217Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.31.11\" with image id \"sha256:00a68b619a4bfa14c989a2181a7aa0726a5cb1272a7f65394e6a594ad6eade27\", repo tag \"registry.k8s.io/kube-apiserver:v1.31.11\", repo digest \"registry.k8s.io/kube-apiserver@sha256:a3d1c4440817725a1b503a7ccce94f3dce2b208ebf257b405dc2d97817df3dde\", size \"25648613\" in 2.172573254s" Jul 15 23:12:20.821109 containerd[1512]: time="2025-07-15T23:12:20.820969567Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.31.11\" returns image reference \"sha256:00a68b619a4bfa14c989a2181a7aa0726a5cb1272a7f65394e6a594ad6eade27\"" Jul 15 23:12:20.822859 containerd[1512]: time="2025-07-15T23:12:20.822792644Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.11\"" Jul 15 23:12:22.052785 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Jul 15 23:12:22.055749 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:12:22.200995 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:12:22.213374 (kubelet)[2051]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:12:22.268766 kubelet[2051]: E0715 23:12:22.268713 2051 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:12:22.272144 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:12:22.272434 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:12:22.273949 systemd[1]: kubelet.service: Consumed 155ms CPU time, 105.1M memory peak. 
Jul 15 23:12:22.439787 containerd[1512]: time="2025-07-15T23:12:22.438725050Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 23:12:22.439787 containerd[1512]: time="2025-07-15T23:12:22.439678621Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.31.11: active requests=0, bytes read=22460303"
Jul 15 23:12:22.447457 containerd[1512]: time="2025-07-15T23:12:22.447419020Z" level=info msg="ImageCreate event name:\"sha256:5c5dc52b837451e0fe6108fdfb9cfa431191ce227ce71d103dec8a8c655c4e71\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 23:12:22.450859 containerd[1512]: time="2025-07-15T23:12:22.450764634Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:0f19de157f3d251f5ddeb6e9d026895bc55cb02592874b326fa345c57e5e2848\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 23:12:22.452262 containerd[1512]: time="2025-07-15T23:12:22.452205652Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.31.11\" with image id \"sha256:5c5dc52b837451e0fe6108fdfb9cfa431191ce227ce71d103dec8a8c655c4e71\", repo tag \"registry.k8s.io/kube-controller-manager:v1.31.11\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:0f19de157f3d251f5ddeb6e9d026895bc55cb02592874b326fa345c57e5e2848\", size \"23996073\" in 1.629358621s"
Jul 15 23:12:22.452262 containerd[1512]: time="2025-07-15T23:12:22.452259626Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.31.11\" returns image reference \"sha256:5c5dc52b837451e0fe6108fdfb9cfa431191ce227ce71d103dec8a8c655c4e71\""
Jul 15 23:12:22.453046 containerd[1512]: time="2025-07-15T23:12:22.453014824Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.11\""
Jul 15 23:12:24.236872 containerd[1512]: time="2025-07-15T23:12:24.235363081Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 23:12:24.237250 containerd[1512]: time="2025-07-15T23:12:24.237004521Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.31.11: active requests=0, bytes read=17125109"
Jul 15 23:12:24.237439 containerd[1512]: time="2025-07-15T23:12:24.237409711Z" level=info msg="ImageCreate event name:\"sha256:89be0efdc4ab1793b9b1b05e836e33dc50f5b2911b57609b315b58608b2d3746\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 23:12:24.241269 containerd[1512]: time="2025-07-15T23:12:24.241228184Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:1a9b59b3bfa6c1f1911f6f865a795620c461d079e413061bb71981cadd67f39d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 23:12:24.242433 containerd[1512]: time="2025-07-15T23:12:24.242384837Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.31.11\" with image id \"sha256:89be0efdc4ab1793b9b1b05e836e33dc50f5b2911b57609b315b58608b2d3746\", repo tag \"registry.k8s.io/kube-scheduler:v1.31.11\", repo digest \"registry.k8s.io/kube-scheduler@sha256:1a9b59b3bfa6c1f1911f6f865a795620c461d079e413061bb71981cadd67f39d\", size \"18660897\" in 1.789204746s"
Jul 15 23:12:24.242433 containerd[1512]: time="2025-07-15T23:12:24.242425364Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.31.11\" returns image reference \"sha256:89be0efdc4ab1793b9b1b05e836e33dc50f5b2911b57609b315b58608b2d3746\""
Jul 15 23:12:24.243492 containerd[1512]: time="2025-07-15T23:12:24.243454646Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.11\""
Jul 15 23:12:25.421861 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3087110052.mount: Deactivated successfully.
Jul 15 23:12:25.751935 containerd[1512]: time="2025-07-15T23:12:25.751268854Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.31.11\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:25.752965 containerd[1512]: time="2025-07-15T23:12:25.752911763Z" level=info msg="ImageCreate event name:\"sha256:7d1e7db6660181423f98acbe3a495b3fe5cec9b85cdef245540cc2cb3b180ab0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:25.753079 containerd[1512]: time="2025-07-15T23:12:25.752984412Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.31.11: active requests=0, bytes read=26916019" Jul 15 23:12:25.757075 containerd[1512]: time="2025-07-15T23:12:25.755759358Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:a31da847792c5e7e92e91b78da1ad21d693e4b2b48d0e9f4610c8764dc2a5d79\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:25.757075 containerd[1512]: time="2025-07-15T23:12:25.756826126Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.31.11\" with image id \"sha256:7d1e7db6660181423f98acbe3a495b3fe5cec9b85cdef245540cc2cb3b180ab0\", repo tag \"registry.k8s.io/kube-proxy:v1.31.11\", repo digest \"registry.k8s.io/kube-proxy@sha256:a31da847792c5e7e92e91b78da1ad21d693e4b2b48d0e9f4610c8764dc2a5d79\", size \"26915012\" in 1.513088984s" Jul 15 23:12:25.757075 containerd[1512]: time="2025-07-15T23:12:25.756898341Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.31.11\" returns image reference \"sha256:7d1e7db6660181423f98acbe3a495b3fe5cec9b85cdef245540cc2cb3b180ab0\"" Jul 15 23:12:25.757802 containerd[1512]: time="2025-07-15T23:12:25.757777152Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Jul 15 23:12:26.407547 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3329202787.mount: Deactivated successfully. 
Jul 15 23:12:27.356869 containerd[1512]: time="2025-07-15T23:12:27.356207076Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:27.358890 containerd[1512]: time="2025-07-15T23:12:27.358809423Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=16951714" Jul 15 23:12:27.359757 containerd[1512]: time="2025-07-15T23:12:27.359719627Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:27.364048 containerd[1512]: time="2025-07-15T23:12:27.363178893Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:27.364455 containerd[1512]: time="2025-07-15T23:12:27.364410669Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.606502s" Jul 15 23:12:27.364455 containerd[1512]: time="2025-07-15T23:12:27.364450771Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Jul 15 23:12:27.364904 containerd[1512]: time="2025-07-15T23:12:27.364869413Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Jul 15 23:12:27.923497 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3432528281.mount: Deactivated successfully. 
Jul 15 23:12:27.931044 containerd[1512]: time="2025-07-15T23:12:27.930930142Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jul 15 23:12:27.932210 containerd[1512]: time="2025-07-15T23:12:27.932139393Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723"
Jul 15 23:12:27.933372 containerd[1512]: time="2025-07-15T23:12:27.933252406Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jul 15 23:12:27.935759 containerd[1512]: time="2025-07-15T23:12:27.935697747Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Jul 15 23:12:27.936687 containerd[1512]: time="2025-07-15T23:12:27.936355104Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 571.451725ms"
Jul 15 23:12:27.936687 containerd[1512]: time="2025-07-15T23:12:27.936391366Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\""
Jul 15 23:12:27.936886 containerd[1512]: time="2025-07-15T23:12:27.936819153Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\""
Jul 15 23:12:28.526586 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1541527201.mount: Deactivated successfully.
Jul 15 23:12:29.936507 containerd[1512]: time="2025-07-15T23:12:29.936410113Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.15-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 23:12:29.938908 containerd[1512]: time="2025-07-15T23:12:29.938830825Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.15-0: active requests=0, bytes read=66406533"
Jul 15 23:12:29.939533 containerd[1512]: time="2025-07-15T23:12:29.939465181Z" level=info msg="ImageCreate event name:\"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 23:12:29.945225 containerd[1512]: time="2025-07-15T23:12:29.945152003Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Jul 15 23:12:29.947866 containerd[1512]: time="2025-07-15T23:12:29.947395008Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.15-0\" with image id \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\", repo tag \"registry.k8s.io/etcd:3.5.15-0\", repo digest \"registry.k8s.io/etcd@sha256:a6dc63e6e8cfa0307d7851762fa6b629afb18f28d8aa3fab5a6e91b4af60026a\", size \"66535646\" in 2.01051102s"
Jul 15 23:12:29.947866 containerd[1512]: time="2025-07-15T23:12:29.947449971Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.15-0\" returns image reference \"sha256:27e3830e1402783674d8b594038967deea9d51f0d91b34c93c8f39d2f68af7da\""
Jul 15 23:12:32.522481 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
Jul 15 23:12:32.529183 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
Jul 15 23:12:32.674000 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
Jul 15 23:12:32.683399 (kubelet)[2208]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jul 15 23:12:32.732658 kubelet[2208]: E0715 23:12:32.732591 2208 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jul 15 23:12:32.736456 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jul 15 23:12:32.736582 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jul 15 23:12:32.736867 systemd[1]: kubelet.service: Consumed 152ms CPU time, 105.1M memory peak. Jul 15 23:12:34.895003 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:12:34.895762 systemd[1]: kubelet.service: Consumed 152ms CPU time, 105.1M memory peak. Jul 15 23:12:34.899369 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:12:34.929615 systemd[1]: Reload requested from client PID 2222 ('systemctl') (unit session-7.scope)... Jul 15 23:12:34.929639 systemd[1]: Reloading... Jul 15 23:12:35.046877 zram_generator::config[2269]: No configuration found. Jul 15 23:12:35.128454 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 23:12:35.230566 systemd[1]: Reloading finished in 300 ms. Jul 15 23:12:35.263731 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jul 15 23:12:35.263813 systemd[1]: kubelet.service: Failed with result 'signal'. Jul 15 23:12:35.264121 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Jul 15 23:12:35.264170 systemd[1]: kubelet.service: Consumed 94ms CPU time, 95M memory peak. Jul 15 23:12:35.265974 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:12:35.421211 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:12:35.433371 (kubelet)[2312]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 23:12:35.485215 kubelet[2312]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 23:12:35.485215 kubelet[2312]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jul 15 23:12:35.485215 kubelet[2312]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jul 15 23:12:35.485546 kubelet[2312]: I0715 23:12:35.485211 2312 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Jul 15 23:12:36.768887 kubelet[2312]: I0715 23:12:36.768698 2312 server.go:491] "Kubelet version" kubeletVersion="v1.31.8"
Jul 15 23:12:36.768887 kubelet[2312]: I0715 23:12:36.768732 2312 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jul 15 23:12:36.769494 kubelet[2312]: I0715 23:12:36.768995 2312 server.go:934] "Client rotation is on, will bootstrap in background"
Jul 15 23:12:36.802597 kubelet[2312]: E0715 23:12:36.802540 2312 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://91.99.216.80:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.99.216.80:6443: connect: connection refused" logger="UnhandledError"
Jul 15 23:12:36.804302 kubelet[2312]: I0715 23:12:36.804116 2312 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
Jul 15 23:12:36.817442 kubelet[2312]: I0715 23:12:36.817407 2312 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Jul 15 23:12:36.821927 kubelet[2312]: I0715 23:12:36.821523 2312 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
Jul 15 23:12:36.822830 kubelet[2312]: I0715 23:12:36.822805 2312 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Jul 15 23:12:36.823192 kubelet[2312]: I0715 23:12:36.823111 2312 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Jul 15 23:12:36.823430 kubelet[2312]: I0715 23:12:36.823257 2312 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-0-1-n-91aeaf5bee","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Jul 15 23:12:36.823611 kubelet[2312]: I0715 23:12:36.823598 2312 topology_manager.go:138] "Creating topology manager with none policy"
Jul 15 23:12:36.823688 kubelet[2312]: I0715 23:12:36.823680 2312 container_manager_linux.go:300] "Creating device plugin manager"
Jul 15 23:12:36.824034 kubelet[2312]: I0715 23:12:36.824017 2312 state_mem.go:36] "Initialized new in-memory state store"
Jul 15 23:12:36.827686 kubelet[2312]: I0715 23:12:36.827649 2312 kubelet.go:408] "Attempting to sync node with API server"
Jul 15 23:12:36.827898 kubelet[2312]: I0715 23:12:36.827877 2312 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests"
Jul 15 23:12:36.828039 kubelet[2312]: I0715 23:12:36.828018 2312 kubelet.go:314] "Adding apiserver pod source"
Jul 15 23:12:36.828140 kubelet[2312]: I0715 23:12:36.828123 2312 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Jul 15 23:12:36.834060 kubelet[2312]: W0715 23:12:36.834001 2312 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://91.99.216.80:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-0-1-n-91aeaf5bee&limit=500&resourceVersion=0": dial tcp 91.99.216.80:6443: connect: connection refused
Jul 15 23:12:36.834157 kubelet[2312]: E0715 23:12:36.834069 2312 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://91.99.216.80:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4372-0-1-n-91aeaf5bee&limit=500&resourceVersion=0\": dial tcp 91.99.216.80:6443: connect: connection refused" logger="UnhandledError"
Jul 15 23:12:36.835020 kubelet[2312]: W0715 23:12:36.834747 2312 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://91.99.216.80:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 91.99.216.80:6443: connect: connection refused
Jul 15 23:12:36.835020 kubelet[2312]: E0715 23:12:36.834821 2312 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://91.99.216.80:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.216.80:6443: connect: connection refused" logger="UnhandledError"
Jul 15 23:12:36.835533 kubelet[2312]: I0715 23:12:36.835499 2312 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
Jul 15 23:12:36.836444 kubelet[2312]: I0715 23:12:36.836410 2312 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Jul 15 23:12:36.836617 kubelet[2312]: W0715 23:12:36.836591 2312 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
Jul 15 23:12:36.843608 kubelet[2312]: I0715 23:12:36.843585 2312 server.go:1274] "Started kubelet"
Jul 15 23:12:36.849946 kubelet[2312]: I0715 23:12:36.849900 2312 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Jul 15 23:12:36.851535 kubelet[2312]: I0715 23:12:36.851508 2312 server.go:449] "Adding debug handlers to kubelet server"
Jul 15 23:12:36.852933 kubelet[2312]: I0715 23:12:36.852718 2312 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Jul 15 23:12:36.853326 kubelet[2312]: I0715 23:12:36.853311 2312 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Jul 15 23:12:36.854524 kubelet[2312]: I0715 23:12:36.854477 2312 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Jul 15 23:12:36.856671 kubelet[2312]: E0715 23:12:36.854191 2312 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.216.80:6443/api/v1/namespaces/default/events\": dial tcp 91.99.216.80:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4372-0-1-n-91aeaf5bee.18528fa8077e8e91 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4372-0-1-n-91aeaf5bee,UID:ci-4372-0-1-n-91aeaf5bee,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4372-0-1-n-91aeaf5bee,},FirstTimestamp:2025-07-15 23:12:36.843556497 +0000 UTC m=+1.404738729,LastTimestamp:2025-07-15 23:12:36.843556497 +0000 UTC m=+1.404738729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-0-1-n-91aeaf5bee,}"
Jul 15 23:12:36.860391 kubelet[2312]: I0715 23:12:36.860342 2312 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
Jul 15 23:12:36.863607 kubelet[2312]: I0715 23:12:36.863563 2312 volume_manager.go:289] "Starting Kubelet Volume Manager"
Jul 15 23:12:36.864181 kubelet[2312]: E0715 23:12:36.864131 2312 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372-0-1-n-91aeaf5bee\" not found"
Jul 15 23:12:36.866862 kubelet[2312]: E0715 23:12:36.865897 2312 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.216.80:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-0-1-n-91aeaf5bee?timeout=10s\": dial tcp 91.99.216.80:6443: connect: connection refused" interval="200ms"
Jul 15 23:12:36.866862 kubelet[2312]: I0715 23:12:36.866052 2312 factory.go:221] Registration of the systemd container factory successfully
Jul 15 23:12:36.866862 kubelet[2312]: I0715 23:12:36.866142 2312 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or
directory Jul 15 23:12:36.866862 kubelet[2312]: E0715 23:12:36.866645 2312 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 15 23:12:36.866862 kubelet[2312]: I0715 23:12:36.866753 2312 reconciler.go:26] "Reconciler: start to sync state" Jul 15 23:12:36.866862 kubelet[2312]: I0715 23:12:36.866789 2312 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Jul 15 23:12:36.867629 kubelet[2312]: W0715 23:12:36.867589 2312 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://91.99.216.80:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 91.99.216.80:6443: connect: connection refused Jul 15 23:12:36.867755 kubelet[2312]: E0715 23:12:36.867737 2312 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://91.99.216.80:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.99.216.80:6443: connect: connection refused" logger="UnhandledError" Jul 15 23:12:36.868389 kubelet[2312]: I0715 23:12:36.868371 2312 factory.go:221] Registration of the containerd container factory successfully Jul 15 23:12:36.878315 kubelet[2312]: I0715 23:12:36.878258 2312 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 15 23:12:36.879562 kubelet[2312]: I0715 23:12:36.879531 2312 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jul 15 23:12:36.879715 kubelet[2312]: I0715 23:12:36.879699 2312 status_manager.go:217] "Starting to sync pod status with apiserver" Jul 15 23:12:36.879810 kubelet[2312]: I0715 23:12:36.879800 2312 kubelet.go:2321] "Starting kubelet main sync loop" Jul 15 23:12:36.879937 kubelet[2312]: E0715 23:12:36.879908 2312 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 23:12:36.889543 kubelet[2312]: W0715 23:12:36.889497 2312 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://91.99.216.80:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 91.99.216.80:6443: connect: connection refused Jul 15 23:12:36.889733 kubelet[2312]: E0715 23:12:36.889710 2312 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://91.99.216.80:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.216.80:6443: connect: connection refused" logger="UnhandledError" Jul 15 23:12:36.903994 kubelet[2312]: I0715 23:12:36.903939 2312 cpu_manager.go:214] "Starting CPU manager" policy="none" Jul 15 23:12:36.904405 kubelet[2312]: I0715 23:12:36.904161 2312 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jul 15 23:12:36.904405 kubelet[2312]: I0715 23:12:36.904195 2312 state_mem.go:36] "Initialized new in-memory state store" Jul 15 23:12:36.906368 kubelet[2312]: I0715 23:12:36.906093 2312 policy_none.go:49] "None policy: Start" Jul 15 23:12:36.907138 kubelet[2312]: I0715 23:12:36.907077 2312 memory_manager.go:170] "Starting memorymanager" policy="None" Jul 15 23:12:36.907138 kubelet[2312]: I0715 23:12:36.907115 2312 state_mem.go:35] "Initializing new in-memory state store" Jul 15 23:12:36.914424 systemd[1]: Created slice kubepods.slice - 
libcontainer container kubepods.slice. Jul 15 23:12:36.929385 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jul 15 23:12:36.933243 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Jul 15 23:12:36.945096 kubelet[2312]: I0715 23:12:36.944978 2312 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 15 23:12:36.946355 kubelet[2312]: I0715 23:12:36.946330 2312 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 23:12:36.947003 kubelet[2312]: I0715 23:12:36.946927 2312 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 23:12:36.947699 kubelet[2312]: I0715 23:12:36.947666 2312 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 23:12:36.951197 kubelet[2312]: E0715 23:12:36.951160 2312 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4372-0-1-n-91aeaf5bee\" not found" Jul 15 23:12:36.996151 systemd[1]: Created slice kubepods-burstable-pod0fe268a4ee8fc52243b93ea09c6ed498.slice - libcontainer container kubepods-burstable-pod0fe268a4ee8fc52243b93ea09c6ed498.slice. Jul 15 23:12:37.027241 systemd[1]: Created slice kubepods-burstable-pod835fd3f7d259d9471f171b976abbd90b.slice - libcontainer container kubepods-burstable-pod835fd3f7d259d9471f171b976abbd90b.slice. Jul 15 23:12:37.035170 systemd[1]: Created slice kubepods-burstable-pod3870ed6cf33960bb05d8df9d0d2ac562.slice - libcontainer container kubepods-burstable-pod3870ed6cf33960bb05d8df9d0d2ac562.slice. 
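The pod slices created above (e.g. kubepods-burstable-pod0fe268a4ee8fc52243b93ea09c6ed498.slice) follow the kubelet's systemd cgroup-driver naming: a QoS parent slice (kubepods, kubepods-burstable, or kubepods-besteffort) plus a per-pod child named after the pod UID. A small sketch reproducing that name from a QoS class and UID — the convention is inferred from the entries above, and the dash-to-underscore escaping is how the real kubelet handles dashed UIDs (the UIDs in this log happen to contain none):

```python
def pod_slice_name(qos: str, pod_uid: str) -> str:
    """Build the systemd slice name the kubelet uses for a pod cgroup
    (systemd cgroup driver; dashes in the UID are escaped to underscores)."""
    uid = pod_uid.replace("-", "_")
    parent = "kubepods" if qos == "guaranteed" else f"kubepods-{qos}"
    return f"{parent}-pod{uid}.slice"

# Matches the burstable slice created in the log above:
print(pod_slice_name("burstable", "0fe268a4ee8fc52243b93ea09c6ed498"))
# -> kubepods-burstable-pod0fe268a4ee8fc52243b93ea09c6ed498.slice
```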
Jul 15 23:12:37.050760 kubelet[2312]: I0715 23:12:37.050642 2312 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.051347 kubelet[2312]: E0715 23:12:37.051305 2312 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://91.99.216.80:6443/api/v1/nodes\": dial tcp 91.99.216.80:6443: connect: connection refused" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.067054 kubelet[2312]: E0715 23:12:37.066992 2312 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.216.80:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-0-1-n-91aeaf5bee?timeout=10s\": dial tcp 91.99.216.80:6443: connect: connection refused" interval="400ms" Jul 15 23:12:37.068948 kubelet[2312]: I0715 23:12:37.068506 2312 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0fe268a4ee8fc52243b93ea09c6ed498-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-0-1-n-91aeaf5bee\" (UID: \"0fe268a4ee8fc52243b93ea09c6ed498\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.068948 kubelet[2312]: I0715 23:12:37.068575 2312 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-kubeconfig\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: \"835fd3f7d259d9471f171b976abbd90b\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.068948 kubelet[2312]: I0715 23:12:37.068613 2312 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: 
\"835fd3f7d259d9471f171b976abbd90b\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.068948 kubelet[2312]: I0715 23:12:37.068649 2312 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-k8s-certs\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: \"835fd3f7d259d9471f171b976abbd90b\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.068948 kubelet[2312]: I0715 23:12:37.068684 2312 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: \"835fd3f7d259d9471f171b976abbd90b\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.069371 kubelet[2312]: I0715 23:12:37.068722 2312 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3870ed6cf33960bb05d8df9d0d2ac562-kubeconfig\") pod \"kube-scheduler-ci-4372-0-1-n-91aeaf5bee\" (UID: \"3870ed6cf33960bb05d8df9d0d2ac562\") " pod="kube-system/kube-scheduler-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.069371 kubelet[2312]: I0715 23:12:37.068756 2312 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0fe268a4ee8fc52243b93ea09c6ed498-ca-certs\") pod \"kube-apiserver-ci-4372-0-1-n-91aeaf5bee\" (UID: \"0fe268a4ee8fc52243b93ea09c6ed498\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.069371 kubelet[2312]: I0715 23:12:37.068788 2312 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" 
(UniqueName: \"kubernetes.io/host-path/0fe268a4ee8fc52243b93ea09c6ed498-k8s-certs\") pod \"kube-apiserver-ci-4372-0-1-n-91aeaf5bee\" (UID: \"0fe268a4ee8fc52243b93ea09c6ed498\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.069371 kubelet[2312]: I0715 23:12:37.068822 2312 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-ca-certs\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: \"835fd3f7d259d9471f171b976abbd90b\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.254530 kubelet[2312]: I0715 23:12:37.254463 2312 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.255078 kubelet[2312]: E0715 23:12:37.255028 2312 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://91.99.216.80:6443/api/v1/nodes\": dial tcp 91.99.216.80:6443: connect: connection refused" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.322566 containerd[1512]: time="2025-07-15T23:12:37.322174671Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-0-1-n-91aeaf5bee,Uid:0fe268a4ee8fc52243b93ea09c6ed498,Namespace:kube-system,Attempt:0,}" Jul 15 23:12:37.334507 containerd[1512]: time="2025-07-15T23:12:37.334189893Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-0-1-n-91aeaf5bee,Uid:835fd3f7d259d9471f171b976abbd90b,Namespace:kube-system,Attempt:0,}" Jul 15 23:12:37.348975 containerd[1512]: time="2025-07-15T23:12:37.348940282Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-0-1-n-91aeaf5bee,Uid:3870ed6cf33960bb05d8df9d0d2ac562,Namespace:kube-system,Attempt:0,}" Jul 15 23:12:37.350308 containerd[1512]: time="2025-07-15T23:12:37.350257621Z" level=info msg="connecting to shim 
973d3a398527f1e1f78e460ba674010473209918ea3877e786367c893e5af577" address="unix:///run/containerd/s/6f90b6d4565ed55948956585376d024bac84b08c7b7c939f1a3fe0d3388dee27" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:12:37.373445 containerd[1512]: time="2025-07-15T23:12:37.373385081Z" level=info msg="connecting to shim 4f0e01a9fa54188028a3565c8024d3635cc15aeaf6792066e2d97731de1fc24a" address="unix:///run/containerd/s/98ec46a433129f8ba016994d760164ab7e3d5df326d4a6c8560004079f856c5c" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:12:37.400033 systemd[1]: Started cri-containerd-973d3a398527f1e1f78e460ba674010473209918ea3877e786367c893e5af577.scope - libcontainer container 973d3a398527f1e1f78e460ba674010473209918ea3877e786367c893e5af577. Jul 15 23:12:37.409905 systemd[1]: Started cri-containerd-4f0e01a9fa54188028a3565c8024d3635cc15aeaf6792066e2d97731de1fc24a.scope - libcontainer container 4f0e01a9fa54188028a3565c8024d3635cc15aeaf6792066e2d97731de1fc24a. Jul 15 23:12:37.422815 containerd[1512]: time="2025-07-15T23:12:37.422774920Z" level=info msg="connecting to shim 9f2b0c232ec9a91076fdbc2131dadedc332ff6c0ca74bb53a0cd4ba374312575" address="unix:///run/containerd/s/69683dcbbe27a3963d3389cdbff78471cbeba379606dfa1d1cfe668f9cc78e5a" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:12:37.453245 systemd[1]: Started cri-containerd-9f2b0c232ec9a91076fdbc2131dadedc332ff6c0ca74bb53a0cd4ba374312575.scope - libcontainer container 9f2b0c232ec9a91076fdbc2131dadedc332ff6c0ca74bb53a0cd4ba374312575. 
Jul 15 23:12:37.467706 kubelet[2312]: E0715 23:12:37.467643 2312 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.216.80:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4372-0-1-n-91aeaf5bee?timeout=10s\": dial tcp 91.99.216.80:6443: connect: connection refused" interval="800ms" Jul 15 23:12:37.472730 containerd[1512]: time="2025-07-15T23:12:37.472687458Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4372-0-1-n-91aeaf5bee,Uid:835fd3f7d259d9471f171b976abbd90b,Namespace:kube-system,Attempt:0,} returns sandbox id \"4f0e01a9fa54188028a3565c8024d3635cc15aeaf6792066e2d97731de1fc24a\"" Jul 15 23:12:37.475640 containerd[1512]: time="2025-07-15T23:12:37.475604290Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4372-0-1-n-91aeaf5bee,Uid:0fe268a4ee8fc52243b93ea09c6ed498,Namespace:kube-system,Attempt:0,} returns sandbox id \"973d3a398527f1e1f78e460ba674010473209918ea3877e786367c893e5af577\"" Jul 15 23:12:37.477916 containerd[1512]: time="2025-07-15T23:12:37.477859679Z" level=info msg="CreateContainer within sandbox \"4f0e01a9fa54188028a3565c8024d3635cc15aeaf6792066e2d97731de1fc24a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jul 15 23:12:37.480803 containerd[1512]: time="2025-07-15T23:12:37.480758122Z" level=info msg="CreateContainer within sandbox \"973d3a398527f1e1f78e460ba674010473209918ea3877e786367c893e5af577\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jul 15 23:12:37.489693 containerd[1512]: time="2025-07-15T23:12:37.489655658Z" level=info msg="Container fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:12:37.492139 containerd[1512]: time="2025-07-15T23:12:37.492108431Z" level=info msg="Container fd2d336143723a894aac6c85a231b7b12f340581ea58e5d0cc69f4723516c400: CDI devices from CRI Config.CDIDevices: []" Jul 15 
23:12:37.503966 containerd[1512]: time="2025-07-15T23:12:37.501820833Z" level=info msg="CreateContainer within sandbox \"4f0e01a9fa54188028a3565c8024d3635cc15aeaf6792066e2d97731de1fc24a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5\"" Jul 15 23:12:37.504434 containerd[1512]: time="2025-07-15T23:12:37.504360530Z" level=info msg="StartContainer for \"fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5\"" Jul 15 23:12:37.505317 containerd[1512]: time="2025-07-15T23:12:37.505275603Z" level=info msg="CreateContainer within sandbox \"973d3a398527f1e1f78e460ba674010473209918ea3877e786367c893e5af577\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"fd2d336143723a894aac6c85a231b7b12f340581ea58e5d0cc69f4723516c400\"" Jul 15 23:12:37.505900 containerd[1512]: time="2025-07-15T23:12:37.505859456Z" level=info msg="connecting to shim fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5" address="unix:///run/containerd/s/98ec46a433129f8ba016994d760164ab7e3d5df326d4a6c8560004079f856c5c" protocol=ttrpc version=3 Jul 15 23:12:37.507541 containerd[1512]: time="2025-07-15T23:12:37.507515395Z" level=info msg="StartContainer for \"fd2d336143723a894aac6c85a231b7b12f340581ea58e5d0cc69f4723516c400\"" Jul 15 23:12:37.508739 containerd[1512]: time="2025-07-15T23:12:37.508637466Z" level=info msg="connecting to shim fd2d336143723a894aac6c85a231b7b12f340581ea58e5d0cc69f4723516c400" address="unix:///run/containerd/s/6f90b6d4565ed55948956585376d024bac84b08c7b7c939f1a3fe0d3388dee27" protocol=ttrpc version=3 Jul 15 23:12:37.517287 containerd[1512]: time="2025-07-15T23:12:37.517250057Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4372-0-1-n-91aeaf5bee,Uid:3870ed6cf33960bb05d8df9d0d2ac562,Namespace:kube-system,Attempt:0,} returns sandbox id \"9f2b0c232ec9a91076fdbc2131dadedc332ff6c0ca74bb53a0cd4ba374312575\"" 
Jul 15 23:12:37.519818 containerd[1512]: time="2025-07-15T23:12:37.519763266Z" level=info msg="CreateContainer within sandbox \"9f2b0c232ec9a91076fdbc2131dadedc332ff6c0ca74bb53a0cd4ba374312575\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jul 15 23:12:37.532556 containerd[1512]: time="2025-07-15T23:12:37.532097062Z" level=info msg="Container 55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:12:37.532143 systemd[1]: Started cri-containerd-fd2d336143723a894aac6c85a231b7b12f340581ea58e5d0cc69f4723516c400.scope - libcontainer container fd2d336143723a894aac6c85a231b7b12f340581ea58e5d0cc69f4723516c400. Jul 15 23:12:37.540299 systemd[1]: Started cri-containerd-fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5.scope - libcontainer container fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5. Jul 15 23:12:37.544875 containerd[1512]: time="2025-07-15T23:12:37.544801851Z" level=info msg="CreateContainer within sandbox \"9f2b0c232ec9a91076fdbc2131dadedc332ff6c0ca74bb53a0cd4ba374312575\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd\"" Jul 15 23:12:37.546117 containerd[1512]: time="2025-07-15T23:12:37.546047027Z" level=info msg="StartContainer for \"55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd\"" Jul 15 23:12:37.547482 containerd[1512]: time="2025-07-15T23:12:37.547392968Z" level=info msg="connecting to shim 55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd" address="unix:///run/containerd/s/69683dcbbe27a3963d3389cdbff78471cbeba379606dfa1d1cfe668f9cc78e5a" protocol=ttrpc version=3 Jul 15 23:12:37.573008 systemd[1]: Started cri-containerd-55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd.scope - libcontainer container 55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd. 
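The repeated "Failed to ensure lease exists, will retry" entries above report doubling intervals (interval="200ms", then "400ms", then "800ms"): the lease controller backs off exponentially while the API server at 91.99.216.80:6443 refuses connections. A generic sketch of that doubling schedule — illustrative parameters only, not the kubelet's actual code, and the 7s cap is an assumption:

```python
import itertools

def lease_retry_intervals(base_ms: int = 200, factor: int = 2, cap_ms: int = 7000):
    """Yield doubling retry intervals (200ms -> 400ms -> 800ms -> ...),
    capped, like those reported by the lease controller in the log above."""
    interval = base_ms
    while True:
        yield min(interval, cap_ms)
        interval *= factor

print(list(itertools.islice(lease_retry_intervals(), 3)))  # -> [200, 400, 800]
```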
Jul 15 23:12:37.608216 containerd[1512]: time="2025-07-15T23:12:37.608168541Z" level=info msg="StartContainer for \"fd2d336143723a894aac6c85a231b7b12f340581ea58e5d0cc69f4723516c400\" returns successfully" Jul 15 23:12:37.610604 containerd[1512]: time="2025-07-15T23:12:37.610570452Z" level=info msg="StartContainer for \"fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5\" returns successfully" Jul 15 23:12:37.659305 kubelet[2312]: I0715 23:12:37.659276 2312 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.659938 kubelet[2312]: E0715 23:12:37.659885 2312 kubelet_node_status.go:95] "Unable to register node with API server" err="Post \"https://91.99.216.80:6443/api/v1/nodes\": dial tcp 91.99.216.80:6443: connect: connection refused" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:37.684214 containerd[1512]: time="2025-07-15T23:12:37.684112286Z" level=info msg="StartContainer for \"55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd\" returns successfully" Jul 15 23:12:38.463113 kubelet[2312]: I0715 23:12:38.463082 2312 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:40.213313 kubelet[2312]: E0715 23:12:40.213271 2312 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4372-0-1-n-91aeaf5bee\" not found" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:40.412744 kubelet[2312]: I0715 23:12:40.412703 2312 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:40.412897 kubelet[2312]: E0715 23:12:40.412758 2312 kubelet_node_status.go:535] "Error updating node status, will retry" err="error getting node \"ci-4372-0-1-n-91aeaf5bee\": node \"ci-4372-0-1-n-91aeaf5bee\" not found" Jul 15 23:12:40.836558 kubelet[2312]: I0715 23:12:40.836199 2312 apiserver.go:52] "Watching apiserver" Jul 15 23:12:40.867262 kubelet[2312]: I0715 23:12:40.867231 2312 
desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Jul 15 23:12:42.387572 systemd[1]: Reload requested from client PID 2579 ('systemctl') (unit session-7.scope)... Jul 15 23:12:42.388001 systemd[1]: Reloading... Jul 15 23:12:42.513881 zram_generator::config[2626]: No configuration found. Jul 15 23:12:42.604410 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. Jul 15 23:12:42.735630 systemd[1]: Reloading finished in 347 ms. Jul 15 23:12:42.769705 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:12:42.784312 systemd[1]: kubelet.service: Deactivated successfully. Jul 15 23:12:42.784823 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:12:42.784905 systemd[1]: kubelet.service: Consumed 1.848s CPU time, 125.3M memory peak. Jul 15 23:12:42.791578 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jul 15 23:12:42.946380 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jul 15 23:12:42.959308 (kubelet)[2668]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jul 15 23:12:43.018044 kubelet[2668]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 23:12:43.018044 kubelet[2668]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. 
Jul 15 23:12:43.018044 kubelet[2668]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jul 15 23:12:43.018044 kubelet[2668]: I0715 23:12:43.017995 2668 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jul 15 23:12:43.025965 kubelet[2668]: I0715 23:12:43.025823 2668 server.go:491] "Kubelet version" kubeletVersion="v1.31.8" Jul 15 23:12:43.026094 kubelet[2668]: I0715 23:12:43.025982 2668 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jul 15 23:12:43.027647 kubelet[2668]: I0715 23:12:43.027455 2668 server.go:934] "Client rotation is on, will bootstrap in background" Jul 15 23:12:43.033004 kubelet[2668]: I0715 23:12:43.032967 2668 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Jul 15 23:12:43.037147 kubelet[2668]: I0715 23:12:43.036986 2668 dynamic_cafile_content.go:160] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jul 15 23:12:43.046031 kubelet[2668]: I0715 23:12:43.045244 2668 server.go:1431] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jul 15 23:12:43.047978 kubelet[2668]: I0715 23:12:43.047840 2668 server.go:749] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Jul 15 23:12:43.048080 kubelet[2668]: I0715 23:12:43.047989 2668 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jul 15 23:12:43.048495 kubelet[2668]: I0715 23:12:43.048129 2668 container_manager_linux.go:264] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jul 15 23:12:43.048495 kubelet[2668]: I0715 23:12:43.048196 2668 container_manager_linux.go:269] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4372-0-1-n-91aeaf5bee","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","Topolog
yManagerPolicyOptions":null,"CgroupVersion":2} Jul 15 23:12:43.048495 kubelet[2668]: I0715 23:12:43.048389 2668 topology_manager.go:138] "Creating topology manager with none policy" Jul 15 23:12:43.048495 kubelet[2668]: I0715 23:12:43.048399 2668 container_manager_linux.go:300] "Creating device plugin manager" Jul 15 23:12:43.048709 kubelet[2668]: I0715 23:12:43.048445 2668 state_mem.go:36] "Initialized new in-memory state store" Jul 15 23:12:43.048709 kubelet[2668]: I0715 23:12:43.048590 2668 kubelet.go:408] "Attempting to sync node with API server" Jul 15 23:12:43.048709 kubelet[2668]: I0715 23:12:43.048603 2668 kubelet.go:303] "Adding static pod path" path="/etc/kubernetes/manifests" Jul 15 23:12:43.048709 kubelet[2668]: I0715 23:12:43.048621 2668 kubelet.go:314] "Adding apiserver pod source" Jul 15 23:12:43.049625 kubelet[2668]: I0715 23:12:43.048723 2668 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jul 15 23:12:43.053695 kubelet[2668]: I0715 23:12:43.053251 2668 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" Jul 15 23:12:43.055055 kubelet[2668]: I0715 23:12:43.054446 2668 kubelet.go:837] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jul 15 23:12:43.056861 kubelet[2668]: I0715 23:12:43.056755 2668 server.go:1274] "Started kubelet" Jul 15 23:12:43.060960 kubelet[2668]: I0715 23:12:43.056937 2668 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jul 15 23:12:43.061394 kubelet[2668]: I0715 23:12:43.061349 2668 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jul 15 23:12:43.064948 kubelet[2668]: I0715 23:12:43.064921 2668 server.go:449] "Adding debug handlers to kubelet server" Jul 15 23:12:43.065561 kubelet[2668]: I0715 23:12:43.061910 2668 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jul 15 
23:12:43.074337 kubelet[2668]: I0715 23:12:43.062609 2668 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jul 15 23:12:43.074552 kubelet[2668]: I0715 23:12:43.062495 2668 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jul 15 23:12:43.078389 kubelet[2668]: I0715 23:12:43.077230 2668 volume_manager.go:289] "Starting Kubelet Volume Manager" Jul 15 23:12:43.078389 kubelet[2668]: I0715 23:12:43.077430 2668 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Jul 15 23:12:43.078389 kubelet[2668]: I0715 23:12:43.077550 2668 reconciler.go:26] "Reconciler: start to sync state" Jul 15 23:12:43.079790 kubelet[2668]: I0715 23:12:43.079241 2668 factory.go:221] Registration of the systemd container factory successfully Jul 15 23:12:43.080217 kubelet[2668]: I0715 23:12:43.080041 2668 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jul 15 23:12:43.082076 kubelet[2668]: E0715 23:12:43.081587 2668 kubelet_node_status.go:453] "Error getting the current node from lister" err="node \"ci-4372-0-1-n-91aeaf5bee\" not found" Jul 15 23:12:43.090879 kubelet[2668]: I0715 23:12:43.085957 2668 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jul 15 23:12:43.090879 kubelet[2668]: I0715 23:12:43.087712 2668 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jul 15 23:12:43.090879 kubelet[2668]: I0715 23:12:43.087734 2668 status_manager.go:217] "Starting to sync pod status with apiserver" Jul 15 23:12:43.090879 kubelet[2668]: I0715 23:12:43.087754 2668 kubelet.go:2321] "Starting kubelet main sync loop" Jul 15 23:12:43.090879 kubelet[2668]: E0715 23:12:43.087792 2668 kubelet.go:2345] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jul 15 23:12:43.099637 kubelet[2668]: E0715 23:12:43.099396 2668 kubelet.go:1478] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jul 15 23:12:43.099637 kubelet[2668]: I0715 23:12:43.099542 2668 factory.go:221] Registration of the containerd container factory successfully Jul 15 23:12:43.156945 kubelet[2668]: I0715 23:12:43.156918 2668 cpu_manager.go:214] "Starting CPU manager" policy="none" Jul 15 23:12:43.157118 kubelet[2668]: I0715 23:12:43.157104 2668 cpu_manager.go:215] "Reconciling" reconcilePeriod="10s" Jul 15 23:12:43.157174 kubelet[2668]: I0715 23:12:43.157166 2668 state_mem.go:36] "Initialized new in-memory state store" Jul 15 23:12:43.157366 kubelet[2668]: I0715 23:12:43.157351 2668 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jul 15 23:12:43.157441 kubelet[2668]: I0715 23:12:43.157417 2668 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jul 15 23:12:43.157489 kubelet[2668]: I0715 23:12:43.157480 2668 policy_none.go:49] "None policy: Start" Jul 15 23:12:43.158613 kubelet[2668]: I0715 23:12:43.158593 2668 memory_manager.go:170] "Starting memorymanager" policy="None" Jul 15 23:12:43.158786 kubelet[2668]: I0715 23:12:43.158774 2668 state_mem.go:35] "Initializing new in-memory state store" Jul 15 23:12:43.159093 kubelet[2668]: I0715 23:12:43.159079 2668 state_mem.go:75] "Updated machine memory state" Jul 15 23:12:43.167916 kubelet[2668]: I0715 23:12:43.167889 2668 
manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jul 15 23:12:43.168135 kubelet[2668]: I0715 23:12:43.168112 2668 eviction_manager.go:189] "Eviction manager: starting control loop" Jul 15 23:12:43.168178 kubelet[2668]: I0715 23:12:43.168133 2668 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jul 15 23:12:43.168623 kubelet[2668]: I0715 23:12:43.168599 2668 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jul 15 23:12:43.199554 kubelet[2668]: E0715 23:12:43.199343 2668 kubelet.go:1915] "Failed creating a mirror pod for" err="pods \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" already exists" pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.273928 kubelet[2668]: I0715 23:12:43.273771 2668 kubelet_node_status.go:72] "Attempting to register node" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.277990 update_engine[1489]: I20250715 23:12:43.277935 1489 update_attempter.cc:509] Updating boot flags... 
Jul 15 23:12:43.290416 kubelet[2668]: I0715 23:12:43.290018 2668 kubelet_node_status.go:111] "Node was previously registered" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.292405 kubelet[2668]: I0715 23:12:43.290557 2668 kubelet_node_status.go:75] "Successfully registered node" node="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.379326 kubelet[2668]: I0715 23:12:43.378619 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/0fe268a4ee8fc52243b93ea09c6ed498-k8s-certs\") pod \"kube-apiserver-ci-4372-0-1-n-91aeaf5bee\" (UID: \"0fe268a4ee8fc52243b93ea09c6ed498\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.379326 kubelet[2668]: I0715 23:12:43.378699 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/0fe268a4ee8fc52243b93ea09c6ed498-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4372-0-1-n-91aeaf5bee\" (UID: \"0fe268a4ee8fc52243b93ea09c6ed498\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.379326 kubelet[2668]: I0715 23:12:43.378723 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-kubeconfig\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: \"835fd3f7d259d9471f171b976abbd90b\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.379326 kubelet[2668]: I0715 23:12:43.378780 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: \"835fd3f7d259d9471f171b976abbd90b\") " 
pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.379326 kubelet[2668]: I0715 23:12:43.378801 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/3870ed6cf33960bb05d8df9d0d2ac562-kubeconfig\") pod \"kube-scheduler-ci-4372-0-1-n-91aeaf5bee\" (UID: \"3870ed6cf33960bb05d8df9d0d2ac562\") " pod="kube-system/kube-scheduler-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.379519 kubelet[2668]: I0715 23:12:43.378817 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/0fe268a4ee8fc52243b93ea09c6ed498-ca-certs\") pod \"kube-apiserver-ci-4372-0-1-n-91aeaf5bee\" (UID: \"0fe268a4ee8fc52243b93ea09c6ed498\") " pod="kube-system/kube-apiserver-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.379519 kubelet[2668]: I0715 23:12:43.378833 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-ca-certs\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: \"835fd3f7d259d9471f171b976abbd90b\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.379519 kubelet[2668]: I0715 23:12:43.378911 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-flexvolume-dir\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: \"835fd3f7d259d9471f171b976abbd90b\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:43.379519 kubelet[2668]: I0715 23:12:43.378933 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: 
\"kubernetes.io/host-path/835fd3f7d259d9471f171b976abbd90b-k8s-certs\") pod \"kube-controller-manager-ci-4372-0-1-n-91aeaf5bee\" (UID: \"835fd3f7d259d9471f171b976abbd90b\") " pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" Jul 15 23:12:44.050411 kubelet[2668]: I0715 23:12:44.050337 2668 apiserver.go:52] "Watching apiserver" Jul 15 23:12:44.078322 kubelet[2668]: I0715 23:12:44.077975 2668 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Jul 15 23:12:44.174173 kubelet[2668]: I0715 23:12:44.173575 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4372-0-1-n-91aeaf5bee" podStartSLOduration=1.173547918 podStartE2EDuration="1.173547918s" podCreationTimestamp="2025-07-15 23:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:12:44.172475941 +0000 UTC m=+1.207015851" watchObservedRunningTime="2025-07-15 23:12:44.173547918 +0000 UTC m=+1.208087868" Jul 15 23:12:44.174173 kubelet[2668]: I0715 23:12:44.173811 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4372-0-1-n-91aeaf5bee" podStartSLOduration=1.173779152 podStartE2EDuration="1.173779152s" podCreationTimestamp="2025-07-15 23:12:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:12:44.160832268 +0000 UTC m=+1.195372178" watchObservedRunningTime="2025-07-15 23:12:44.173779152 +0000 UTC m=+1.208319062" Jul 15 23:12:44.210223 kubelet[2668]: I0715 23:12:44.210128 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4372-0-1-n-91aeaf5bee" podStartSLOduration=3.210102966 podStartE2EDuration="3.210102966s" podCreationTimestamp="2025-07-15 23:12:41 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:12:44.190409711 +0000 UTC m=+1.224949661" watchObservedRunningTime="2025-07-15 23:12:44.210102966 +0000 UTC m=+1.244642916" Jul 15 23:12:46.757006 kubelet[2668]: I0715 23:12:46.756960 2668 kuberuntime_manager.go:1635] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jul 15 23:12:46.757644 containerd[1512]: time="2025-07-15T23:12:46.757579138Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jul 15 23:12:46.758431 kubelet[2668]: I0715 23:12:46.757891 2668 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jul 15 23:12:47.341916 systemd[1]: Created slice kubepods-besteffort-pod9394000d_0384_4375_a31c_08c5607c9fa1.slice - libcontainer container kubepods-besteffort-pod9394000d_0384_4375_a31c_08c5607c9fa1.slice. Jul 15 23:12:47.404351 kubelet[2668]: I0715 23:12:47.404195 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9394000d-0384-4375-a31c-08c5607c9fa1-xtables-lock\") pod \"kube-proxy-h56fb\" (UID: \"9394000d-0384-4375-a31c-08c5607c9fa1\") " pod="kube-system/kube-proxy-h56fb" Jul 15 23:12:47.404758 kubelet[2668]: I0715 23:12:47.404581 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9394000d-0384-4375-a31c-08c5607c9fa1-lib-modules\") pod \"kube-proxy-h56fb\" (UID: \"9394000d-0384-4375-a31c-08c5607c9fa1\") " pod="kube-system/kube-proxy-h56fb" Jul 15 23:12:47.404758 kubelet[2668]: I0715 23:12:47.404611 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9j6g8\" (UniqueName: 
\"kubernetes.io/projected/9394000d-0384-4375-a31c-08c5607c9fa1-kube-api-access-9j6g8\") pod \"kube-proxy-h56fb\" (UID: \"9394000d-0384-4375-a31c-08c5607c9fa1\") " pod="kube-system/kube-proxy-h56fb" Jul 15 23:12:47.404758 kubelet[2668]: I0715 23:12:47.404636 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/9394000d-0384-4375-a31c-08c5607c9fa1-kube-proxy\") pod \"kube-proxy-h56fb\" (UID: \"9394000d-0384-4375-a31c-08c5607c9fa1\") " pod="kube-system/kube-proxy-h56fb" Jul 15 23:12:47.520010 kubelet[2668]: E0715 23:12:47.519952 2668 projected.go:288] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Jul 15 23:12:47.520010 kubelet[2668]: E0715 23:12:47.520007 2668 projected.go:194] Error preparing data for projected volume kube-api-access-9j6g8 for pod kube-system/kube-proxy-h56fb: configmap "kube-root-ca.crt" not found Jul 15 23:12:47.520180 kubelet[2668]: E0715 23:12:47.520112 2668 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9394000d-0384-4375-a31c-08c5607c9fa1-kube-api-access-9j6g8 podName:9394000d-0384-4375-a31c-08c5607c9fa1 nodeName:}" failed. No retries permitted until 2025-07-15 23:12:48.020075434 +0000 UTC m=+5.054615384 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-9j6g8" (UniqueName: "kubernetes.io/projected/9394000d-0384-4375-a31c-08c5607c9fa1-kube-api-access-9j6g8") pod "kube-proxy-h56fb" (UID: "9394000d-0384-4375-a31c-08c5607c9fa1") : configmap "kube-root-ca.crt" not found Jul 15 23:12:47.862075 systemd[1]: Created slice kubepods-besteffort-pod08f44a52_eadb_4525_97f9_fbc2316594d1.slice - libcontainer container kubepods-besteffort-pod08f44a52_eadb_4525_97f9_fbc2316594d1.slice. 
Jul 15 23:12:47.911017 kubelet[2668]: I0715 23:12:47.910779 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/08f44a52-eadb-4525-97f9-fbc2316594d1-var-lib-calico\") pod \"tigera-operator-5bf8dfcb4-cvcr4\" (UID: \"08f44a52-eadb-4525-97f9-fbc2316594d1\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-cvcr4" Jul 15 23:12:47.912569 kubelet[2668]: I0715 23:12:47.912193 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zwv2\" (UniqueName: \"kubernetes.io/projected/08f44a52-eadb-4525-97f9-fbc2316594d1-kube-api-access-2zwv2\") pod \"tigera-operator-5bf8dfcb4-cvcr4\" (UID: \"08f44a52-eadb-4525-97f9-fbc2316594d1\") " pod="tigera-operator/tigera-operator-5bf8dfcb4-cvcr4" Jul 15 23:12:48.168283 containerd[1512]: time="2025-07-15T23:12:48.167637829Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-cvcr4,Uid:08f44a52-eadb-4525-97f9-fbc2316594d1,Namespace:tigera-operator,Attempt:0,}" Jul 15 23:12:48.194343 containerd[1512]: time="2025-07-15T23:12:48.194143840Z" level=info msg="connecting to shim d2e457d410b36301902c681c4607f7ce7b71f4b89493d6649bb0d3a54f3811a1" address="unix:///run/containerd/s/3fbc25577f74d3ccbb50896ec87d174916f1543fa1a673a86620f35ea84c88a2" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:12:48.226131 systemd[1]: Started cri-containerd-d2e457d410b36301902c681c4607f7ce7b71f4b89493d6649bb0d3a54f3811a1.scope - libcontainer container d2e457d410b36301902c681c4607f7ce7b71f4b89493d6649bb0d3a54f3811a1. 
Jul 15 23:12:48.254127 containerd[1512]: time="2025-07-15T23:12:48.254072743Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-h56fb,Uid:9394000d-0384-4375-a31c-08c5607c9fa1,Namespace:kube-system,Attempt:0,}" Jul 15 23:12:48.279482 containerd[1512]: time="2025-07-15T23:12:48.279331493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-5bf8dfcb4-cvcr4,Uid:08f44a52-eadb-4525-97f9-fbc2316594d1,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"d2e457d410b36301902c681c4607f7ce7b71f4b89493d6649bb0d3a54f3811a1\"" Jul 15 23:12:48.283458 containerd[1512]: time="2025-07-15T23:12:48.282839543Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Jul 15 23:12:48.286686 containerd[1512]: time="2025-07-15T23:12:48.286643688Z" level=info msg="connecting to shim c49eadf71da2ba52b212f08abaa128a0f9ed2395adb2d87ee767e8b11e1779b4" address="unix:///run/containerd/s/11fd3d0895dddf053020f4056af5b0eabc5816b248d33aee073f3b2ac93a4b46" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:12:48.317184 systemd[1]: Started cri-containerd-c49eadf71da2ba52b212f08abaa128a0f9ed2395adb2d87ee767e8b11e1779b4.scope - libcontainer container c49eadf71da2ba52b212f08abaa128a0f9ed2395adb2d87ee767e8b11e1779b4. 
Jul 15 23:12:48.347265 containerd[1512]: time="2025-07-15T23:12:48.347124566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-h56fb,Uid:9394000d-0384-4375-a31c-08c5607c9fa1,Namespace:kube-system,Attempt:0,} returns sandbox id \"c49eadf71da2ba52b212f08abaa128a0f9ed2395adb2d87ee767e8b11e1779b4\"" Jul 15 23:12:48.353941 containerd[1512]: time="2025-07-15T23:12:48.353840373Z" level=info msg="CreateContainer within sandbox \"c49eadf71da2ba52b212f08abaa128a0f9ed2395adb2d87ee767e8b11e1779b4\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jul 15 23:12:48.364974 containerd[1512]: time="2025-07-15T23:12:48.364811803Z" level=info msg="Container 54d91719ae2c82168d85aa12d407adf6bd43f9bdee3555851b624703974d9332: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:12:48.377622 containerd[1512]: time="2025-07-15T23:12:48.377529863Z" level=info msg="CreateContainer within sandbox \"c49eadf71da2ba52b212f08abaa128a0f9ed2395adb2d87ee767e8b11e1779b4\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"54d91719ae2c82168d85aa12d407adf6bd43f9bdee3555851b624703974d9332\"" Jul 15 23:12:48.380006 containerd[1512]: time="2025-07-15T23:12:48.378824974Z" level=info msg="StartContainer for \"54d91719ae2c82168d85aa12d407adf6bd43f9bdee3555851b624703974d9332\"" Jul 15 23:12:48.383536 containerd[1512]: time="2025-07-15T23:12:48.383493339Z" level=info msg="connecting to shim 54d91719ae2c82168d85aa12d407adf6bd43f9bdee3555851b624703974d9332" address="unix:///run/containerd/s/11fd3d0895dddf053020f4056af5b0eabc5816b248d33aee073f3b2ac93a4b46" protocol=ttrpc version=3 Jul 15 23:12:48.409016 systemd[1]: Started cri-containerd-54d91719ae2c82168d85aa12d407adf6bd43f9bdee3555851b624703974d9332.scope - libcontainer container 54d91719ae2c82168d85aa12d407adf6bd43f9bdee3555851b624703974d9332. 
Jul 15 23:12:48.461203 containerd[1512]: time="2025-07-15T23:12:48.461087174Z" level=info msg="StartContainer for \"54d91719ae2c82168d85aa12d407adf6bd43f9bdee3555851b624703974d9332\" returns successfully" Jul 15 23:12:49.166213 kubelet[2668]: I0715 23:12:49.166146 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-h56fb" podStartSLOduration=2.166125327 podStartE2EDuration="2.166125327s" podCreationTimestamp="2025-07-15 23:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:12:49.165774879 +0000 UTC m=+6.200314829" watchObservedRunningTime="2025-07-15 23:12:49.166125327 +0000 UTC m=+6.200665237" Jul 15 23:12:50.968429 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount255120851.mount: Deactivated successfully. Jul 15 23:12:51.417880 containerd[1512]: time="2025-07-15T23:12:51.417805352Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:51.419094 containerd[1512]: time="2025-07-15T23:12:51.419042283Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610" Jul 15 23:12:51.420316 containerd[1512]: time="2025-07-15T23:12:51.420208399Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:51.425527 containerd[1512]: time="2025-07-15T23:12:51.424269541Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:12:51.425527 containerd[1512]: time="2025-07-15T23:12:51.425315992Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id 
\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 3.142415718s" Jul 15 23:12:51.425527 containerd[1512]: time="2025-07-15T23:12:51.425358761Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\"" Jul 15 23:12:51.430331 containerd[1512]: time="2025-07-15T23:12:51.430293960Z" level=info msg="CreateContainer within sandbox \"d2e457d410b36301902c681c4607f7ce7b71f4b89493d6649bb0d3a54f3811a1\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jul 15 23:12:51.440896 containerd[1512]: time="2025-07-15T23:12:51.439525948Z" level=info msg="Container 57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:12:51.452819 containerd[1512]: time="2025-07-15T23:12:51.452695494Z" level=info msg="CreateContainer within sandbox \"d2e457d410b36301902c681c4607f7ce7b71f4b89493d6649bb0d3a54f3811a1\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586\"" Jul 15 23:12:51.455352 containerd[1512]: time="2025-07-15T23:12:51.455244690Z" level=info msg="StartContainer for \"57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586\"" Jul 15 23:12:51.460136 containerd[1512]: time="2025-07-15T23:12:51.460083149Z" level=info msg="connecting to shim 57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586" address="unix:///run/containerd/s/3fbc25577f74d3ccbb50896ec87d174916f1543fa1a673a86620f35ea84c88a2" protocol=ttrpc version=3 Jul 15 23:12:51.491176 systemd[1]: Started cri-containerd-57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586.scope - libcontainer container 
57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586. Jul 15 23:12:51.532042 containerd[1512]: time="2025-07-15T23:12:51.531993943Z" level=info msg="StartContainer for \"57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586\" returns successfully" Jul 15 23:12:52.178722 kubelet[2668]: I0715 23:12:52.178560 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-5bf8dfcb4-cvcr4" podStartSLOduration=2.032967602 podStartE2EDuration="5.178539028s" podCreationTimestamp="2025-07-15 23:12:47 +0000 UTC" firstStartedPulling="2025-07-15 23:12:48.281762455 +0000 UTC m=+5.316302365" lastFinishedPulling="2025-07-15 23:12:51.427333881 +0000 UTC m=+8.461873791" observedRunningTime="2025-07-15 23:12:52.178146593 +0000 UTC m=+9.212686543" watchObservedRunningTime="2025-07-15 23:12:52.178539028 +0000 UTC m=+9.213078978" Jul 15 23:12:57.760355 sudo[1763]: pam_unix(sudo:session): session closed for user root Jul 15 23:12:57.920657 sshd[1762]: Connection closed by 139.178.68.195 port 47792 Jul 15 23:12:57.922159 sshd-session[1760]: pam_unix(sshd:session): session closed for user core Jul 15 23:12:57.931241 systemd[1]: sshd@6-91.99.216.80:22-139.178.68.195:47792.service: Deactivated successfully. Jul 15 23:12:57.931916 systemd-logind[1488]: Session 7 logged out. Waiting for processes to exit. Jul 15 23:12:57.937647 systemd[1]: session-7.scope: Deactivated successfully. Jul 15 23:12:57.938157 systemd[1]: session-7.scope: Consumed 6.649s CPU time, 231.1M memory peak. Jul 15 23:12:57.941247 systemd-logind[1488]: Removed session 7. Jul 15 23:13:05.727503 systemd[1]: Created slice kubepods-besteffort-pod560a798b_e6b4_4d17_ae24_bcdcb98e53ae.slice - libcontainer container kubepods-besteffort-pod560a798b_e6b4_4d17_ae24_bcdcb98e53ae.slice. 
Jul 15 23:13:05.878105 systemd[1]: Created slice kubepods-besteffort-podb1fde930_810b_4c09_b6b7_224e5a6021a5.slice - libcontainer container kubepods-besteffort-podb1fde930_810b_4c09_b6b7_224e5a6021a5.slice. Jul 15 23:13:05.917859 kubelet[2668]: I0715 23:13:05.917791 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/560a798b-e6b4-4d17-ae24-bcdcb98e53ae-tigera-ca-bundle\") pod \"calico-typha-796dfd74c8-jmf6x\" (UID: \"560a798b-e6b4-4d17-ae24-bcdcb98e53ae\") " pod="calico-system/calico-typha-796dfd74c8-jmf6x" Jul 15 23:13:05.917859 kubelet[2668]: I0715 23:13:05.917834 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/560a798b-e6b4-4d17-ae24-bcdcb98e53ae-typha-certs\") pod \"calico-typha-796dfd74c8-jmf6x\" (UID: \"560a798b-e6b4-4d17-ae24-bcdcb98e53ae\") " pod="calico-system/calico-typha-796dfd74c8-jmf6x" Jul 15 23:13:05.918388 kubelet[2668]: I0715 23:13:05.918351 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jz4x\" (UniqueName: \"kubernetes.io/projected/560a798b-e6b4-4d17-ae24-bcdcb98e53ae-kube-api-access-8jz4x\") pod \"calico-typha-796dfd74c8-jmf6x\" (UID: \"560a798b-e6b4-4d17-ae24-bcdcb98e53ae\") " pod="calico-system/calico-typha-796dfd74c8-jmf6x" Jul 15 23:13:06.015382 kubelet[2668]: E0715 23:13:06.014819 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hddsp" podUID="2a38c65f-24e4-465a-afbd-242e66579eef" Jul 15 23:13:06.019432 kubelet[2668]: I0715 23:13:06.019384 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"policysync\" (UniqueName: \"kubernetes.io/host-path/b1fde930-810b-4c09-b6b7-224e5a6021a5-policysync\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019432 kubelet[2668]: I0715 23:13:06.019438 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/b1fde930-810b-4c09-b6b7-224e5a6021a5-node-certs\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019432 kubelet[2668]: I0715 23:13:06.019456 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1fde930-810b-4c09-b6b7-224e5a6021a5-tigera-ca-bundle\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019659 kubelet[2668]: I0715 23:13:06.019472 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7g748\" (UniqueName: \"kubernetes.io/projected/b1fde930-810b-4c09-b6b7-224e5a6021a5-kube-api-access-7g748\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019659 kubelet[2668]: I0715 23:13:06.019503 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trxvz\" (UniqueName: \"kubernetes.io/projected/2a38c65f-24e4-465a-afbd-242e66579eef-kube-api-access-trxvz\") pod \"csi-node-driver-hddsp\" (UID: \"2a38c65f-24e4-465a-afbd-242e66579eef\") " pod="calico-system/csi-node-driver-hddsp" Jul 15 23:13:06.019659 kubelet[2668]: I0715 23:13:06.019519 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: 
\"kubernetes.io/host-path/2a38c65f-24e4-465a-afbd-242e66579eef-varrun\") pod \"csi-node-driver-hddsp\" (UID: \"2a38c65f-24e4-465a-afbd-242e66579eef\") " pod="calico-system/csi-node-driver-hddsp" Jul 15 23:13:06.019659 kubelet[2668]: I0715 23:13:06.019533 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b1fde930-810b-4c09-b6b7-224e5a6021a5-lib-modules\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019659 kubelet[2668]: I0715 23:13:06.019551 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/b1fde930-810b-4c09-b6b7-224e5a6021a5-var-run-calico\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019764 kubelet[2668]: I0715 23:13:06.019568 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/b1fde930-810b-4c09-b6b7-224e5a6021a5-cni-bin-dir\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019764 kubelet[2668]: I0715 23:13:06.019582 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/b1fde930-810b-4c09-b6b7-224e5a6021a5-cni-log-dir\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019764 kubelet[2668]: I0715 23:13:06.019597 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: 
\"kubernetes.io/host-path/b1fde930-810b-4c09-b6b7-224e5a6021a5-flexvol-driver-host\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019764 kubelet[2668]: I0715 23:13:06.019613 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/2a38c65f-24e4-465a-afbd-242e66579eef-socket-dir\") pod \"csi-node-driver-hddsp\" (UID: \"2a38c65f-24e4-465a-afbd-242e66579eef\") " pod="calico-system/csi-node-driver-hddsp" Jul 15 23:13:06.019764 kubelet[2668]: I0715 23:13:06.019628 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b1fde930-810b-4c09-b6b7-224e5a6021a5-var-lib-calico\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019907 kubelet[2668]: I0715 23:13:06.019642 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/b1fde930-810b-4c09-b6b7-224e5a6021a5-xtables-lock\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019907 kubelet[2668]: I0715 23:13:06.019660 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2a38c65f-24e4-465a-afbd-242e66579eef-kubelet-dir\") pod \"csi-node-driver-hddsp\" (UID: \"2a38c65f-24e4-465a-afbd-242e66579eef\") " pod="calico-system/csi-node-driver-hddsp" Jul 15 23:13:06.019907 kubelet[2668]: I0715 23:13:06.019674 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: 
\"kubernetes.io/host-path/b1fde930-810b-4c09-b6b7-224e5a6021a5-cni-net-dir\") pod \"calico-node-sskmx\" (UID: \"b1fde930-810b-4c09-b6b7-224e5a6021a5\") " pod="calico-system/calico-node-sskmx" Jul 15 23:13:06.019907 kubelet[2668]: I0715 23:13:06.019690 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/2a38c65f-24e4-465a-afbd-242e66579eef-registration-dir\") pod \"csi-node-driver-hddsp\" (UID: \"2a38c65f-24e4-465a-afbd-242e66579eef\") " pod="calico-system/csi-node-driver-hddsp" Jul 15 23:13:06.121955 kubelet[2668]: E0715 23:13:06.121425 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.121955 kubelet[2668]: W0715 23:13:06.121890 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.122185 kubelet[2668]: E0715 23:13:06.122108 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.124032 kubelet[2668]: E0715 23:13:06.123985 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.124032 kubelet[2668]: W0715 23:13:06.124017 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.124032 kubelet[2668]: E0715 23:13:06.124045 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.124285 kubelet[2668]: E0715 23:13:06.124232 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.124285 kubelet[2668]: W0715 23:13:06.124284 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.124360 kubelet[2668]: E0715 23:13:06.124300 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.124532 kubelet[2668]: E0715 23:13:06.124516 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.124532 kubelet[2668]: W0715 23:13:06.124529 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.124631 kubelet[2668]: E0715 23:13:06.124614 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.124719 kubelet[2668]: E0715 23:13:06.124704 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.124719 kubelet[2668]: W0715 23:13:06.124716 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.124840 kubelet[2668]: E0715 23:13:06.124774 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.124976 kubelet[2668]: E0715 23:13:06.124955 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.124976 kubelet[2668]: W0715 23:13:06.124973 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.125109 kubelet[2668]: E0715 23:13:06.125007 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.125133 kubelet[2668]: E0715 23:13:06.125123 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.125133 kubelet[2668]: W0715 23:13:06.125130 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.125231 kubelet[2668]: E0715 23:13:06.125160 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.125374 kubelet[2668]: E0715 23:13:06.125342 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.125374 kubelet[2668]: W0715 23:13:06.125371 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.125530 kubelet[2668]: E0715 23:13:06.125510 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.125591 kubelet[2668]: E0715 23:13:06.125575 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.125591 kubelet[2668]: W0715 23:13:06.125586 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.125718 kubelet[2668]: E0715 23:13:06.125618 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.125718 kubelet[2668]: E0715 23:13:06.125710 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.125778 kubelet[2668]: W0715 23:13:06.125721 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.125814 kubelet[2668]: E0715 23:13:06.125801 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.125916 kubelet[2668]: E0715 23:13:06.125898 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.125916 kubelet[2668]: W0715 23:13:06.125910 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.126043 kubelet[2668]: E0715 23:13:06.125947 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.126043 kubelet[2668]: E0715 23:13:06.126030 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.126043 kubelet[2668]: W0715 23:13:06.126037 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.126138 kubelet[2668]: E0715 23:13:06.126114 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.126223 kubelet[2668]: E0715 23:13:06.126208 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.126223 kubelet[2668]: W0715 23:13:06.126220 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128077 kubelet[2668]: E0715 23:13:06.126358 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128077 kubelet[2668]: W0715 23:13:06.126365 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128077 kubelet[2668]: E0715 23:13:06.126465 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128077 kubelet[2668]: W0715 23:13:06.126471 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128077 kubelet[2668]: E0715 23:13:06.126561 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128077 kubelet[2668]: W0715 23:13:06.126566 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128077 kubelet[2668]: E0715 23:13:06.126656 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON 
input Jul 15 23:13:06.128077 kubelet[2668]: W0715 23:13:06.126662 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128077 kubelet[2668]: E0715 23:13:06.126747 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128077 kubelet[2668]: W0715 23:13:06.126752 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128077 kubelet[2668]: E0715 23:13:06.126863 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128329 kubelet[2668]: W0715 23:13:06.126871 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128329 kubelet[2668]: E0715 23:13:06.126881 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.128329 kubelet[2668]: E0715 23:13:06.126995 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128329 kubelet[2668]: W0715 23:13:06.127002 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128329 kubelet[2668]: E0715 23:13:06.127009 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.128329 kubelet[2668]: E0715 23:13:06.127166 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128329 kubelet[2668]: W0715 23:13:06.127173 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128329 kubelet[2668]: E0715 23:13:06.127180 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.128329 kubelet[2668]: E0715 23:13:06.127358 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128329 kubelet[2668]: W0715 23:13:06.127368 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128526 kubelet[2668]: E0715 23:13:06.127377 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.128526 kubelet[2668]: E0715 23:13:06.127482 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128526 kubelet[2668]: W0715 23:13:06.127488 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128526 kubelet[2668]: E0715 23:13:06.127494 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.128526 kubelet[2668]: E0715 23:13:06.127633 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128526 kubelet[2668]: W0715 23:13:06.127643 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128526 kubelet[2668]: E0715 23:13:06.127651 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.128526 kubelet[2668]: E0715 23:13:06.127812 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.128526 kubelet[2668]: W0715 23:13:06.127820 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.128526 kubelet[2668]: E0715 23:13:06.127829 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.128710 kubelet[2668]: E0715 23:13:06.127887 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.128710 kubelet[2668]: E0715 23:13:06.128038 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.128710 kubelet[2668]: E0715 23:13:06.128053 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.128710 kubelet[2668]: E0715 23:13:06.128060 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.128710 kubelet[2668]: E0715 23:13:06.128070 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.128710 kubelet[2668]: E0715 23:13:06.128085 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.129061 kubelet[2668]: E0715 23:13:06.128942 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.129061 kubelet[2668]: W0715 23:13:06.128957 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.129061 kubelet[2668]: E0715 23:13:06.128986 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.129268 kubelet[2668]: E0715 23:13:06.129231 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.129268 kubelet[2668]: W0715 23:13:06.129265 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.129342 kubelet[2668]: E0715 23:13:06.129283 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.129459 kubelet[2668]: E0715 23:13:06.129446 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.129459 kubelet[2668]: W0715 23:13:06.129458 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.129520 kubelet[2668]: E0715 23:13:06.129471 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.129694 kubelet[2668]: E0715 23:13:06.129675 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.129694 kubelet[2668]: W0715 23:13:06.129689 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.129798 kubelet[2668]: E0715 23:13:06.129783 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.131263 kubelet[2668]: E0715 23:13:06.129814 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.131263 kubelet[2668]: W0715 23:13:06.129993 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.131263 kubelet[2668]: E0715 23:13:06.130132 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.131263 kubelet[2668]: W0715 23:13:06.130140 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.131263 kubelet[2668]: E0715 23:13:06.130326 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.131263 kubelet[2668]: W0715 23:13:06.130336 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: 
[init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.131263 kubelet[2668]: E0715 23:13:06.130445 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.131263 kubelet[2668]: W0715 23:13:06.130451 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.131263 kubelet[2668]: E0715 23:13:06.130561 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.131263 kubelet[2668]: W0715 23:13:06.130566 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.131521 kubelet[2668]: E0715 23:13:06.130575 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.131521 kubelet[2668]: E0715 23:13:06.130594 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.131521 kubelet[2668]: E0715 23:13:06.130721 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.131521 kubelet[2668]: W0715 23:13:06.130733 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.131521 kubelet[2668]: E0715 23:13:06.130742 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.131521 kubelet[2668]: E0715 23:13:06.130756 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.131521 kubelet[2668]: E0715 23:13:06.130953 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.131521 kubelet[2668]: W0715 23:13:06.130961 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.131521 kubelet[2668]: E0715 23:13:06.130970 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.131521 kubelet[2668]: E0715 23:13:06.130984 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.131706 kubelet[2668]: E0715 23:13:06.130995 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.131706 kubelet[2668]: E0715 23:13:06.131135 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.131706 kubelet[2668]: W0715 23:13:06.131142 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.131706 kubelet[2668]: E0715 23:13:06.131150 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.138646 kubelet[2668]: E0715 23:13:06.138577 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.138646 kubelet[2668]: W0715 23:13:06.138596 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.138646 kubelet[2668]: E0715 23:13:06.138612 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.191839 kubelet[2668]: E0715 23:13:06.191441 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.191839 kubelet[2668]: W0715 23:13:06.191468 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.191839 kubelet[2668]: E0715 23:13:06.191492 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:06.195461 kubelet[2668]: E0715 23:13:06.195374 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:06.195461 kubelet[2668]: W0715 23:13:06.195399 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:06.195461 kubelet[2668]: E0715 23:13:06.195421 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:06.332831 containerd[1512]: time="2025-07-15T23:13:06.332642667Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-796dfd74c8-jmf6x,Uid:560a798b-e6b4-4d17-ae24-bcdcb98e53ae,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:06.363879 containerd[1512]: time="2025-07-15T23:13:06.363804362Z" level=info msg="connecting to shim 7c31b4a9d4d9af3c8dd9c9c510a99fd50b6838ae1d304377a98e1bf8956e470f" address="unix:///run/containerd/s/fa123970e828b70910b40b0558529b3ffddc662ef8297acfaf379ab1078b71ba" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:06.389322 systemd[1]: Started cri-containerd-7c31b4a9d4d9af3c8dd9c9c510a99fd50b6838ae1d304377a98e1bf8956e470f.scope - libcontainer container 7c31b4a9d4d9af3c8dd9c9c510a99fd50b6838ae1d304377a98e1bf8956e470f. Jul 15 23:13:06.442431 containerd[1512]: time="2025-07-15T23:13:06.442385884Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-796dfd74c8-jmf6x,Uid:560a798b-e6b4-4d17-ae24-bcdcb98e53ae,Namespace:calico-system,Attempt:0,} returns sandbox id \"7c31b4a9d4d9af3c8dd9c9c510a99fd50b6838ae1d304377a98e1bf8956e470f\"" Jul 15 23:13:06.448516 containerd[1512]: time="2025-07-15T23:13:06.448432479Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Jul 15 23:13:06.484009 containerd[1512]: time="2025-07-15T23:13:06.483448608Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-sskmx,Uid:b1fde930-810b-4c09-b6b7-224e5a6021a5,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:06.515426 containerd[1512]: time="2025-07-15T23:13:06.515387295Z" level=info msg="connecting to shim c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a" address="unix:///run/containerd/s/ea767c74e6e1cbe0b09e85d791c110057ca3d59eadcba66ba79f15bde271e199" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:06.550088 systemd[1]: Started cri-containerd-c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a.scope - 
libcontainer container c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a. Jul 15 23:13:06.607076 containerd[1512]: time="2025-07-15T23:13:06.606875800Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-sskmx,Uid:b1fde930-810b-4c09-b6b7-224e5a6021a5,Namespace:calico-system,Attempt:0,} returns sandbox id \"c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a\"" Jul 15 23:13:07.859700 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1374984312.mount: Deactivated successfully. Jul 15 23:13:08.091077 kubelet[2668]: E0715 23:13:08.089662 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hddsp" podUID="2a38c65f-24e4-465a-afbd-242e66579eef" Jul 15 23:13:08.942900 containerd[1512]: time="2025-07-15T23:13:08.941927163Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:08.943462 containerd[1512]: time="2025-07-15T23:13:08.943381125Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207" Jul 15 23:13:08.944503 containerd[1512]: time="2025-07-15T23:13:08.944448334Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:08.949107 containerd[1512]: time="2025-07-15T23:13:08.948922667Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:08.949573 containerd[1512]: time="2025-07-15T23:13:08.949451271Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 2.500958787s" Jul 15 23:13:08.949809 containerd[1512]: time="2025-07-15T23:13:08.949781539Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\"" Jul 15 23:13:08.952882 containerd[1512]: time="2025-07-15T23:13:08.952106853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Jul 15 23:13:08.967890 containerd[1512]: time="2025-07-15T23:13:08.967831165Z" level=info msg="CreateContainer within sandbox \"7c31b4a9d4d9af3c8dd9c9c510a99fd50b6838ae1d304377a98e1bf8956e470f\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jul 15 23:13:08.980870 containerd[1512]: time="2025-07-15T23:13:08.979050501Z" level=info msg="Container 23f37f47e548face6d5cc193ebb9ea63bfc795067c1220130447b8a2028812d6: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:08.988885 containerd[1512]: time="2025-07-15T23:13:08.988811795Z" level=info msg="CreateContainer within sandbox \"7c31b4a9d4d9af3c8dd9c9c510a99fd50b6838ae1d304377a98e1bf8956e470f\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"23f37f47e548face6d5cc193ebb9ea63bfc795067c1220130447b8a2028812d6\"" Jul 15 23:13:08.992505 containerd[1512]: time="2025-07-15T23:13:08.992427177Z" level=info msg="StartContainer for \"23f37f47e548face6d5cc193ebb9ea63bfc795067c1220130447b8a2028812d6\"" Jul 15 23:13:08.995008 containerd[1512]: time="2025-07-15T23:13:08.994951428Z" level=info msg="connecting to shim 23f37f47e548face6d5cc193ebb9ea63bfc795067c1220130447b8a2028812d6" 
address="unix:///run/containerd/s/fa123970e828b70910b40b0558529b3ffddc662ef8297acfaf379ab1078b71ba" protocol=ttrpc version=3 Jul 15 23:13:09.029100 systemd[1]: Started cri-containerd-23f37f47e548face6d5cc193ebb9ea63bfc795067c1220130447b8a2028812d6.scope - libcontainer container 23f37f47e548face6d5cc193ebb9ea63bfc795067c1220130447b8a2028812d6. Jul 15 23:13:09.086456 containerd[1512]: time="2025-07-15T23:13:09.086392698Z" level=info msg="StartContainer for \"23f37f47e548face6d5cc193ebb9ea63bfc795067c1220130447b8a2028812d6\" returns successfully" Jul 15 23:13:09.240338 kubelet[2668]: I0715 23:13:09.240112 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-796dfd74c8-jmf6x" podStartSLOduration=1.7352619040000001 podStartE2EDuration="4.24009511s" podCreationTimestamp="2025-07-15 23:13:05 +0000 UTC" firstStartedPulling="2025-07-15 23:13:06.446619512 +0000 UTC m=+23.481159422" lastFinishedPulling="2025-07-15 23:13:08.951452718 +0000 UTC m=+25.985992628" observedRunningTime="2025-07-15 23:13:09.238789966 +0000 UTC m=+26.273329916" watchObservedRunningTime="2025-07-15 23:13:09.24009511 +0000 UTC m=+26.274635020" Jul 15 23:13:09.245999 kubelet[2668]: E0715 23:13:09.245936 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:09.245999 kubelet[2668]: W0715 23:13:09.245988 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:09.246257 kubelet[2668]: E0715 23:13:09.246011 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:09.246800 kubelet[2668]: E0715 23:13:09.246304 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:09.246800 kubelet[2668]: W0715 23:13:09.246321 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:09.246800 kubelet[2668]: E0715 23:13:09.246334 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:09.246800 kubelet[2668]: E0715 23:13:09.246611 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:09.246800 kubelet[2668]: W0715 23:13:09.246633 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:09.246800 kubelet[2668]: E0715 23:13:09.246645 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:10.089204 kubelet[2668]: E0715 23:13:10.089091 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hddsp" podUID="2a38c65f-24e4-465a-afbd-242e66579eef" Jul 15 23:13:10.223206 kubelet[2668]: I0715 23:13:10.223127 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:13:10.259748 kubelet[2668]: E0715 23:13:10.259689 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.259748 kubelet[2668]: W0715 23:13:10.259743 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.260251 kubelet[2668]: E0715 23:13:10.259827 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:10.260251 kubelet[2668]: E0715 23:13:10.260220 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.260251 kubelet[2668]: W0715 23:13:10.260241 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.260361 kubelet[2668]: E0715 23:13:10.260262 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:10.354694 kubelet[2668]: E0715 23:13:10.354363 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.355154 kubelet[2668]: W0715 23:13:10.354373 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.355154 kubelet[2668]: E0715 23:13:10.354385 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:10.355154 kubelet[2668]: E0715 23:13:10.354559 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.355154 kubelet[2668]: W0715 23:13:10.354571 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.355154 kubelet[2668]: E0715 23:13:10.354583 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:10.355425 kubelet[2668]: E0715 23:13:10.355398 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.355425 kubelet[2668]: W0715 23:13:10.355419 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.355775 kubelet[2668]: E0715 23:13:10.355451 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:10.356179 kubelet[2668]: E0715 23:13:10.355990 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.356179 kubelet[2668]: W0715 23:13:10.356016 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.356179 kubelet[2668]: E0715 23:13:10.356164 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:10.356599 kubelet[2668]: E0715 23:13:10.356551 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.356599 kubelet[2668]: W0715 23:13:10.356572 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.356952 kubelet[2668]: E0715 23:13:10.356927 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:10.357126 kubelet[2668]: E0715 23:13:10.356942 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.357126 kubelet[2668]: W0715 23:13:10.357007 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.357394 kubelet[2668]: E0715 23:13:10.357367 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.357474 kubelet[2668]: W0715 23:13:10.357440 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.357553 kubelet[2668]: E0715 23:13:10.357513 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:10.357770 kubelet[2668]: E0715 23:13:10.357036 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:10.357934 kubelet[2668]: E0715 23:13:10.357829 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.358051 kubelet[2668]: W0715 23:13:10.357840 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.358051 kubelet[2668]: E0715 23:13:10.358007 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:10.358284 kubelet[2668]: E0715 23:13:10.358262 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.358284 kubelet[2668]: W0715 23:13:10.358280 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.358347 kubelet[2668]: E0715 23:13:10.358294 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:10.358484 kubelet[2668]: E0715 23:13:10.358433 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.358484 kubelet[2668]: W0715 23:13:10.358445 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.358484 kubelet[2668]: E0715 23:13:10.358456 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:10.358679 kubelet[2668]: E0715 23:13:10.358648 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.358679 kubelet[2668]: W0715 23:13:10.358664 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.358793 kubelet[2668]: E0715 23:13:10.358773 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:10.359008 kubelet[2668]: E0715 23:13:10.358903 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.359096 kubelet[2668]: W0715 23:13:10.359080 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.359300 kubelet[2668]: E0715 23:13:10.359166 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:10.359471 kubelet[2668]: E0715 23:13:10.359456 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.359607 kubelet[2668]: W0715 23:13:10.359536 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.359607 kubelet[2668]: E0715 23:13:10.359561 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:10.359776 kubelet[2668]: E0715 23:13:10.359750 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.359819 kubelet[2668]: W0715 23:13:10.359776 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.359819 kubelet[2668]: E0715 23:13:10.359799 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jul 15 23:13:10.360321 kubelet[2668]: E0715 23:13:10.360262 2668 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jul 15 23:13:10.360321 kubelet[2668]: W0715 23:13:10.360278 2668 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jul 15 23:13:10.360321 kubelet[2668]: E0715 23:13:10.360295 2668 plugins.go:691] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jul 15 23:13:10.439984 containerd[1512]: time="2025-07-15T23:13:10.439913813Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:10.441728 containerd[1512]: time="2025-07-15T23:13:10.441665106Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981" Jul 15 23:13:10.443423 containerd[1512]: time="2025-07-15T23:13:10.443110136Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:10.448486 containerd[1512]: time="2025-07-15T23:13:10.448414300Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:10.450299 containerd[1512]: time="2025-07-15T23:13:10.450164674Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 1.498019378s" Jul 15 23:13:10.450299 containerd[1512]: time="2025-07-15T23:13:10.450203677Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\"" Jul 15 23:13:10.455117 containerd[1512]: time="2025-07-15T23:13:10.455075648Z" level=info msg="CreateContainer within sandbox \"c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jul 15 23:13:10.470808 containerd[1512]: time="2025-07-15T23:13:10.469955382Z" level=info msg="Container a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:10.481951 containerd[1512]: time="2025-07-15T23:13:10.480864974Z" level=info msg="CreateContainer within sandbox \"c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39\"" Jul 15 23:13:10.483483 containerd[1512]: time="2025-07-15T23:13:10.483142187Z" level=info msg="StartContainer for \"a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39\"" Jul 15 23:13:10.487638 containerd[1512]: time="2025-07-15T23:13:10.487560684Z" level=info msg="connecting to shim a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39" address="unix:///run/containerd/s/ea767c74e6e1cbe0b09e85d791c110057ca3d59eadcba66ba79f15bde271e199" protocol=ttrpc version=3 Jul 15 23:13:10.530052 systemd[1]: Started cri-containerd-a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39.scope - libcontainer container a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39. Jul 15 23:13:10.595653 containerd[1512]: time="2025-07-15T23:13:10.595604159Z" level=info msg="StartContainer for \"a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39\" returns successfully" Jul 15 23:13:10.620341 systemd[1]: cri-containerd-a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39.scope: Deactivated successfully. 
Jul 15 23:13:10.625807 containerd[1512]: time="2025-07-15T23:13:10.625655329Z" level=info msg="received exit event container_id:\"a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39\" id:\"a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39\" pid:3343 exited_at:{seconds:1752621190 nanos:625279820}" Jul 15 23:13:10.626428 containerd[1512]: time="2025-07-15T23:13:10.626389105Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39\" id:\"a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39\" pid:3343 exited_at:{seconds:1752621190 nanos:625279820}" Jul 15 23:13:10.661523 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a11b016d598f618a423be7e763e93b50df5b36640ee86db9d13f637831e4fb39-rootfs.mount: Deactivated successfully. Jul 15 23:13:11.232682 containerd[1512]: time="2025-07-15T23:13:11.231918299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Jul 15 23:13:12.089125 kubelet[2668]: E0715 23:13:12.088981 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hddsp" podUID="2a38c65f-24e4-465a-afbd-242e66579eef" Jul 15 23:13:13.566481 kubelet[2668]: I0715 23:13:13.566084 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:13:14.088836 kubelet[2668]: E0715 23:13:14.088746 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-hddsp" podUID="2a38c65f-24e4-465a-afbd-242e66579eef" Jul 15 23:13:14.858404 containerd[1512]: time="2025-07-15T23:13:14.858348982Z" level=info 
msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:14.860118 containerd[1512]: time="2025-07-15T23:13:14.860074052Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320" Jul 15 23:13:14.860587 containerd[1512]: time="2025-07-15T23:13:14.860550843Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:14.862950 containerd[1512]: time="2025-07-15T23:13:14.862921276Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:14.864382 containerd[1512]: time="2025-07-15T23:13:14.863667683Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 3.631709422s" Jul 15 23:13:14.864382 containerd[1512]: time="2025-07-15T23:13:14.864112552Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\"" Jul 15 23:13:14.867113 containerd[1512]: time="2025-07-15T23:13:14.867087223Z" level=info msg="CreateContainer within sandbox \"c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jul 15 23:13:14.878020 containerd[1512]: time="2025-07-15T23:13:14.877979844Z" level=info msg="Container 6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7: CDI devices from CRI 
Config.CDIDevices: []" Jul 15 23:13:14.896915 containerd[1512]: time="2025-07-15T23:13:14.896828136Z" level=info msg="CreateContainer within sandbox \"c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7\"" Jul 15 23:13:14.898077 containerd[1512]: time="2025-07-15T23:13:14.898024372Z" level=info msg="StartContainer for \"6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7\"" Jul 15 23:13:14.902707 containerd[1512]: time="2025-07-15T23:13:14.902150238Z" level=info msg="connecting to shim 6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7" address="unix:///run/containerd/s/ea767c74e6e1cbe0b09e85d791c110057ca3d59eadcba66ba79f15bde271e199" protocol=ttrpc version=3 Jul 15 23:13:14.930068 systemd[1]: Started cri-containerd-6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7.scope - libcontainer container 6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7. Jul 15 23:13:14.977397 containerd[1512]: time="2025-07-15T23:13:14.977234465Z" level=info msg="StartContainer for \"6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7\" returns successfully" Jul 15 23:13:15.493427 containerd[1512]: time="2025-07-15T23:13:15.493364363Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jul 15 23:13:15.496553 systemd[1]: cri-containerd-6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7.scope: Deactivated successfully. Jul 15 23:13:15.496931 systemd[1]: cri-containerd-6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7.scope: Consumed 507ms CPU time, 185.6M memory peak, 165.8M written to disk. 
Jul 15 23:13:15.501020 containerd[1512]: time="2025-07-15T23:13:15.500965033Z" level=info msg="received exit event container_id:\"6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7\" id:\"6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7\" pid:3404 exited_at:{seconds:1752621195 nanos:500591650}" Jul 15 23:13:15.501733 containerd[1512]: time="2025-07-15T23:13:15.501632714Z" level=info msg="TaskExit event in podsandbox handler container_id:\"6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7\" id:\"6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7\" pid:3404 exited_at:{seconds:1752621195 nanos:500591650}" Jul 15 23:13:15.529078 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-6e162f52c1584aa5797a057b4af3a590733a40053b2f9ff22977697f0e8deed7-rootfs.mount: Deactivated successfully. Jul 15 23:13:15.569406 kubelet[2668]: I0715 23:13:15.567426 2668 kubelet_node_status.go:488] "Fast updating node status as it just became ready" Jul 15 23:13:15.628114 systemd[1]: Created slice kubepods-burstable-pod67a52572_ea42_4651_8164_6aaa7db16b6f.slice - libcontainer container kubepods-burstable-pod67a52572_ea42_4651_8164_6aaa7db16b6f.slice. 
Jul 15 23:13:15.633493 kubelet[2668]: W0715 23:13:15.633377 2668 reflector.go:561] object-"calico-apiserver"/"calico-apiserver-certs": failed to list *v1.Secret: secrets "calico-apiserver-certs" is forbidden: User "system:node:ci-4372-0-1-n-91aeaf5bee" cannot list resource "secrets" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4372-0-1-n-91aeaf5bee' and this object Jul 15 23:13:15.633683 kubelet[2668]: E0715 23:13:15.633503 2668 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"calico-apiserver-certs\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"calico-apiserver-certs\" is forbidden: User \"system:node:ci-4372-0-1-n-91aeaf5bee\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4372-0-1-n-91aeaf5bee' and this object" logger="UnhandledError" Jul 15 23:13:15.633683 kubelet[2668]: W0715 23:13:15.633572 2668 reflector.go:561] object-"calico-apiserver"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:ci-4372-0-1-n-91aeaf5bee" cannot list resource "configmaps" in API group "" in the namespace "calico-apiserver": no relationship found between node 'ci-4372-0-1-n-91aeaf5bee' and this object Jul 15 23:13:15.633683 kubelet[2668]: E0715 23:13:15.633589 2668 reflector.go:158] "Unhandled Error" err="object-\"calico-apiserver\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4372-0-1-n-91aeaf5bee\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-apiserver\": no relationship found between node 'ci-4372-0-1-n-91aeaf5bee' and this object" logger="UnhandledError" Jul 15 23:13:15.640365 systemd[1]: Created slice kubepods-burstable-pod19596002_f6b8_45b5_b851_b0ec002e6602.slice - libcontainer container 
kubepods-burstable-pod19596002_f6b8_45b5_b851_b0ec002e6602.slice. Jul 15 23:13:15.648273 systemd[1]: Created slice kubepods-besteffort-pod1a4cf393_394d_46d2_b039_ad46900b55f7.slice - libcontainer container kubepods-besteffort-pod1a4cf393_394d_46d2_b039_ad46900b55f7.slice. Jul 15 23:13:15.666906 systemd[1]: Created slice kubepods-besteffort-pod094991fd_9793_44ba_9e9f_9e777da29e64.slice - libcontainer container kubepods-besteffort-pod094991fd_9793_44ba_9e9f_9e777da29e64.slice. Jul 15 23:13:15.678801 systemd[1]: Created slice kubepods-besteffort-pod29bdaeaf_6929_486a_a35b_f522167f96fa.slice - libcontainer container kubepods-besteffort-pod29bdaeaf_6929_486a_a35b_f522167f96fa.slice. Jul 15 23:13:15.685402 systemd[1]: Created slice kubepods-besteffort-pod673144b1_613a_4f74_af82_9d2cb7db4571.slice - libcontainer container kubepods-besteffort-pod673144b1_613a_4f74_af82_9d2cb7db4571.slice. Jul 15 23:13:15.693662 systemd[1]: Created slice kubepods-besteffort-pod36e81425_6b32_4185_bd01_a2d30ff92c2c.slice - libcontainer container kubepods-besteffort-pod36e81425_6b32_4185_bd01_a2d30ff92c2c.slice. Jul 15 23:13:15.703891 systemd[1]: Created slice kubepods-besteffort-podec396930_4143_4faa_8947_7b5623e308ef.slice - libcontainer container kubepods-besteffort-podec396930_4143_4faa_8947_7b5623e308ef.slice. 
Jul 15 23:13:15.794049 kubelet[2668]: I0715 23:13:15.793681 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxrrt\" (UniqueName: \"kubernetes.io/projected/ec396930-4143-4faa-8947-7b5623e308ef-kube-api-access-dxrrt\") pod \"whisker-689b6ffdb4-bpvvf\" (UID: \"ec396930-4143-4faa-8947-7b5623e308ef\") " pod="calico-system/whisker-689b6ffdb4-bpvvf" Jul 15 23:13:15.796951 kubelet[2668]: I0715 23:13:15.794280 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ec396930-4143-4faa-8947-7b5623e308ef-whisker-backend-key-pair\") pod \"whisker-689b6ffdb4-bpvvf\" (UID: \"ec396930-4143-4faa-8947-7b5623e308ef\") " pod="calico-system/whisker-689b6ffdb4-bpvvf" Jul 15 23:13:15.796951 kubelet[2668]: I0715 23:13:15.795509 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/29bdaeaf-6929-486a-a35b-f522167f96fa-calico-apiserver-certs\") pod \"calico-apiserver-f5986dd7d-tb5xs\" (UID: \"29bdaeaf-6929-486a-a35b-f522167f96fa\") " pod="calico-apiserver/calico-apiserver-f5986dd7d-tb5xs" Jul 15 23:13:15.796951 kubelet[2668]: I0715 23:13:15.795583 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/19596002-f6b8-45b5-b851-b0ec002e6602-config-volume\") pod \"coredns-7c65d6cfc9-zhnvj\" (UID: \"19596002-f6b8-45b5-b851-b0ec002e6602\") " pod="kube-system/coredns-7c65d6cfc9-zhnvj" Jul 15 23:13:15.796951 kubelet[2668]: I0715 23:13:15.795626 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kkq7\" (UniqueName: \"kubernetes.io/projected/29bdaeaf-6929-486a-a35b-f522167f96fa-kube-api-access-6kkq7\") pod \"calico-apiserver-f5986dd7d-tb5xs\" (UID: 
\"29bdaeaf-6929-486a-a35b-f522167f96fa\") " pod="calico-apiserver/calico-apiserver-f5986dd7d-tb5xs" Jul 15 23:13:15.796951 kubelet[2668]: I0715 23:13:15.795664 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvx7r\" (UniqueName: \"kubernetes.io/projected/19596002-f6b8-45b5-b851-b0ec002e6602-kube-api-access-bvx7r\") pod \"coredns-7c65d6cfc9-zhnvj\" (UID: \"19596002-f6b8-45b5-b851-b0ec002e6602\") " pod="kube-system/coredns-7c65d6cfc9-zhnvj" Jul 15 23:13:15.797561 kubelet[2668]: I0715 23:13:15.795733 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/673144b1-613a-4f74-af82-9d2cb7db4571-config\") pod \"goldmane-58fd7646b9-mz25w\" (UID: \"673144b1-613a-4f74-af82-9d2cb7db4571\") " pod="calico-system/goldmane-58fd7646b9-mz25w" Jul 15 23:13:15.797561 kubelet[2668]: I0715 23:13:15.795779 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/67a52572-ea42-4651-8164-6aaa7db16b6f-config-volume\") pod \"coredns-7c65d6cfc9-gkk8k\" (UID: \"67a52572-ea42-4651-8164-6aaa7db16b6f\") " pod="kube-system/coredns-7c65d6cfc9-gkk8k" Jul 15 23:13:15.797561 kubelet[2668]: I0715 23:13:15.795862 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/673144b1-613a-4f74-af82-9d2cb7db4571-goldmane-ca-bundle\") pod \"goldmane-58fd7646b9-mz25w\" (UID: \"673144b1-613a-4f74-af82-9d2cb7db4571\") " pod="calico-system/goldmane-58fd7646b9-mz25w" Jul 15 23:13:15.797561 kubelet[2668]: I0715 23:13:15.795907 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1a4cf393-394d-46d2-b039-ad46900b55f7-calico-apiserver-certs\") pod 
\"calico-apiserver-f5986dd7d-qm2lh\" (UID: \"1a4cf393-394d-46d2-b039-ad46900b55f7\") " pod="calico-apiserver/calico-apiserver-f5986dd7d-qm2lh" Jul 15 23:13:15.797561 kubelet[2668]: I0715 23:13:15.795951 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgq6j\" (UniqueName: \"kubernetes.io/projected/67a52572-ea42-4651-8164-6aaa7db16b6f-kube-api-access-xgq6j\") pod \"coredns-7c65d6cfc9-gkk8k\" (UID: \"67a52572-ea42-4651-8164-6aaa7db16b6f\") " pod="kube-system/coredns-7c65d6cfc9-gkk8k" Jul 15 23:13:15.798157 kubelet[2668]: I0715 23:13:15.795999 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36e81425-6b32-4185-bd01-a2d30ff92c2c-tigera-ca-bundle\") pod \"calico-kube-controllers-8bf995db6-rbt25\" (UID: \"36e81425-6b32-4185-bd01-a2d30ff92c2c\") " pod="calico-system/calico-kube-controllers-8bf995db6-rbt25" Jul 15 23:13:15.798157 kubelet[2668]: I0715 23:13:15.796036 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec396930-4143-4faa-8947-7b5623e308ef-whisker-ca-bundle\") pod \"whisker-689b6ffdb4-bpvvf\" (UID: \"ec396930-4143-4faa-8947-7b5623e308ef\") " pod="calico-system/whisker-689b6ffdb4-bpvvf" Jul 15 23:13:15.798157 kubelet[2668]: I0715 23:13:15.796078 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/094991fd-9793-44ba-9e9f-9e777da29e64-calico-apiserver-certs\") pod \"calico-apiserver-797fd56c96-7vvww\" (UID: \"094991fd-9793-44ba-9e9f-9e777da29e64\") " pod="calico-apiserver/calico-apiserver-797fd56c96-7vvww" Jul 15 23:13:15.798157 kubelet[2668]: I0715 23:13:15.796118 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/673144b1-613a-4f74-af82-9d2cb7db4571-goldmane-key-pair\") pod \"goldmane-58fd7646b9-mz25w\" (UID: \"673144b1-613a-4f74-af82-9d2cb7db4571\") " pod="calico-system/goldmane-58fd7646b9-mz25w" Jul 15 23:13:15.798157 kubelet[2668]: I0715 23:13:15.796161 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzhs2\" (UniqueName: \"kubernetes.io/projected/36e81425-6b32-4185-bd01-a2d30ff92c2c-kube-api-access-tzhs2\") pod \"calico-kube-controllers-8bf995db6-rbt25\" (UID: \"36e81425-6b32-4185-bd01-a2d30ff92c2c\") " pod="calico-system/calico-kube-controllers-8bf995db6-rbt25" Jul 15 23:13:15.798418 kubelet[2668]: I0715 23:13:15.796203 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s9xv\" (UniqueName: \"kubernetes.io/projected/673144b1-613a-4f74-af82-9d2cb7db4571-kube-api-access-7s9xv\") pod \"goldmane-58fd7646b9-mz25w\" (UID: \"673144b1-613a-4f74-af82-9d2cb7db4571\") " pod="calico-system/goldmane-58fd7646b9-mz25w" Jul 15 23:13:15.798418 kubelet[2668]: I0715 23:13:15.796240 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4d72z\" (UniqueName: \"kubernetes.io/projected/094991fd-9793-44ba-9e9f-9e777da29e64-kube-api-access-4d72z\") pod \"calico-apiserver-797fd56c96-7vvww\" (UID: \"094991fd-9793-44ba-9e9f-9e777da29e64\") " pod="calico-apiserver/calico-apiserver-797fd56c96-7vvww" Jul 15 23:13:15.798418 kubelet[2668]: I0715 23:13:15.796276 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knfns\" (UniqueName: \"kubernetes.io/projected/1a4cf393-394d-46d2-b039-ad46900b55f7-kube-api-access-knfns\") pod \"calico-apiserver-f5986dd7d-qm2lh\" (UID: \"1a4cf393-394d-46d2-b039-ad46900b55f7\") " pod="calico-apiserver/calico-apiserver-f5986dd7d-qm2lh" Jul 15 
23:13:15.951091 containerd[1512]: time="2025-07-15T23:13:15.951052152Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zhnvj,Uid:19596002-f6b8-45b5-b851-b0ec002e6602,Namespace:kube-system,Attempt:0,}" Jul 15 23:13:15.992951 containerd[1512]: time="2025-07-15T23:13:15.992740927Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-mz25w,Uid:673144b1-613a-4f74-af82-9d2cb7db4571,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:16.000974 containerd[1512]: time="2025-07-15T23:13:16.000910871Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8bf995db6-rbt25,Uid:36e81425-6b32-4185-bd01-a2d30ff92c2c,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:16.012464 containerd[1512]: time="2025-07-15T23:13:16.012357791Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-689b6ffdb4-bpvvf,Uid:ec396930-4143-4faa-8947-7b5623e308ef,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:16.085726 containerd[1512]: time="2025-07-15T23:13:16.085336685Z" level=error msg="Failed to destroy network for sandbox \"659560c749eabac3fc9f6c9c1054574fe49e6e0160a69c1cd4f50fa395330790\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.088699 containerd[1512]: time="2025-07-15T23:13:16.088654242Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zhnvj,Uid:19596002-f6b8-45b5-b851-b0ec002e6602,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"659560c749eabac3fc9f6c9c1054574fe49e6e0160a69c1cd4f50fa395330790\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.089383 kubelet[2668]: E0715 23:13:16.089285 
2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"659560c749eabac3fc9f6c9c1054574fe49e6e0160a69c1cd4f50fa395330790\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.089480 kubelet[2668]: E0715 23:13:16.089451 2668 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"659560c749eabac3fc9f6c9c1054574fe49e6e0160a69c1cd4f50fa395330790\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-zhnvj" Jul 15 23:13:16.089678 kubelet[2668]: E0715 23:13:16.089472 2668 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"659560c749eabac3fc9f6c9c1054574fe49e6e0160a69c1cd4f50fa395330790\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-zhnvj" Jul 15 23:13:16.089728 kubelet[2668]: E0715 23:13:16.089680 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-zhnvj_kube-system(19596002-f6b8-45b5-b851-b0ec002e6602)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-zhnvj_kube-system(19596002-f6b8-45b5-b851-b0ec002e6602)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"659560c749eabac3fc9f6c9c1054574fe49e6e0160a69c1cd4f50fa395330790\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-zhnvj" podUID="19596002-f6b8-45b5-b851-b0ec002e6602" Jul 15 23:13:16.101210 containerd[1512]: time="2025-07-15T23:13:16.101155305Z" level=error msg="Failed to destroy network for sandbox \"10dec63146ec779294ba99755cd8f726105e06ddc6dcc7969d5ffcd1bec0c57b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.101298 systemd[1]: Created slice kubepods-besteffort-pod2a38c65f_24e4_465a_afbd_242e66579eef.slice - libcontainer container kubepods-besteffort-pod2a38c65f_24e4_465a_afbd_242e66579eef.slice. Jul 15 23:13:16.105771 containerd[1512]: time="2025-07-15T23:13:16.105703495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hddsp,Uid:2a38c65f-24e4-465a-afbd-242e66579eef,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:16.106301 containerd[1512]: time="2025-07-15T23:13:16.106218646Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-mz25w,Uid:673144b1-613a-4f74-af82-9d2cb7db4571,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"10dec63146ec779294ba99755cd8f726105e06ddc6dcc7969d5ffcd1bec0c57b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.106558 kubelet[2668]: E0715 23:13:16.106404 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10dec63146ec779294ba99755cd8f726105e06ddc6dcc7969d5ffcd1bec0c57b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" Jul 15 23:13:16.106558 kubelet[2668]: E0715 23:13:16.106452 2668 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10dec63146ec779294ba99755cd8f726105e06ddc6dcc7969d5ffcd1bec0c57b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-mz25w" Jul 15 23:13:16.106558 kubelet[2668]: E0715 23:13:16.106469 2668 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"10dec63146ec779294ba99755cd8f726105e06ddc6dcc7969d5ffcd1bec0c57b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-58fd7646b9-mz25w" Jul 15 23:13:16.106650 kubelet[2668]: E0715 23:13:16.106516 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-58fd7646b9-mz25w_calico-system(673144b1-613a-4f74-af82-9d2cb7db4571)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-58fd7646b9-mz25w_calico-system(673144b1-613a-4f74-af82-9d2cb7db4571)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"10dec63146ec779294ba99755cd8f726105e06ddc6dcc7969d5ffcd1bec0c57b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-58fd7646b9-mz25w" podUID="673144b1-613a-4f74-af82-9d2cb7db4571" Jul 15 23:13:16.135660 containerd[1512]: time="2025-07-15T23:13:16.135539987Z" level=error msg="Failed to destroy network for sandbox 
\"4fb9c58fc79dca60ad7aab410a17c97dbe7538c9f2561fbdc721722574e6c18d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.137676 containerd[1512]: time="2025-07-15T23:13:16.137581588Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-689b6ffdb4-bpvvf,Uid:ec396930-4143-4faa-8947-7b5623e308ef,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fb9c58fc79dca60ad7aab410a17c97dbe7538c9f2561fbdc721722574e6c18d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.137895 kubelet[2668]: E0715 23:13:16.137794 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fb9c58fc79dca60ad7aab410a17c97dbe7538c9f2561fbdc721722574e6c18d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.137962 kubelet[2668]: E0715 23:13:16.137894 2668 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4fb9c58fc79dca60ad7aab410a17c97dbe7538c9f2561fbdc721722574e6c18d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-689b6ffdb4-bpvvf" Jul 15 23:13:16.137962 kubelet[2668]: E0715 23:13:16.137912 2668 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"4fb9c58fc79dca60ad7aab410a17c97dbe7538c9f2561fbdc721722574e6c18d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-689b6ffdb4-bpvvf" Jul 15 23:13:16.138020 kubelet[2668]: E0715 23:13:16.137972 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-689b6ffdb4-bpvvf_calico-system(ec396930-4143-4faa-8947-7b5623e308ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-689b6ffdb4-bpvvf_calico-system(ec396930-4143-4faa-8947-7b5623e308ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4fb9c58fc79dca60ad7aab410a17c97dbe7538c9f2561fbdc721722574e6c18d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-689b6ffdb4-bpvvf" podUID="ec396930-4143-4faa-8947-7b5623e308ef" Jul 15 23:13:16.148318 containerd[1512]: time="2025-07-15T23:13:16.148179498Z" level=error msg="Failed to destroy network for sandbox \"0bff24976e9365d8e7b298371ca24b49ae10d08f9adafed19232bbcec5dc27d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.150282 containerd[1512]: time="2025-07-15T23:13:16.150185177Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8bf995db6-rbt25,Uid:36e81425-6b32-4185-bd01-a2d30ff92c2c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bff24976e9365d8e7b298371ca24b49ae10d08f9adafed19232bbcec5dc27d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.150961 kubelet[2668]: E0715 23:13:16.150584 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bff24976e9365d8e7b298371ca24b49ae10d08f9adafed19232bbcec5dc27d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.151156 kubelet[2668]: E0715 23:13:16.150998 2668 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bff24976e9365d8e7b298371ca24b49ae10d08f9adafed19232bbcec5dc27d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8bf995db6-rbt25" Jul 15 23:13:16.151156 kubelet[2668]: E0715 23:13:16.151039 2668 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"0bff24976e9365d8e7b298371ca24b49ae10d08f9adafed19232bbcec5dc27d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-8bf995db6-rbt25" Jul 15 23:13:16.151156 kubelet[2668]: E0715 23:13:16.151118 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-8bf995db6-rbt25_calico-system(36e81425-6b32-4185-bd01-a2d30ff92c2c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-8bf995db6-rbt25_calico-system(36e81425-6b32-4185-bd01-a2d30ff92c2c)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"0bff24976e9365d8e7b298371ca24b49ae10d08f9adafed19232bbcec5dc27d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-8bf995db6-rbt25" podUID="36e81425-6b32-4185-bd01-a2d30ff92c2c" Jul 15 23:13:16.190832 containerd[1512]: time="2025-07-15T23:13:16.190752106Z" level=error msg="Failed to destroy network for sandbox \"ed3b5f9b3ee84cb5d80115e6692c8e8523f2f2c83fa543e5a93b33a47e72db9c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.193211 containerd[1512]: time="2025-07-15T23:13:16.193138568Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hddsp,Uid:2a38c65f-24e4-465a-afbd-242e66579eef,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed3b5f9b3ee84cb5d80115e6692c8e8523f2f2c83fa543e5a93b33a47e72db9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.193546 kubelet[2668]: E0715 23:13:16.193482 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed3b5f9b3ee84cb5d80115e6692c8e8523f2f2c83fa543e5a93b33a47e72db9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.193633 kubelet[2668]: E0715 23:13:16.193576 2668 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"ed3b5f9b3ee84cb5d80115e6692c8e8523f2f2c83fa543e5a93b33a47e72db9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hddsp" Jul 15 23:13:16.193633 kubelet[2668]: E0715 23:13:16.193612 2668 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"ed3b5f9b3ee84cb5d80115e6692c8e8523f2f2c83fa543e5a93b33a47e72db9c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-hddsp" Jul 15 23:13:16.193734 kubelet[2668]: E0715 23:13:16.193654 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-hddsp_calico-system(2a38c65f-24e4-465a-afbd-242e66579eef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-hddsp_calico-system(2a38c65f-24e4-465a-afbd-242e66579eef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"ed3b5f9b3ee84cb5d80115e6692c8e8523f2f2c83fa543e5a93b33a47e72db9c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-hddsp" podUID="2a38c65f-24e4-465a-afbd-242e66579eef" Jul 15 23:13:16.237696 containerd[1512]: time="2025-07-15T23:13:16.237641931Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gkk8k,Uid:67a52572-ea42-4651-8164-6aaa7db16b6f,Namespace:kube-system,Attempt:0,}" Jul 15 23:13:16.266029 containerd[1512]: time="2025-07-15T23:13:16.265925411Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Jul 15 23:13:16.312855 containerd[1512]: 
time="2025-07-15T23:13:16.312739031Z" level=error msg="Failed to destroy network for sandbox \"49abf0136356f9e1f5848e0400a81a1493e101d7d3769055df3c7e29393b6826\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.314623 containerd[1512]: time="2025-07-15T23:13:16.314512977Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gkk8k,Uid:67a52572-ea42-4651-8164-6aaa7db16b6f,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"49abf0136356f9e1f5848e0400a81a1493e101d7d3769055df3c7e29393b6826\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.315653 kubelet[2668]: E0715 23:13:16.315612 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49abf0136356f9e1f5848e0400a81a1493e101d7d3769055df3c7e29393b6826\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.315919 kubelet[2668]: E0715 23:13:16.315791 2668 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"49abf0136356f9e1f5848e0400a81a1493e101d7d3769055df3c7e29393b6826\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-gkk8k" Jul 15 23:13:16.315919 kubelet[2668]: E0715 23:13:16.315831 2668 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc 
error: code = Unknown desc = failed to setup network for sandbox \"49abf0136356f9e1f5848e0400a81a1493e101d7d3769055df3c7e29393b6826\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-7c65d6cfc9-gkk8k" Jul 15 23:13:16.316093 kubelet[2668]: E0715 23:13:16.316065 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-7c65d6cfc9-gkk8k_kube-system(67a52572-ea42-4651-8164-6aaa7db16b6f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-7c65d6cfc9-gkk8k_kube-system(67a52572-ea42-4651-8164-6aaa7db16b6f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"49abf0136356f9e1f5848e0400a81a1493e101d7d3769055df3c7e29393b6826\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-7c65d6cfc9-gkk8k" podUID="67a52572-ea42-4651-8164-6aaa7db16b6f" Jul 15 23:13:16.882081 containerd[1512]: time="2025-07-15T23:13:16.881987120Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797fd56c96-7vvww,Uid:094991fd-9793-44ba-9e9f-9e777da29e64,Namespace:calico-apiserver,Attempt:0,}" Jul 15 23:13:16.884240 containerd[1512]: time="2025-07-15T23:13:16.884199412Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f5986dd7d-tb5xs,Uid:29bdaeaf-6929-486a-a35b-f522167f96fa,Namespace:calico-apiserver,Attempt:0,}" Jul 15 23:13:16.964457 containerd[1512]: time="2025-07-15T23:13:16.964394535Z" level=error msg="Failed to destroy network for sandbox \"c949c756505846672066481e4bed2726793336ffcdf98a8c15bc22cfaa1daf6c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.968024 containerd[1512]: time="2025-07-15T23:13:16.967973227Z" level=error msg="Failed to destroy network for sandbox \"5115a2adc8c50619a7e88987a0025feac7d244e12ccb7f7d4262c8993f2305e2\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.968484 systemd[1]: run-netns-cni\x2deb18b39d\x2d7da7\x2d52e4\x2d2803\x2dfae8c5427270.mount: Deactivated successfully. Jul 15 23:13:16.971907 kubelet[2668]: E0715 23:13:16.970258 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c949c756505846672066481e4bed2726793336ffcdf98a8c15bc22cfaa1daf6c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.971907 kubelet[2668]: E0715 23:13:16.970317 2668 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c949c756505846672066481e4bed2726793336ffcdf98a8c15bc22cfaa1daf6c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f5986dd7d-tb5xs" Jul 15 23:13:16.971907 kubelet[2668]: E0715 23:13:16.970358 2668 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c949c756505846672066481e4bed2726793336ffcdf98a8c15bc22cfaa1daf6c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-f5986dd7d-tb5xs" Jul 15 23:13:16.972230 containerd[1512]: time="2025-07-15T23:13:16.969092094Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f5986dd7d-tb5xs,Uid:29bdaeaf-6929-486a-a35b-f522167f96fa,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c949c756505846672066481e4bed2726793336ffcdf98a8c15bc22cfaa1daf6c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.972302 kubelet[2668]: E0715 23:13:16.970419 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f5986dd7d-tb5xs_calico-apiserver(29bdaeaf-6929-486a-a35b-f522167f96fa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f5986dd7d-tb5xs_calico-apiserver(29bdaeaf-6929-486a-a35b-f522167f96fa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c949c756505846672066481e4bed2726793336ffcdf98a8c15bc22cfaa1daf6c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f5986dd7d-tb5xs" podUID="29bdaeaf-6929-486a-a35b-f522167f96fa" Jul 15 23:13:16.973949 systemd[1]: run-netns-cni\x2d555e640d\x2d9022\x2da145\x2d25c8\x2d668dc207255f.mount: Deactivated successfully. 
Jul 15 23:13:16.974104 containerd[1512]: time="2025-07-15T23:13:16.973817174Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797fd56c96-7vvww,Uid:094991fd-9793-44ba-9e9f-9e777da29e64,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5115a2adc8c50619a7e88987a0025feac7d244e12ccb7f7d4262c8993f2305e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.975189 kubelet[2668]: E0715 23:13:16.975147 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5115a2adc8c50619a7e88987a0025feac7d244e12ccb7f7d4262c8993f2305e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:16.975374 kubelet[2668]: E0715 23:13:16.975329 2668 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5115a2adc8c50619a7e88987a0025feac7d244e12ccb7f7d4262c8993f2305e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-797fd56c96-7vvww" Jul 15 23:13:16.975374 kubelet[2668]: E0715 23:13:16.975370 2668 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5115a2adc8c50619a7e88987a0025feac7d244e12ccb7f7d4262c8993f2305e2\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-797fd56c96-7vvww" Jul 15 23:13:16.975455 kubelet[2668]: E0715 23:13:16.975416 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-797fd56c96-7vvww_calico-apiserver(094991fd-9793-44ba-9e9f-9e777da29e64)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-797fd56c96-7vvww_calico-apiserver(094991fd-9793-44ba-9e9f-9e777da29e64)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5115a2adc8c50619a7e88987a0025feac7d244e12ccb7f7d4262c8993f2305e2\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-797fd56c96-7vvww" podUID="094991fd-9793-44ba-9e9f-9e777da29e64" Jul 15 23:13:17.162975 containerd[1512]: time="2025-07-15T23:13:17.162702193Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f5986dd7d-qm2lh,Uid:1a4cf393-394d-46d2-b039-ad46900b55f7,Namespace:calico-apiserver,Attempt:0,}" Jul 15 23:13:17.224916 containerd[1512]: time="2025-07-15T23:13:17.224741340Z" level=error msg="Failed to destroy network for sandbox \"bfa70ab3093e7c04ec5718d5354e07f7fb93941fb90d37e1e7e8968c8118c0a9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:17.229318 systemd[1]: run-netns-cni\x2dc55fa314\x2d6681\x2d05e8\x2d59b8\x2de47d39ad63e7.mount: Deactivated successfully. 
Jul 15 23:13:17.230916 containerd[1512]: time="2025-07-15T23:13:17.230320619Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f5986dd7d-qm2lh,Uid:1a4cf393-394d-46d2-b039-ad46900b55f7,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bfa70ab3093e7c04ec5718d5354e07f7fb93941fb90d37e1e7e8968c8118c0a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:17.231600 kubelet[2668]: E0715 23:13:17.231259 2668 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bfa70ab3093e7c04ec5718d5354e07f7fb93941fb90d37e1e7e8968c8118c0a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jul 15 23:13:17.231600 kubelet[2668]: E0715 23:13:17.231323 2668 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bfa70ab3093e7c04ec5718d5354e07f7fb93941fb90d37e1e7e8968c8118c0a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-f5986dd7d-qm2lh" Jul 15 23:13:17.231600 kubelet[2668]: E0715 23:13:17.231341 2668 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bfa70ab3093e7c04ec5718d5354e07f7fb93941fb90d37e1e7e8968c8118c0a9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-f5986dd7d-qm2lh" Jul 15 23:13:17.231734 kubelet[2668]: E0715 23:13:17.231377 2668 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-f5986dd7d-qm2lh_calico-apiserver(1a4cf393-394d-46d2-b039-ad46900b55f7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-f5986dd7d-qm2lh_calico-apiserver(1a4cf393-394d-46d2-b039-ad46900b55f7)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bfa70ab3093e7c04ec5718d5354e07f7fb93941fb90d37e1e7e8968c8118c0a9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-f5986dd7d-qm2lh" podUID="1a4cf393-394d-46d2-b039-ad46900b55f7" Jul 15 23:13:23.830253 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2143378977.mount: Deactivated successfully. 
Jul 15 23:13:23.862917 containerd[1512]: time="2025-07-15T23:13:23.862150593Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:23.863684 containerd[1512]: time="2025-07-15T23:13:23.863436293Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909" Jul 15 23:13:23.864901 containerd[1512]: time="2025-07-15T23:13:23.864858839Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:23.867286 containerd[1512]: time="2025-07-15T23:13:23.867229709Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:23.868771 containerd[1512]: time="2025-07-15T23:13:23.868361801Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"152544771\" in 7.602388388s" Jul 15 23:13:23.868771 containerd[1512]: time="2025-07-15T23:13:23.868402763Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\"" Jul 15 23:13:23.887516 containerd[1512]: time="2025-07-15T23:13:23.887404726Z" level=info msg="CreateContainer within sandbox \"c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jul 15 23:13:23.903331 containerd[1512]: time="2025-07-15T23:13:23.903278024Z" level=info msg="Container 
461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:23.908307 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2863830172.mount: Deactivated successfully. Jul 15 23:13:23.917809 containerd[1512]: time="2025-07-15T23:13:23.917748896Z" level=info msg="CreateContainer within sandbox \"c40679e57c489efee851cba669f4558235070cc5fdde8c59c829f65bb8037b5a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\"" Jul 15 23:13:23.918541 containerd[1512]: time="2025-07-15T23:13:23.918490730Z" level=info msg="StartContainer for \"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\"" Jul 15 23:13:23.922229 containerd[1512]: time="2025-07-15T23:13:23.922164901Z" level=info msg="connecting to shim 461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de" address="unix:///run/containerd/s/ea767c74e6e1cbe0b09e85d791c110057ca3d59eadcba66ba79f15bde271e199" protocol=ttrpc version=3 Jul 15 23:13:23.949031 systemd[1]: Started cri-containerd-461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de.scope - libcontainer container 461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de. Jul 15 23:13:24.015498 containerd[1512]: time="2025-07-15T23:13:24.015396214Z" level=info msg="StartContainer for \"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\" returns successfully" Jul 15 23:13:24.164922 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jul 15 23:13:24.165043 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
Jul 15 23:13:24.348197 kubelet[2668]: I0715 23:13:24.347642 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-sskmx" podStartSLOduration=2.092111974 podStartE2EDuration="19.347622619s" podCreationTimestamp="2025-07-15 23:13:05 +0000 UTC" firstStartedPulling="2025-07-15 23:13:06.613954328 +0000 UTC m=+23.648494238" lastFinishedPulling="2025-07-15 23:13:23.869464973 +0000 UTC m=+40.904004883" observedRunningTime="2025-07-15 23:13:24.343597717 +0000 UTC m=+41.378137627" watchObservedRunningTime="2025-07-15 23:13:24.347622619 +0000 UTC m=+41.382162529" Jul 15 23:13:24.462171 kubelet[2668]: I0715 23:13:24.461631 2668 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec396930-4143-4faa-8947-7b5623e308ef-whisker-ca-bundle\") pod \"ec396930-4143-4faa-8947-7b5623e308ef\" (UID: \"ec396930-4143-4faa-8947-7b5623e308ef\") " Jul 15 23:13:24.462171 kubelet[2668]: I0715 23:13:24.461688 2668 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxrrt\" (UniqueName: \"kubernetes.io/projected/ec396930-4143-4faa-8947-7b5623e308ef-kube-api-access-dxrrt\") pod \"ec396930-4143-4faa-8947-7b5623e308ef\" (UID: \"ec396930-4143-4faa-8947-7b5623e308ef\") " Jul 15 23:13:24.462171 kubelet[2668]: I0715 23:13:24.461707 2668 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ec396930-4143-4faa-8947-7b5623e308ef-whisker-backend-key-pair\") pod \"ec396930-4143-4faa-8947-7b5623e308ef\" (UID: \"ec396930-4143-4faa-8947-7b5623e308ef\") " Jul 15 23:13:24.467637 kubelet[2668]: I0715 23:13:24.467237 2668 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ec396930-4143-4faa-8947-7b5623e308ef-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod 
"ec396930-4143-4faa-8947-7b5623e308ef" (UID: "ec396930-4143-4faa-8947-7b5623e308ef"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jul 15 23:13:24.470391 kubelet[2668]: I0715 23:13:24.470336 2668 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec396930-4143-4faa-8947-7b5623e308ef-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "ec396930-4143-4faa-8947-7b5623e308ef" (UID: "ec396930-4143-4faa-8947-7b5623e308ef"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 23:13:24.470690 kubelet[2668]: I0715 23:13:24.470643 2668 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec396930-4143-4faa-8947-7b5623e308ef-kube-api-access-dxrrt" (OuterVolumeSpecName: "kube-api-access-dxrrt") pod "ec396930-4143-4faa-8947-7b5623e308ef" (UID: "ec396930-4143-4faa-8947-7b5623e308ef"). InnerVolumeSpecName "kube-api-access-dxrrt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 23:13:24.514561 containerd[1512]: time="2025-07-15T23:13:24.514513536Z" level=info msg="TaskExit event in podsandbox handler container_id:\"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\" id:\"e1684ba0a6054db6c95fa2a4de406c371e0493017f85d85aa23f9e9222cefb82\" pid:3740 exit_status:1 exited_at:{seconds:1752621204 nanos:514217363}" Jul 15 23:13:24.562415 kubelet[2668]: I0715 23:13:24.562369 2668 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxrrt\" (UniqueName: \"kubernetes.io/projected/ec396930-4143-4faa-8947-7b5623e308ef-kube-api-access-dxrrt\") on node \"ci-4372-0-1-n-91aeaf5bee\" DevicePath \"\"" Jul 15 23:13:24.562415 kubelet[2668]: I0715 23:13:24.562407 2668 reconciler_common.go:293] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/ec396930-4143-4faa-8947-7b5623e308ef-whisker-backend-key-pair\") on node \"ci-4372-0-1-n-91aeaf5bee\" DevicePath \"\"" Jul 15 23:13:24.562415 kubelet[2668]: I0715 23:13:24.562419 2668 reconciler_common.go:293] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec396930-4143-4faa-8947-7b5623e308ef-whisker-ca-bundle\") on node \"ci-4372-0-1-n-91aeaf5bee\" DevicePath \"\"" Jul 15 23:13:24.831379 systemd[1]: var-lib-kubelet-pods-ec396930\x2d4143\x2d4faa\x2d8947\x2d7b5623e308ef-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2ddxrrt.mount: Deactivated successfully. Jul 15 23:13:24.831507 systemd[1]: var-lib-kubelet-pods-ec396930\x2d4143\x2d4faa\x2d8947\x2d7b5623e308ef-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Jul 15 23:13:25.103042 systemd[1]: Removed slice kubepods-besteffort-podec396930_4143_4faa_8947_7b5623e308ef.slice - libcontainer container kubepods-besteffort-podec396930_4143_4faa_8947_7b5623e308ef.slice. 
Jul 15 23:13:25.406329 systemd[1]: Created slice kubepods-besteffort-pod51bbcd8b_b393_46be_bc79_166623eb6144.slice - libcontainer container kubepods-besteffort-pod51bbcd8b_b393_46be_bc79_166623eb6144.slice. Jul 15 23:13:25.456861 containerd[1512]: time="2025-07-15T23:13:25.456770616Z" level=info msg="TaskExit event in podsandbox handler container_id:\"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\" id:\"1a0fae0cf0e23f8884ff5f74bb3f5a3e31317c76293d1ec1aba13fc54f05bec7\" pid:3782 exit_status:1 exited_at:{seconds:1752621205 nanos:456474563}" Jul 15 23:13:25.571434 kubelet[2668]: I0715 23:13:25.571256 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh9kt\" (UniqueName: \"kubernetes.io/projected/51bbcd8b-b393-46be-bc79-166623eb6144-kube-api-access-xh9kt\") pod \"whisker-659466c688-flsdz\" (UID: \"51bbcd8b-b393-46be-bc79-166623eb6144\") " pod="calico-system/whisker-659466c688-flsdz" Jul 15 23:13:25.571434 kubelet[2668]: I0715 23:13:25.571349 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/51bbcd8b-b393-46be-bc79-166623eb6144-whisker-backend-key-pair\") pod \"whisker-659466c688-flsdz\" (UID: \"51bbcd8b-b393-46be-bc79-166623eb6144\") " pod="calico-system/whisker-659466c688-flsdz" Jul 15 23:13:25.571434 kubelet[2668]: I0715 23:13:25.571392 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/51bbcd8b-b393-46be-bc79-166623eb6144-whisker-ca-bundle\") pod \"whisker-659466c688-flsdz\" (UID: \"51bbcd8b-b393-46be-bc79-166623eb6144\") " pod="calico-system/whisker-659466c688-flsdz" Jul 15 23:13:25.711972 containerd[1512]: time="2025-07-15T23:13:25.711241741Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-659466c688-flsdz,Uid:51bbcd8b-b393-46be-bc79-166623eb6144,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:25.998953 systemd-networkd[1424]: calif33d33854f6: Link UP Jul 15 23:13:26.006297 systemd-networkd[1424]: calif33d33854f6: Gained carrier Jul 15 23:13:26.036063 containerd[1512]: 2025-07-15 23:13:25.757 [INFO][3805] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jul 15 23:13:26.036063 containerd[1512]: 2025-07-15 23:13:25.811 [INFO][3805] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0 whisker-659466c688- calico-system 51bbcd8b-b393-46be-bc79-166623eb6144 932 0 2025-07-15 23:13:25 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:659466c688 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee whisker-659466c688-flsdz eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calif33d33854f6 [] [] }} ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Namespace="calico-system" Pod="whisker-659466c688-flsdz" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-" Jul 15 23:13:26.036063 containerd[1512]: 2025-07-15 23:13:25.811 [INFO][3805] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Namespace="calico-system" Pod="whisker-659466c688-flsdz" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" Jul 15 23:13:26.036063 containerd[1512]: 2025-07-15 23:13:25.876 [INFO][3891] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" 
HandleID="k8s-pod-network.9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" Jul 15 23:13:26.036392 containerd[1512]: 2025-07-15 23:13:25.876 [INFO][3891] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" HandleID="k8s-pod-network.9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400030c2a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"whisker-659466c688-flsdz", "timestamp":"2025-07-15 23:13:25.876273356 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:26.036392 containerd[1512]: 2025-07-15 23:13:25.876 [INFO][3891] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:26.036392 containerd[1512]: 2025-07-15 23:13:25.876 [INFO][3891] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:13:26.036392 containerd[1512]: 2025-07-15 23:13:25.876 [INFO][3891] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:26.036392 containerd[1512]: 2025-07-15 23:13:25.903 [INFO][3891] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:26.036392 containerd[1512]: 2025-07-15 23:13:25.914 [INFO][3891] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:26.036392 containerd[1512]: 2025-07-15 23:13:25.928 [INFO][3891] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:26.036392 containerd[1512]: 2025-07-15 23:13:25.931 [INFO][3891] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:26.036392 containerd[1512]: 2025-07-15 23:13:25.938 [INFO][3891] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:26.036588 containerd[1512]: 2025-07-15 23:13:25.938 [INFO][3891] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:26.036588 containerd[1512]: 2025-07-15 23:13:25.940 [INFO][3891] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9 Jul 15 23:13:26.036588 containerd[1512]: 2025-07-15 23:13:25.948 [INFO][3891] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:26.036588 containerd[1512]: 2025-07-15 23:13:25.969 [INFO][3891] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.56.65/26] block=192.168.56.64/26 handle="k8s-pod-network.9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:26.036588 containerd[1512]: 2025-07-15 23:13:25.969 [INFO][3891] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.65/26] handle="k8s-pod-network.9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:26.036588 containerd[1512]: 2025-07-15 23:13:25.969 [INFO][3891] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:26.036588 containerd[1512]: 2025-07-15 23:13:25.969 [INFO][3891] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.65/26] IPv6=[] ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" HandleID="k8s-pod-network.9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" Jul 15 23:13:26.036717 containerd[1512]: 2025-07-15 23:13:25.974 [INFO][3805] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Namespace="calico-system" Pod="whisker-659466c688-flsdz" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0", GenerateName:"whisker-659466c688-", Namespace:"calico-system", SelfLink:"", UID:"51bbcd8b-b393-46be-bc79-166623eb6144", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"659466c688", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"whisker-659466c688-flsdz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.56.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif33d33854f6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:26.036717 containerd[1512]: 2025-07-15 23:13:25.974 [INFO][3805] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.65/32] ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Namespace="calico-system" Pod="whisker-659466c688-flsdz" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" Jul 15 23:13:26.036784 containerd[1512]: 2025-07-15 23:13:25.974 [INFO][3805] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif33d33854f6 ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Namespace="calico-system" Pod="whisker-659466c688-flsdz" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" Jul 15 23:13:26.036784 containerd[1512]: 2025-07-15 23:13:26.009 [INFO][3805] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Namespace="calico-system" Pod="whisker-659466c688-flsdz" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" Jul 15 23:13:26.036825 containerd[1512]: 2025-07-15 23:13:26.013 [INFO][3805] cni-plugin/k8s.go 
446: Added Mac, interface name, and active container ID to endpoint ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Namespace="calico-system" Pod="whisker-659466c688-flsdz" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0", GenerateName:"whisker-659466c688-", Namespace:"calico-system", SelfLink:"", UID:"51bbcd8b-b393-46be-bc79-166623eb6144", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 25, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"659466c688", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9", Pod:"whisker-659466c688-flsdz", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.56.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calif33d33854f6", MAC:"aa:e6:f4:a4:49:19", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:26.036983 containerd[1512]: 2025-07-15 23:13:26.030 [INFO][3805] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" Namespace="calico-system" Pod="whisker-659466c688-flsdz" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-whisker--659466c688--flsdz-eth0" Jul 15 23:13:26.068869 containerd[1512]: time="2025-07-15T23:13:26.068498555Z" level=info msg="connecting to shim 9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9" address="unix:///run/containerd/s/9e3ea51e2d7a1dfa86554e4163b65de38d33f6747ac630a9bbd610d397d97251" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:26.110086 systemd[1]: Started cri-containerd-9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9.scope - libcontainer container 9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9. Jul 15 23:13:26.178375 containerd[1512]: time="2025-07-15T23:13:26.177989326Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-659466c688-flsdz,Uid:51bbcd8b-b393-46be-bc79-166623eb6144,Namespace:calico-system,Attempt:0,} returns sandbox id \"9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9\"" Jul 15 23:13:26.186163 containerd[1512]: time="2025-07-15T23:13:26.185827739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Jul 15 23:13:26.529131 systemd-networkd[1424]: vxlan.calico: Link UP Jul 15 23:13:26.529182 systemd-networkd[1424]: vxlan.calico: Gained carrier Jul 15 23:13:27.094367 kubelet[2668]: I0715 23:13:27.094074 2668 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec396930-4143-4faa-8947-7b5623e308ef" path="/var/lib/kubelet/pods/ec396930-4143-4faa-8947-7b5623e308ef/volumes" Jul 15 23:13:27.533566 systemd-networkd[1424]: calif33d33854f6: Gained IPv6LL Jul 15 23:13:27.986198 containerd[1512]: time="2025-07-15T23:13:27.985051412Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:27.986975 containerd[1512]: 
time="2025-07-15T23:13:27.986939930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Jul 15 23:13:27.988262 containerd[1512]: time="2025-07-15T23:13:27.988232663Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:27.991490 containerd[1512]: time="2025-07-15T23:13:27.991455236Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:27.992100 containerd[1512]: time="2025-07-15T23:13:27.992062381Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 1.80617884s" Jul 15 23:13:27.992100 containerd[1512]: time="2025-07-15T23:13:27.992098703Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Jul 15 23:13:27.995560 containerd[1512]: time="2025-07-15T23:13:27.995528165Z" level=info msg="CreateContainer within sandbox \"9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Jul 15 23:13:28.004872 containerd[1512]: time="2025-07-15T23:13:28.003102915Z" level=info msg="Container 28011137eb0919e85e62d5f878b1723b2276ddee96ef5735ef8f755522892109: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:28.023206 containerd[1512]: time="2025-07-15T23:13:28.023131881Z" level=info msg="CreateContainer within sandbox 
\"9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"28011137eb0919e85e62d5f878b1723b2276ddee96ef5735ef8f755522892109\"" Jul 15 23:13:28.025431 containerd[1512]: time="2025-07-15T23:13:28.025239925Z" level=info msg="StartContainer for \"28011137eb0919e85e62d5f878b1723b2276ddee96ef5735ef8f755522892109\"" Jul 15 23:13:28.027570 containerd[1512]: time="2025-07-15T23:13:28.027224925Z" level=info msg="connecting to shim 28011137eb0919e85e62d5f878b1723b2276ddee96ef5735ef8f755522892109" address="unix:///run/containerd/s/9e3ea51e2d7a1dfa86554e4163b65de38d33f6747ac630a9bbd610d397d97251" protocol=ttrpc version=3 Jul 15 23:13:28.056217 systemd[1]: Started cri-containerd-28011137eb0919e85e62d5f878b1723b2276ddee96ef5735ef8f755522892109.scope - libcontainer container 28011137eb0919e85e62d5f878b1723b2276ddee96ef5735ef8f755522892109. Jul 15 23:13:28.092071 containerd[1512]: time="2025-07-15T23:13:28.091366865Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8bf995db6-rbt25,Uid:36e81425-6b32-4185-bd01-a2d30ff92c2c,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:28.092617 containerd[1512]: time="2025-07-15T23:13:28.092535032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797fd56c96-7vvww,Uid:094991fd-9793-44ba-9e9f-9e777da29e64,Namespace:calico-apiserver,Attempt:0,}" Jul 15 23:13:28.092922 containerd[1512]: time="2025-07-15T23:13:28.092883566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-mz25w,Uid:673144b1-613a-4f74-af82-9d2cb7db4571,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:28.093159 containerd[1512]: time="2025-07-15T23:13:28.093129655Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f5986dd7d-qm2lh,Uid:1a4cf393-394d-46d2-b039-ad46900b55f7,Namespace:calico-apiserver,Attempt:0,}" Jul 15 23:13:28.154421 containerd[1512]: time="2025-07-15T23:13:28.153454121Z" 
level=info msg="StartContainer for \"28011137eb0919e85e62d5f878b1723b2276ddee96ef5735ef8f755522892109\" returns successfully" Jul 15 23:13:28.162066 containerd[1512]: time="2025-07-15T23:13:28.162026866Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Jul 15 23:13:28.174530 systemd-networkd[1424]: vxlan.calico: Gained IPv6LL Jul 15 23:13:28.357457 systemd-networkd[1424]: cali806be1891fc: Link UP Jul 15 23:13:28.357952 systemd-networkd[1424]: cali806be1891fc: Gained carrier Jul 15 23:13:28.376857 containerd[1512]: 2025-07-15 23:13:28.224 [INFO][4124] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0 calico-apiserver-f5986dd7d- calico-apiserver 1a4cf393-394d-46d2-b039-ad46900b55f7 850 0 2025-07-15 23:12:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f5986dd7d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee calico-apiserver-f5986dd7d-qm2lh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali806be1891fc [] [] }} ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-qm2lh" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-" Jul 15 23:13:28.376857 containerd[1512]: 2025-07-15 23:13:28.224 [INFO][4124] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-qm2lh" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:28.376857 containerd[1512]: 2025-07-15 23:13:28.286 
[INFO][4157] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:28.377234 containerd[1512]: 2025-07-15 23:13:28.286 [INFO][4157] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3660), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"calico-apiserver-f5986dd7d-qm2lh", "timestamp":"2025-07-15 23:13:28.286078135 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:28.377234 containerd[1512]: 2025-07-15 23:13:28.286 [INFO][4157] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:28.377234 containerd[1512]: 2025-07-15 23:13:28.287 [INFO][4157] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:13:28.377234 containerd[1512]: 2025-07-15 23:13:28.287 [INFO][4157] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:28.377234 containerd[1512]: 2025-07-15 23:13:28.302 [INFO][4157] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.377234 containerd[1512]: 2025-07-15 23:13:28.315 [INFO][4157] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.377234 containerd[1512]: 2025-07-15 23:13:28.320 [INFO][4157] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.377234 containerd[1512]: 2025-07-15 23:13:28.323 [INFO][4157] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.377234 containerd[1512]: 2025-07-15 23:13:28.326 [INFO][4157] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.377456 containerd[1512]: 2025-07-15 23:13:28.326 [INFO][4157] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.377456 containerd[1512]: 2025-07-15 23:13:28.328 [INFO][4157] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d Jul 15 23:13:28.377456 containerd[1512]: 2025-07-15 23:13:28.336 [INFO][4157] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.377456 containerd[1512]: 2025-07-15 23:13:28.344 [INFO][4157] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.56.66/26] block=192.168.56.64/26 handle="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.377456 containerd[1512]: 2025-07-15 23:13:28.344 [INFO][4157] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.66/26] handle="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.377456 containerd[1512]: 2025-07-15 23:13:28.345 [INFO][4157] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:28.377456 containerd[1512]: 2025-07-15 23:13:28.345 [INFO][4157] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.66/26] IPv6=[] ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:28.377593 containerd[1512]: 2025-07-15 23:13:28.351 [INFO][4124] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-qm2lh" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0", GenerateName:"calico-apiserver-f5986dd7d-", Namespace:"calico-apiserver", SelfLink:"", UID:"1a4cf393-394d-46d2-b039-ad46900b55f7", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 12, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"f5986dd7d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"calico-apiserver-f5986dd7d-qm2lh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali806be1891fc", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:28.377643 containerd[1512]: 2025-07-15 23:13:28.353 [INFO][4124] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.66/32] ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-qm2lh" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:28.377643 containerd[1512]: 2025-07-15 23:13:28.353 [INFO][4124] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali806be1891fc ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-qm2lh" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:28.377643 containerd[1512]: 2025-07-15 23:13:28.357 [INFO][4124] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-qm2lh" 
WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:28.378164 containerd[1512]: 2025-07-15 23:13:28.358 [INFO][4124] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-qm2lh" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0", GenerateName:"calico-apiserver-f5986dd7d-", Namespace:"calico-apiserver", SelfLink:"", UID:"1a4cf393-394d-46d2-b039-ad46900b55f7", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 12, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f5986dd7d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d", Pod:"calico-apiserver-f5986dd7d-qm2lh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali806be1891fc", MAC:"16:31:2f:d5:b7:b2", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:28.378543 containerd[1512]: 2025-07-15 23:13:28.374 [INFO][4124] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-qm2lh" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:28.407778 containerd[1512]: time="2025-07-15T23:13:28.406393573Z" level=info msg="connecting to shim 4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" address="unix:///run/containerd/s/3733114f11b59697a1b675a2ce03183ef7f00b0a3fd6ecd87661876c0ff945a7" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:28.437561 systemd[1]: Started cri-containerd-4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d.scope - libcontainer container 4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d. 
Jul 15 23:13:28.467455 systemd-networkd[1424]: cali356977b971a: Link UP Jul 15 23:13:28.468269 systemd-networkd[1424]: cali356977b971a: Gained carrier Jul 15 23:13:28.494634 containerd[1512]: 2025-07-15 23:13:28.231 [INFO][4133] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0 goldmane-58fd7646b9- calico-system 673144b1-613a-4f74-af82-9d2cb7db4571 846 0 2025-07-15 23:13:06 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:58fd7646b9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee goldmane-58fd7646b9-mz25w eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali356977b971a [] [] }} ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Namespace="calico-system" Pod="goldmane-58fd7646b9-mz25w" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-" Jul 15 23:13:28.494634 containerd[1512]: 2025-07-15 23:13:28.231 [INFO][4133] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Namespace="calico-system" Pod="goldmane-58fd7646b9-mz25w" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" Jul 15 23:13:28.494634 containerd[1512]: 2025-07-15 23:13:28.296 [INFO][4159] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" HandleID="k8s-pod-network.2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" Jul 15 23:13:28.495308 containerd[1512]: 2025-07-15 23:13:28.297 [INFO][4159] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" HandleID="k8s-pod-network.2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d37d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"goldmane-58fd7646b9-mz25w", "timestamp":"2025-07-15 23:13:28.296087017 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:28.495308 containerd[1512]: 2025-07-15 23:13:28.297 [INFO][4159] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:28.495308 containerd[1512]: 2025-07-15 23:13:28.344 [INFO][4159] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:13:28.495308 containerd[1512]: 2025-07-15 23:13:28.345 [INFO][4159] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:28.495308 containerd[1512]: 2025-07-15 23:13:28.404 [INFO][4159] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.495308 containerd[1512]: 2025-07-15 23:13:28.414 [INFO][4159] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.495308 containerd[1512]: 2025-07-15 23:13:28.424 [INFO][4159] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.495308 containerd[1512]: 2025-07-15 23:13:28.428 [INFO][4159] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.495308 containerd[1512]: 2025-07-15 23:13:28.435 [INFO][4159] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.495500 containerd[1512]: 2025-07-15 23:13:28.436 [INFO][4159] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.495500 containerd[1512]: 2025-07-15 23:13:28.439 [INFO][4159] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5 Jul 15 23:13:28.495500 containerd[1512]: 2025-07-15 23:13:28.449 [INFO][4159] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.495500 containerd[1512]: 2025-07-15 23:13:28.458 [INFO][4159] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.56.67/26] block=192.168.56.64/26 handle="k8s-pod-network.2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.495500 containerd[1512]: 2025-07-15 23:13:28.459 [INFO][4159] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.67/26] handle="k8s-pod-network.2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.495500 containerd[1512]: 2025-07-15 23:13:28.459 [INFO][4159] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:28.495500 containerd[1512]: 2025-07-15 23:13:28.459 [INFO][4159] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.67/26] IPv6=[] ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" HandleID="k8s-pod-network.2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" Jul 15 23:13:28.495630 containerd[1512]: 2025-07-15 23:13:28.463 [INFO][4133] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Namespace="calico-system" Pod="goldmane-58fd7646b9-mz25w" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"673144b1-613a-4f74-af82-9d2cb7db4571", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"goldmane-58fd7646b9-mz25w", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.56.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali356977b971a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:28.495677 containerd[1512]: 2025-07-15 23:13:28.463 [INFO][4133] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.67/32] ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Namespace="calico-system" Pod="goldmane-58fd7646b9-mz25w" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" Jul 15 23:13:28.495677 containerd[1512]: 2025-07-15 23:13:28.463 [INFO][4133] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali356977b971a ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Namespace="calico-system" Pod="goldmane-58fd7646b9-mz25w" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" Jul 15 23:13:28.495677 containerd[1512]: 2025-07-15 23:13:28.470 [INFO][4133] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Namespace="calico-system" Pod="goldmane-58fd7646b9-mz25w" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" Jul 15 23:13:28.495736 containerd[1512]: 2025-07-15 23:13:28.474 [INFO][4133] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Namespace="calico-system" Pod="goldmane-58fd7646b9-mz25w" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0", GenerateName:"goldmane-58fd7646b9-", Namespace:"calico-system", SelfLink:"", UID:"673144b1-613a-4f74-af82-9d2cb7db4571", ResourceVersion:"846", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"58fd7646b9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5", Pod:"goldmane-58fd7646b9-mz25w", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.56.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali356977b971a", MAC:"a2:0e:c6:27:5d:4a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:28.495782 containerd[1512]: 2025-07-15 23:13:28.489 [INFO][4133] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" Namespace="calico-system" Pod="goldmane-58fd7646b9-mz25w" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-goldmane--58fd7646b9--mz25w-eth0" Jul 15 23:13:28.530757 containerd[1512]: time="2025-07-15T23:13:28.530373999Z" level=info msg="connecting to shim 2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5" address="unix:///run/containerd/s/5b183cdfcca0527d59d86cd09d4738be96fbe28bd7c58f7a9552cc45114b4ba7" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:28.558876 containerd[1512]: time="2025-07-15T23:13:28.558539452Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f5986dd7d-qm2lh,Uid:1a4cf393-394d-46d2-b039-ad46900b55f7,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\"" Jul 15 23:13:28.564036 systemd[1]: Started cri-containerd-2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5.scope - libcontainer container 2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5. 
Jul 15 23:13:28.594083 systemd-networkd[1424]: cali43c5979d42e: Link UP Jul 15 23:13:28.601018 systemd-networkd[1424]: cali43c5979d42e: Gained carrier Jul 15 23:13:28.634286 containerd[1512]: 2025-07-15 23:13:28.235 [INFO][4113] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0 calico-apiserver-797fd56c96- calico-apiserver 094991fd-9793-44ba-9e9f-9e777da29e64 853 0 2025-07-15 23:13:02 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:797fd56c96 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee calico-apiserver-797fd56c96-7vvww eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali43c5979d42e [] [] }} ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-7vvww" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-" Jul 15 23:13:28.634286 containerd[1512]: 2025-07-15 23:13:28.237 [INFO][4113] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-7vvww" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" Jul 15 23:13:28.634286 containerd[1512]: 2025-07-15 23:13:28.314 [INFO][4168] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" HandleID="k8s-pod-network.1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" Jul 15 
23:13:28.634825 containerd[1512]: 2025-07-15 23:13:28.314 [INFO][4168] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" HandleID="k8s-pod-network.1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"calico-apiserver-797fd56c96-7vvww", "timestamp":"2025-07-15 23:13:28.314644684 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:28.634825 containerd[1512]: 2025-07-15 23:13:28.314 [INFO][4168] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:28.634825 containerd[1512]: 2025-07-15 23:13:28.459 [INFO][4168] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:13:28.634825 containerd[1512]: 2025-07-15 23:13:28.459 [INFO][4168] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:28.634825 containerd[1512]: 2025-07-15 23:13:28.504 [INFO][4168] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.634825 containerd[1512]: 2025-07-15 23:13:28.520 [INFO][4168] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.634825 containerd[1512]: 2025-07-15 23:13:28.532 [INFO][4168] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.634825 containerd[1512]: 2025-07-15 23:13:28.537 [INFO][4168] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.634825 containerd[1512]: 2025-07-15 23:13:28.549 [INFO][4168] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.635478 containerd[1512]: 2025-07-15 23:13:28.549 [INFO][4168] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.635478 containerd[1512]: 2025-07-15 23:13:28.555 [INFO][4168] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578 Jul 15 23:13:28.635478 containerd[1512]: 2025-07-15 23:13:28.568 [INFO][4168] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.635478 containerd[1512]: 2025-07-15 23:13:28.578 [INFO][4168] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.56.68/26] block=192.168.56.64/26 handle="k8s-pod-network.1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.635478 containerd[1512]: 2025-07-15 23:13:28.578 [INFO][4168] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.68/26] handle="k8s-pod-network.1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.635478 containerd[1512]: 2025-07-15 23:13:28.579 [INFO][4168] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:28.635478 containerd[1512]: 2025-07-15 23:13:28.579 [INFO][4168] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.68/26] IPv6=[] ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" HandleID="k8s-pod-network.1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" Jul 15 23:13:28.635626 containerd[1512]: 2025-07-15 23:13:28.583 [INFO][4113] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-7vvww" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0", GenerateName:"calico-apiserver-797fd56c96-", Namespace:"calico-apiserver", SelfLink:"", UID:"094991fd-9793-44ba-9e9f-9e777da29e64", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"797fd56c96", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"calico-apiserver-797fd56c96-7vvww", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali43c5979d42e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:28.635686 containerd[1512]: 2025-07-15 23:13:28.583 [INFO][4113] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.68/32] ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-7vvww" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" Jul 15 23:13:28.635686 containerd[1512]: 2025-07-15 23:13:28.583 [INFO][4113] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali43c5979d42e ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-7vvww" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" Jul 15 23:13:28.635686 containerd[1512]: 2025-07-15 23:13:28.599 [INFO][4113] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Namespace="calico-apiserver" 
Pod="calico-apiserver-797fd56c96-7vvww" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" Jul 15 23:13:28.635744 containerd[1512]: 2025-07-15 23:13:28.607 [INFO][4113] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-7vvww" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0", GenerateName:"calico-apiserver-797fd56c96-", Namespace:"calico-apiserver", SelfLink:"", UID:"094991fd-9793-44ba-9e9f-9e777da29e64", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 2, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"797fd56c96", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578", Pod:"calico-apiserver-797fd56c96-7vvww", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali43c5979d42e", MAC:"e2:ee:59:19:70:33", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:28.635787 containerd[1512]: 2025-07-15 23:13:28.627 [INFO][4113] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-7vvww" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--7vvww-eth0" Jul 15 23:13:28.680229 containerd[1512]: time="2025-07-15T23:13:28.680146382Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-58fd7646b9-mz25w,Uid:673144b1-613a-4f74-af82-9d2cb7db4571,Namespace:calico-system,Attempt:0,} returns sandbox id \"2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5\"" Jul 15 23:13:28.685696 containerd[1512]: time="2025-07-15T23:13:28.685630843Z" level=info msg="connecting to shim 1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578" address="unix:///run/containerd/s/4a500903ec49dd2f9df1147e648a3cdb7ccbf86cbb412faf900867d826a37734" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:28.705308 systemd-networkd[1424]: cali9715e5d8792: Link UP Jul 15 23:13:28.706113 systemd-networkd[1424]: cali9715e5d8792: Gained carrier Jul 15 23:13:28.728137 systemd[1]: Started cri-containerd-1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578.scope - libcontainer container 1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578. 
Jul 15 23:13:28.734131 containerd[1512]: 2025-07-15 23:13:28.258 [INFO][4100] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0 calico-kube-controllers-8bf995db6- calico-system 36e81425-6b32-4185-bd01-a2d30ff92c2c 854 0 2025-07-15 23:13:06 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:8bf995db6 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee calico-kube-controllers-8bf995db6-rbt25 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali9715e5d8792 [] [] }} ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Namespace="calico-system" Pod="calico-kube-controllers-8bf995db6-rbt25" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-" Jul 15 23:13:28.734131 containerd[1512]: 2025-07-15 23:13:28.259 [INFO][4100] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Namespace="calico-system" Pod="calico-kube-controllers-8bf995db6-rbt25" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" Jul 15 23:13:28.734131 containerd[1512]: 2025-07-15 23:13:28.313 [INFO][4174] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" HandleID="k8s-pod-network.e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" Jul 15 23:13:28.734390 containerd[1512]: 2025-07-15 23:13:28.317 [INFO][4174] ipam/ipam_plugin.go 265: Auto 
assigning IP ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" HandleID="k8s-pod-network.e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3a20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"calico-kube-controllers-8bf995db6-rbt25", "timestamp":"2025-07-15 23:13:28.31379969 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:28.734390 containerd[1512]: 2025-07-15 23:13:28.317 [INFO][4174] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:28.734390 containerd[1512]: 2025-07-15 23:13:28.579 [INFO][4174] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:13:28.734390 containerd[1512]: 2025-07-15 23:13:28.579 [INFO][4174] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:28.734390 containerd[1512]: 2025-07-15 23:13:28.609 [INFO][4174] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.734390 containerd[1512]: 2025-07-15 23:13:28.618 [INFO][4174] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.734390 containerd[1512]: 2025-07-15 23:13:28.638 [INFO][4174] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.734390 containerd[1512]: 2025-07-15 23:13:28.643 [INFO][4174] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.734390 containerd[1512]: 2025-07-15 23:13:28.653 [INFO][4174] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.734577 containerd[1512]: 2025-07-15 23:13:28.655 [INFO][4174] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.734577 containerd[1512]: 2025-07-15 23:13:28.659 [INFO][4174] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88 Jul 15 23:13:28.734577 containerd[1512]: 2025-07-15 23:13:28.669 [INFO][4174] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.734577 containerd[1512]: 2025-07-15 23:13:28.688 [INFO][4174] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.56.69/26] block=192.168.56.64/26 handle="k8s-pod-network.e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.734577 containerd[1512]: 2025-07-15 23:13:28.689 [INFO][4174] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.69/26] handle="k8s-pod-network.e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:28.734577 containerd[1512]: 2025-07-15 23:13:28.689 [INFO][4174] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:28.734577 containerd[1512]: 2025-07-15 23:13:28.689 [INFO][4174] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.69/26] IPv6=[] ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" HandleID="k8s-pod-network.e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" Jul 15 23:13:28.734709 containerd[1512]: 2025-07-15 23:13:28.696 [INFO][4100] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Namespace="calico-system" Pod="calico-kube-controllers-8bf995db6-rbt25" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0", GenerateName:"calico-kube-controllers-8bf995db6-", Namespace:"calico-system", SelfLink:"", UID:"36e81425-6b32-4185-bd01-a2d30ff92c2c", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8bf995db6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"calico-kube-controllers-8bf995db6-rbt25", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.56.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9715e5d8792", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:28.734756 containerd[1512]: 2025-07-15 23:13:28.697 [INFO][4100] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.69/32] ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Namespace="calico-system" Pod="calico-kube-controllers-8bf995db6-rbt25" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" Jul 15 23:13:28.734756 containerd[1512]: 2025-07-15 23:13:28.697 [INFO][4100] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9715e5d8792 ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Namespace="calico-system" Pod="calico-kube-controllers-8bf995db6-rbt25" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" Jul 15 23:13:28.734756 containerd[1512]: 2025-07-15 23:13:28.705 [INFO][4100] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Namespace="calico-system" Pod="calico-kube-controllers-8bf995db6-rbt25" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" Jul 15 23:13:28.734819 containerd[1512]: 2025-07-15 23:13:28.707 [INFO][4100] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Namespace="calico-system" Pod="calico-kube-controllers-8bf995db6-rbt25" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0", GenerateName:"calico-kube-controllers-8bf995db6-", Namespace:"calico-system", SelfLink:"", UID:"36e81425-6b32-4185-bd01-a2d30ff92c2c", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"8bf995db6", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88", Pod:"calico-kube-controllers-8bf995db6-rbt25", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.56.69/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali9715e5d8792", MAC:"e6:09:25:33:c7:90", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:28.735587 containerd[1512]: 2025-07-15 23:13:28.729 [INFO][4100] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" Namespace="calico-system" Pod="calico-kube-controllers-8bf995db6-rbt25" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--kube--controllers--8bf995db6--rbt25-eth0" Jul 15 23:13:28.763884 containerd[1512]: time="2025-07-15T23:13:28.763794266Z" level=info msg="connecting to shim e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88" address="unix:///run/containerd/s/2b772c7d26bf8649f62dcc65fc6d98f51c3d0f62cc020ab85b16fa352afcc884" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:28.800077 systemd[1]: Started cri-containerd-e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88.scope - libcontainer container e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88. 
Jul 15 23:13:28.807855 containerd[1512]: time="2025-07-15T23:13:28.807802836Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797fd56c96-7vvww,Uid:094991fd-9793-44ba-9e9f-9e777da29e64,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578\"" Jul 15 23:13:28.858889 containerd[1512]: time="2025-07-15T23:13:28.858800207Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-8bf995db6-rbt25,Uid:36e81425-6b32-4185-bd01-a2d30ff92c2c,Namespace:calico-system,Attempt:0,} returns sandbox id \"e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88\"" Jul 15 23:13:29.092742 containerd[1512]: time="2025-07-15T23:13:29.092162058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gkk8k,Uid:67a52572-ea42-4651-8164-6aaa7db16b6f,Namespace:kube-system,Attempt:0,}" Jul 15 23:13:29.103865 containerd[1512]: time="2025-07-15T23:13:29.102346737Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f5986dd7d-tb5xs,Uid:29bdaeaf-6929-486a-a35b-f522167f96fa,Namespace:calico-apiserver,Attempt:0,}" Jul 15 23:13:29.274597 systemd-networkd[1424]: calif5deafd3f21: Link UP Jul 15 23:13:29.279026 systemd-networkd[1424]: calif5deafd3f21: Gained carrier Jul 15 23:13:29.305399 containerd[1512]: 2025-07-15 23:13:29.170 [INFO][4405] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0 coredns-7c65d6cfc9- kube-system 67a52572-ea42-4651-8164-6aaa7db16b6f 842 0 2025-07-15 23:12:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee coredns-7c65d6cfc9-gkk8k eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calif5deafd3f21 
[{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gkk8k" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-" Jul 15 23:13:29.305399 containerd[1512]: 2025-07-15 23:13:29.171 [INFO][4405] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gkk8k" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" Jul 15 23:13:29.305399 containerd[1512]: 2025-07-15 23:13:29.210 [INFO][4428] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" HandleID="k8s-pod-network.fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" Jul 15 23:13:29.305614 containerd[1512]: 2025-07-15 23:13:29.210 [INFO][4428] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" HandleID="k8s-pod-network.fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb070), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"coredns-7c65d6cfc9-gkk8k", "timestamp":"2025-07-15 23:13:29.210073599 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:29.305614 containerd[1512]: 2025-07-15 23:13:29.210 [INFO][4428] ipam/ipam_plugin.go 353: 
About to acquire host-wide IPAM lock. Jul 15 23:13:29.305614 containerd[1512]: 2025-07-15 23:13:29.210 [INFO][4428] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 23:13:29.305614 containerd[1512]: 2025-07-15 23:13:29.210 [INFO][4428] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:29.305614 containerd[1512]: 2025-07-15 23:13:29.226 [INFO][4428] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.305614 containerd[1512]: 2025-07-15 23:13:29.232 [INFO][4428] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.305614 containerd[1512]: 2025-07-15 23:13:29.238 [INFO][4428] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.305614 containerd[1512]: 2025-07-15 23:13:29.240 [INFO][4428] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.305614 containerd[1512]: 2025-07-15 23:13:29.245 [INFO][4428] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.306138 containerd[1512]: 2025-07-15 23:13:29.245 [INFO][4428] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.306138 containerd[1512]: 2025-07-15 23:13:29.248 [INFO][4428] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a Jul 15 23:13:29.306138 containerd[1512]: 2025-07-15 23:13:29.253 [INFO][4428] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 
handle="k8s-pod-network.fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.306138 containerd[1512]: 2025-07-15 23:13:29.262 [INFO][4428] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.70/26] block=192.168.56.64/26 handle="k8s-pod-network.fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.306138 containerd[1512]: 2025-07-15 23:13:29.262 [INFO][4428] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.70/26] handle="k8s-pod-network.fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.306138 containerd[1512]: 2025-07-15 23:13:29.262 [INFO][4428] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:29.306138 containerd[1512]: 2025-07-15 23:13:29.262 [INFO][4428] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.70/26] IPv6=[] ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" HandleID="k8s-pod-network.fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" Jul 15 23:13:29.306300 containerd[1512]: 2025-07-15 23:13:29.266 [INFO][4405] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gkk8k" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"67a52572-ea42-4651-8164-6aaa7db16b6f", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 12, 47, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"coredns-7c65d6cfc9-gkk8k", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.56.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif5deafd3f21", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:29.306300 containerd[1512]: 2025-07-15 23:13:29.266 [INFO][4405] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.70/32] ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gkk8k" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" Jul 15 23:13:29.306300 containerd[1512]: 2025-07-15 23:13:29.268 [INFO][4405] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif5deafd3f21 ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Namespace="kube-system" 
Pod="coredns-7c65d6cfc9-gkk8k" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" Jul 15 23:13:29.306300 containerd[1512]: 2025-07-15 23:13:29.286 [INFO][4405] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gkk8k" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" Jul 15 23:13:29.306300 containerd[1512]: 2025-07-15 23:13:29.287 [INFO][4405] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gkk8k" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"67a52572-ea42-4651-8164-6aaa7db16b6f", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 12, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a", Pod:"coredns-7c65d6cfc9-gkk8k", Endpoint:"eth0", ServiceAccountName:"coredns", 
IPNetworks:[]string{"192.168.56.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calif5deafd3f21", MAC:"8e:7d:4e:d8:52:5e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:29.306300 containerd[1512]: 2025-07-15 23:13:29.298 [INFO][4405] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" Namespace="kube-system" Pod="coredns-7c65d6cfc9-gkk8k" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--gkk8k-eth0" Jul 15 23:13:29.364400 containerd[1512]: time="2025-07-15T23:13:29.363915508Z" level=info msg="connecting to shim fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a" address="unix:///run/containerd/s/368f61c183e2415f50442e403e74cb514eec70a1a04593c267488bf0cc2daad7" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:29.407105 systemd[1]: Started cri-containerd-fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a.scope - libcontainer container fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a. 
Jul 15 23:13:29.414970 systemd-networkd[1424]: cali74b548697b0: Link UP Jul 15 23:13:29.416158 systemd-networkd[1424]: cali74b548697b0: Gained carrier Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.173 [INFO][4412] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0 calico-apiserver-f5986dd7d- calico-apiserver 29bdaeaf-6929-486a-a35b-f522167f96fa 851 0 2025-07-15 23:12:59 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:f5986dd7d projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee calico-apiserver-f5986dd7d-tb5xs eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali74b548697b0 [] [] }} ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-tb5xs" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.173 [INFO][4412] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-tb5xs" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.210 [INFO][4430] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 
23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.210 [INFO][4430] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"calico-apiserver-f5986dd7d-tb5xs", "timestamp":"2025-07-15 23:13:29.21036673 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.210 [INFO][4430] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.262 [INFO][4430] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.263 [INFO][4430] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.333 [INFO][4430] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.349 [INFO][4430] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.358 [INFO][4430] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.362 [INFO][4430] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.369 [INFO][4430] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.369 [INFO][4430] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.377 [INFO][4430] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635 Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.388 [INFO][4430] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.401 [INFO][4430] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.56.71/26] block=192.168.56.64/26 handle="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.401 [INFO][4430] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.71/26] handle="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.401 [INFO][4430] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:29.444731 containerd[1512]: 2025-07-15 23:13:29.401 [INFO][4430] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.71/26] IPv6=[] ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:13:29.445959 containerd[1512]: 2025-07-15 23:13:29.409 [INFO][4412] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-tb5xs" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0", GenerateName:"calico-apiserver-f5986dd7d-", Namespace:"calico-apiserver", SelfLink:"", UID:"29bdaeaf-6929-486a-a35b-f522167f96fa", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 12, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"f5986dd7d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"calico-apiserver-f5986dd7d-tb5xs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali74b548697b0", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:29.445959 containerd[1512]: 2025-07-15 23:13:29.409 [INFO][4412] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.71/32] ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-tb5xs" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:13:29.445959 containerd[1512]: 2025-07-15 23:13:29.409 [INFO][4412] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali74b548697b0 ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-tb5xs" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:13:29.445959 containerd[1512]: 2025-07-15 23:13:29.416 [INFO][4412] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-tb5xs" 
WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:13:29.445959 containerd[1512]: 2025-07-15 23:13:29.417 [INFO][4412] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-tb5xs" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0", GenerateName:"calico-apiserver-f5986dd7d-", Namespace:"calico-apiserver", SelfLink:"", UID:"29bdaeaf-6929-486a-a35b-f522167f96fa", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 12, 59, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"f5986dd7d", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635", Pod:"calico-apiserver-f5986dd7d-tb5xs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali74b548697b0", MAC:"f6:dc:f3:eb:91:47", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:29.445959 containerd[1512]: 2025-07-15 23:13:29.438 [INFO][4412] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Namespace="calico-apiserver" Pod="calico-apiserver-f5986dd7d-tb5xs" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:13:29.476926 containerd[1512]: time="2025-07-15T23:13:29.475699369Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-gkk8k,Uid:67a52572-ea42-4651-8164-6aaa7db16b6f,Namespace:kube-system,Attempt:0,} returns sandbox id \"fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a\"" Jul 15 23:13:29.483233 containerd[1512]: time="2025-07-15T23:13:29.482984054Z" level=info msg="CreateContainer within sandbox \"fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 23:13:29.486758 containerd[1512]: time="2025-07-15T23:13:29.485996572Z" level=info msg="connecting to shim 5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" address="unix:///run/containerd/s/2cab5eba2143d3f1a554469d3ef3d34fb772a9debb2bb007ba2e78fb7c00a342" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:29.494778 containerd[1512]: time="2025-07-15T23:13:29.494685633Z" level=info msg="Container 4dea9c939b5f7eeb331ca4e536ad21eabcc0586282ba5ca2439ebd2ee6ef48d7: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:29.503734 containerd[1512]: time="2025-07-15T23:13:29.503668585Z" level=info msg="CreateContainer within sandbox \"fc9fea9dbc52a05e8c5a914ead9728b00841d9a1adfd5f7c05c28db1b7da206a\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4dea9c939b5f7eeb331ca4e536ad21eabcc0586282ba5ca2439ebd2ee6ef48d7\"" Jul 15 23:13:29.505453 containerd[1512]: 
time="2025-07-15T23:13:29.505409813Z" level=info msg="StartContainer for \"4dea9c939b5f7eeb331ca4e536ad21eabcc0586282ba5ca2439ebd2ee6ef48d7\"" Jul 15 23:13:29.506735 containerd[1512]: time="2025-07-15T23:13:29.506696663Z" level=info msg="connecting to shim 4dea9c939b5f7eeb331ca4e536ad21eabcc0586282ba5ca2439ebd2ee6ef48d7" address="unix:///run/containerd/s/368f61c183e2415f50442e403e74cb514eec70a1a04593c267488bf0cc2daad7" protocol=ttrpc version=3 Jul 15 23:13:29.525072 systemd[1]: Started cri-containerd-5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635.scope - libcontainer container 5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635. Jul 15 23:13:29.540482 systemd[1]: Started cri-containerd-4dea9c939b5f7eeb331ca4e536ad21eabcc0586282ba5ca2439ebd2ee6ef48d7.scope - libcontainer container 4dea9c939b5f7eeb331ca4e536ad21eabcc0586282ba5ca2439ebd2ee6ef48d7. Jul 15 23:13:29.591156 containerd[1512]: time="2025-07-15T23:13:29.590193896Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-f5986dd7d-tb5xs,Uid:29bdaeaf-6929-486a-a35b-f522167f96fa,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\"" Jul 15 23:13:29.597034 containerd[1512]: time="2025-07-15T23:13:29.597001843Z" level=info msg="StartContainer for \"4dea9c939b5f7eeb331ca4e536ad21eabcc0586282ba5ca2439ebd2ee6ef48d7\" returns successfully" Jul 15 23:13:29.773505 systemd-networkd[1424]: cali806be1891fc: Gained IPv6LL Jul 15 23:13:30.029201 systemd-networkd[1424]: cali9715e5d8792: Gained IPv6LL Jul 15 23:13:30.090570 containerd[1512]: time="2025-07-15T23:13:30.090519697Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zhnvj,Uid:19596002-f6b8-45b5-b851-b0ec002e6602,Namespace:kube-system,Attempt:0,}" Jul 15 23:13:30.273279 systemd-networkd[1424]: cali25030178747: Link UP Jul 15 23:13:30.274694 systemd-networkd[1424]: cali25030178747: Gained carrier Jul 15 
23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.155 [INFO][4584] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0 coredns-7c65d6cfc9- kube-system 19596002-f6b8-45b5-b851-b0ec002e6602 852 0 2025-07-15 23:12:47 +0000 UTC map[k8s-app:kube-dns pod-template-hash:7c65d6cfc9 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee coredns-7c65d6cfc9-zhnvj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali25030178747 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zhnvj" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.155 [INFO][4584] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zhnvj" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.194 [INFO][4598] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" HandleID="k8s-pod-network.5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.194 [INFO][4598] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" HandleID="k8s-pod-network.5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" 
Workload="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b850), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"coredns-7c65d6cfc9-zhnvj", "timestamp":"2025-07-15 23:13:30.19445511 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.194 [INFO][4598] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.194 [INFO][4598] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.194 [INFO][4598] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.211 [INFO][4598] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.219 [INFO][4598] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.230 [INFO][4598] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.233 [INFO][4598] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.237 [INFO][4598] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:30.297056 
containerd[1512]: 2025-07-15 23:13:30.237 [INFO][4598] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.240 [INFO][4598] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07 Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.249 [INFO][4598] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.262 [INFO][4598] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.72/26] block=192.168.56.64/26 handle="k8s-pod-network.5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.262 [INFO][4598] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.72/26] handle="k8s-pod-network.5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.262 [INFO][4598] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Jul 15 23:13:30.297056 containerd[1512]: 2025-07-15 23:13:30.262 [INFO][4598] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.72/26] IPv6=[] ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" HandleID="k8s-pod-network.5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" Jul 15 23:13:30.299638 containerd[1512]: 2025-07-15 23:13:30.265 [INFO][4584] cni-plugin/k8s.go 418: Populated endpoint ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zhnvj" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"19596002-f6b8-45b5-b851-b0ec002e6602", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 12, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"coredns-7c65d6cfc9-zhnvj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.56.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali25030178747", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:30.299638 containerd[1512]: 2025-07-15 23:13:30.266 [INFO][4584] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.72/32] ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zhnvj" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" Jul 15 23:13:30.299638 containerd[1512]: 2025-07-15 23:13:30.266 [INFO][4584] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali25030178747 ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zhnvj" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" Jul 15 23:13:30.299638 containerd[1512]: 2025-07-15 23:13:30.275 [INFO][4584] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zhnvj" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" Jul 15 23:13:30.299638 containerd[1512]: 2025-07-15 23:13:30.275 [INFO][4584] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zhnvj" 
WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0", GenerateName:"coredns-7c65d6cfc9-", Namespace:"kube-system", SelfLink:"", UID:"19596002-f6b8-45b5-b851-b0ec002e6602", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 12, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"7c65d6cfc9", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07", Pod:"coredns-7c65d6cfc9-zhnvj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.56.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali25030178747", MAC:"32:de:7c:8d:52:6f", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:30.299638 
containerd[1512]: 2025-07-15 23:13:30.291 [INFO][4584] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" Namespace="kube-system" Pod="coredns-7c65d6cfc9-zhnvj" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-coredns--7c65d6cfc9--zhnvj-eth0" Jul 15 23:13:30.328296 containerd[1512]: time="2025-07-15T23:13:30.328206943Z" level=info msg="connecting to shim 5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07" address="unix:///run/containerd/s/2406b4d32ca6071242e2fd206ad0fd341230f194ad51615102e4eee615134a0a" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:30.349475 systemd-networkd[1424]: cali356977b971a: Gained IPv6LL Jul 15 23:13:30.349739 systemd-networkd[1424]: calif5deafd3f21: Gained IPv6LL Jul 15 23:13:30.379424 systemd[1]: Started cri-containerd-5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07.scope - libcontainer container 5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07. 
Jul 15 23:13:30.392999 kubelet[2668]: I0715 23:13:30.392665 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-gkk8k" podStartSLOduration=43.392642207 podStartE2EDuration="43.392642207s" podCreationTimestamp="2025-07-15 23:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:13:30.388343242 +0000 UTC m=+47.422883152" watchObservedRunningTime="2025-07-15 23:13:30.392642207 +0000 UTC m=+47.427182117" Jul 15 23:13:30.463398 containerd[1512]: time="2025-07-15T23:13:30.463204064Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-7c65d6cfc9-zhnvj,Uid:19596002-f6b8-45b5-b851-b0ec002e6602,Namespace:kube-system,Attempt:0,} returns sandbox id \"5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07\"" Jul 15 23:13:30.472081 containerd[1512]: time="2025-07-15T23:13:30.472039962Z" level=info msg="CreateContainer within sandbox \"5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jul 15 23:13:30.490884 containerd[1512]: time="2025-07-15T23:13:30.490466226Z" level=info msg="Container fd61b162842e76ca5b64333940b1dcff30446bc1676546c76a220539087de9a5: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:30.498447 containerd[1512]: time="2025-07-15T23:13:30.498405610Z" level=info msg="CreateContainer within sandbox \"5340d7d301cc5181689f02b5a054b468f35aa6ae3d0e1b906a4bf8eb9f250d07\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"fd61b162842e76ca5b64333940b1dcff30446bc1676546c76a220539087de9a5\"" Jul 15 23:13:30.500982 containerd[1512]: time="2025-07-15T23:13:30.500424927Z" level=info msg="StartContainer for \"fd61b162842e76ca5b64333940b1dcff30446bc1676546c76a220539087de9a5\"" Jul 15 23:13:30.503045 containerd[1512]: time="2025-07-15T23:13:30.502917502Z" level=info msg="connecting to shim 
fd61b162842e76ca5b64333940b1dcff30446bc1676546c76a220539087de9a5" address="unix:///run/containerd/s/2406b4d32ca6071242e2fd206ad0fd341230f194ad51615102e4eee615134a0a" protocol=ttrpc version=3 Jul 15 23:13:30.533224 systemd[1]: Started cri-containerd-fd61b162842e76ca5b64333940b1dcff30446bc1676546c76a220539087de9a5.scope - libcontainer container fd61b162842e76ca5b64333940b1dcff30446bc1676546c76a220539087de9a5. Jul 15 23:13:30.572355 containerd[1512]: time="2025-07-15T23:13:30.572210871Z" level=info msg="StartContainer for \"fd61b162842e76ca5b64333940b1dcff30446bc1676546c76a220539087de9a5\" returns successfully" Jul 15 23:13:30.605782 systemd-networkd[1424]: cali43c5979d42e: Gained IPv6LL Jul 15 23:13:30.733793 systemd-networkd[1424]: cali74b548697b0: Gained IPv6LL Jul 15 23:13:31.092289 containerd[1512]: time="2025-07-15T23:13:31.091120307Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hddsp,Uid:2a38c65f-24e4-465a-afbd-242e66579eef,Namespace:calico-system,Attempt:0,}" Jul 15 23:13:31.265343 systemd-networkd[1424]: cali9aafe50cf78: Link UP Jul 15 23:13:31.265557 systemd-networkd[1424]: cali9aafe50cf78: Gained carrier Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.151 [INFO][4698] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0 csi-node-driver- calico-system 2a38c65f-24e4-465a-afbd-242e66579eef 713 0 2025-07-15 23:13:06 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:57bd658777 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee csi-node-driver-hddsp eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali9aafe50cf78 [] [] }} 
ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Namespace="calico-system" Pod="csi-node-driver-hddsp" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.152 [INFO][4698] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Namespace="calico-system" Pod="csi-node-driver-hddsp" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.198 [INFO][4709] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" HandleID="k8s-pod-network.f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.198 [INFO][4709] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" HandleID="k8s-pod-network.f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3220), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"csi-node-driver-hddsp", "timestamp":"2025-07-15 23:13:31.198600639 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.199 [INFO][4709] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.199 [INFO][4709] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.199 [INFO][4709] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.212 [INFO][4709] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.220 [INFO][4709] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.226 [INFO][4709] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.229 [INFO][4709] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.233 [INFO][4709] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.233 [INFO][4709] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.236 [INFO][4709] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.245 [INFO][4709] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" 
host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.258 [INFO][4709] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.56.73/26] block=192.168.56.64/26 handle="k8s-pod-network.f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.258 [INFO][4709] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.73/26] handle="k8s-pod-network.f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.258 [INFO][4709] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:31.285406 containerd[1512]: 2025-07-15 23:13:31.258 [INFO][4709] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.73/26] IPv6=[] ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" HandleID="k8s-pod-network.f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" Jul 15 23:13:31.289834 containerd[1512]: 2025-07-15 23:13:31.261 [INFO][4698] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Namespace="calico-system" Pod="csi-node-driver-hddsp" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2a38c65f-24e4-465a-afbd-242e66579eef", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"csi-node-driver-hddsp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.56.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9aafe50cf78", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:31.289834 containerd[1512]: 2025-07-15 23:13:31.261 [INFO][4698] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.73/32] ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Namespace="calico-system" Pod="csi-node-driver-hddsp" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" Jul 15 23:13:31.289834 containerd[1512]: 2025-07-15 23:13:31.261 [INFO][4698] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9aafe50cf78 ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Namespace="calico-system" Pod="csi-node-driver-hddsp" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" Jul 15 23:13:31.289834 containerd[1512]: 2025-07-15 23:13:31.264 [INFO][4698] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Namespace="calico-system" 
Pod="csi-node-driver-hddsp" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" Jul 15 23:13:31.289834 containerd[1512]: 2025-07-15 23:13:31.264 [INFO][4698] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Namespace="calico-system" Pod="csi-node-driver-hddsp" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"2a38c65f-24e4-465a-afbd-242e66579eef", ResourceVersion:"713", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 6, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"57bd658777", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d", Pod:"csi-node-driver-hddsp", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.56.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali9aafe50cf78", MAC:"c2:80:aa:fa:46:22", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:31.289834 containerd[1512]: 2025-07-15 23:13:31.282 [INFO][4698] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" Namespace="calico-system" Pod="csi-node-driver-hddsp" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-csi--node--driver--hddsp-eth0" Jul 15 23:13:31.323013 containerd[1512]: time="2025-07-15T23:13:31.322105609Z" level=info msg="connecting to shim f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d" address="unix:///run/containerd/s/91d3871668df5dce09ae19ebfe4e47511eadd73a3faf16ae4b553c0ca8f7f510" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:31.351100 systemd[1]: Started cri-containerd-f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d.scope - libcontainer container f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d. 
Jul 15 23:13:31.411877 containerd[1512]: time="2025-07-15T23:13:31.411725594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-hddsp,Uid:2a38c65f-24e4-465a-afbd-242e66579eef,Namespace:calico-system,Attempt:0,} returns sandbox id \"f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d\"" Jul 15 23:13:31.432746 kubelet[2668]: I0715 23:13:31.432679 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-7c65d6cfc9-zhnvj" podStartSLOduration=44.432660576 podStartE2EDuration="44.432660576s" podCreationTimestamp="2025-07-15 23:12:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:13:31.400057999 +0000 UTC m=+48.434597949" watchObservedRunningTime="2025-07-15 23:13:31.432660576 +0000 UTC m=+48.467200486" Jul 15 23:13:32.013939 systemd-networkd[1424]: cali25030178747: Gained IPv6LL Jul 15 23:13:32.129406 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2420638820.mount: Deactivated successfully. 
Jul 15 23:13:32.152194 containerd[1512]: time="2025-07-15T23:13:32.152132945Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:32.153866 containerd[1512]: time="2025-07-15T23:13:32.153627999Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Jul 15 23:13:32.154230 containerd[1512]: time="2025-07-15T23:13:32.154188580Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:32.157461 containerd[1512]: time="2025-07-15T23:13:32.157396977Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:32.158616 containerd[1512]: time="2025-07-15T23:13:32.158123163Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 3.996059176s" Jul 15 23:13:32.158616 containerd[1512]: time="2025-07-15T23:13:32.158158045Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Jul 15 23:13:32.159360 containerd[1512]: time="2025-07-15T23:13:32.159336648Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 23:13:32.162318 containerd[1512]: time="2025-07-15T23:13:32.161954983Z" level=info msg="CreateContainer within sandbox 
\"9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Jul 15 23:13:32.172093 containerd[1512]: time="2025-07-15T23:13:32.172041191Z" level=info msg="Container 3f833510b8f86b6ad59e910f2808e8389601c8f0da2e01749e455c7c61cf7d76: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:32.195524 containerd[1512]: time="2025-07-15T23:13:32.195411684Z" level=info msg="CreateContainer within sandbox \"9e5ff36a8146d8245e3f8204af34884f835570539fe4c02d9e303c152d85c4a9\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"3f833510b8f86b6ad59e910f2808e8389601c8f0da2e01749e455c7c61cf7d76\"" Jul 15 23:13:32.197101 containerd[1512]: time="2025-07-15T23:13:32.197038623Z" level=info msg="StartContainer for \"3f833510b8f86b6ad59e910f2808e8389601c8f0da2e01749e455c7c61cf7d76\"" Jul 15 23:13:32.199055 containerd[1512]: time="2025-07-15T23:13:32.199024295Z" level=info msg="connecting to shim 3f833510b8f86b6ad59e910f2808e8389601c8f0da2e01749e455c7c61cf7d76" address="unix:///run/containerd/s/9e3ea51e2d7a1dfa86554e4163b65de38d33f6747ac630a9bbd610d397d97251" protocol=ttrpc version=3 Jul 15 23:13:32.232338 systemd[1]: Started cri-containerd-3f833510b8f86b6ad59e910f2808e8389601c8f0da2e01749e455c7c61cf7d76.scope - libcontainer container 3f833510b8f86b6ad59e910f2808e8389601c8f0da2e01749e455c7c61cf7d76. 
Jul 15 23:13:32.290821 containerd[1512]: time="2025-07-15T23:13:32.290505993Z" level=info msg="StartContainer for \"3f833510b8f86b6ad59e910f2808e8389601c8f0da2e01749e455c7c61cf7d76\" returns successfully" Jul 15 23:13:32.653135 systemd-networkd[1424]: cali9aafe50cf78: Gained IPv6LL Jul 15 23:13:35.902713 containerd[1512]: time="2025-07-15T23:13:35.901102588Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:35.903827 containerd[1512]: time="2025-07-15T23:13:35.903740678Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=44517149" Jul 15 23:13:35.904320 containerd[1512]: time="2025-07-15T23:13:35.904291857Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:35.912462 containerd[1512]: time="2025-07-15T23:13:35.912389054Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:35.914069 containerd[1512]: time="2025-07-15T23:13:35.914030710Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 3.754582059s" Jul 15 23:13:35.914550 containerd[1512]: time="2025-07-15T23:13:35.914256718Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 15 23:13:35.916361 containerd[1512]: 
time="2025-07-15T23:13:35.916270707Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Jul 15 23:13:35.919210 containerd[1512]: time="2025-07-15T23:13:35.919177807Z" level=info msg="CreateContainer within sandbox \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 23:13:35.939372 containerd[1512]: time="2025-07-15T23:13:35.937924289Z" level=info msg="Container c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:35.942671 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2833258098.mount: Deactivated successfully. Jul 15 23:13:35.956261 containerd[1512]: time="2025-07-15T23:13:35.956178474Z" level=info msg="CreateContainer within sandbox \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\"" Jul 15 23:13:35.957956 containerd[1512]: time="2025-07-15T23:13:35.957881173Z" level=info msg="StartContainer for \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\"" Jul 15 23:13:35.959706 containerd[1512]: time="2025-07-15T23:13:35.959658793Z" level=info msg="connecting to shim c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208" address="unix:///run/containerd/s/3733114f11b59697a1b675a2ce03183ef7f00b0a3fd6ecd87661876c0ff945a7" protocol=ttrpc version=3 Jul 15 23:13:35.998195 systemd[1]: Started cri-containerd-c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208.scope - libcontainer container c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208. 
Jul 15 23:13:36.059865 containerd[1512]: time="2025-07-15T23:13:36.059199725Z" level=info msg="StartContainer for \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" returns successfully" Jul 15 23:13:36.406657 kubelet[2668]: I0715 23:13:36.406514 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-659466c688-flsdz" podStartSLOduration=5.430926358 podStartE2EDuration="11.406487594s" podCreationTimestamp="2025-07-15 23:13:25 +0000 UTC" firstStartedPulling="2025-07-15 23:13:26.183665207 +0000 UTC m=+43.218205117" lastFinishedPulling="2025-07-15 23:13:32.159226443 +0000 UTC m=+49.193766353" observedRunningTime="2025-07-15 23:13:32.396126006 +0000 UTC m=+49.430665956" watchObservedRunningTime="2025-07-15 23:13:36.406487594 +0000 UTC m=+53.441027504" Jul 15 23:13:37.399525 kubelet[2668]: I0715 23:13:37.399480 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:13:38.626810 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4191263339.mount: Deactivated successfully. 
Jul 15 23:13:39.298611 containerd[1512]: time="2025-07-15T23:13:39.298548022Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:39.300344 containerd[1512]: time="2025-07-15T23:13:39.299961707Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790" Jul 15 23:13:39.301694 containerd[1512]: time="2025-07-15T23:13:39.301620640Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:39.306247 containerd[1512]: time="2025-07-15T23:13:39.306184625Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:39.308818 containerd[1512]: time="2025-07-15T23:13:39.308611343Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 3.392289554s" Jul 15 23:13:39.308818 containerd[1512]: time="2025-07-15T23:13:39.308672224Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\"" Jul 15 23:13:39.311627 containerd[1512]: time="2025-07-15T23:13:39.311237066Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 23:13:39.316994 containerd[1512]: time="2025-07-15T23:13:39.316475353Z" level=info msg="CreateContainer within sandbox 
\"2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Jul 15 23:13:39.326497 containerd[1512]: time="2025-07-15T23:13:39.326462912Z" level=info msg="Container 97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:39.345146 containerd[1512]: time="2025-07-15T23:13:39.345081505Z" level=info msg="CreateContainer within sandbox \"2334744be501c7d7c0966af835720ff4bf0aaa34dff71eb74a239f4c05aecdd5\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\"" Jul 15 23:13:39.345825 containerd[1512]: time="2025-07-15T23:13:39.345781207Z" level=info msg="StartContainer for \"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\"" Jul 15 23:13:39.349445 containerd[1512]: time="2025-07-15T23:13:39.348827505Z" level=info msg="connecting to shim 97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01" address="unix:///run/containerd/s/5b183cdfcca0527d59d86cd09d4738be96fbe28bd7c58f7a9552cc45114b4ba7" protocol=ttrpc version=3 Jul 15 23:13:39.376050 systemd[1]: Started cri-containerd-97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01.scope - libcontainer container 97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01. 
Jul 15 23:13:39.427619 containerd[1512]: time="2025-07-15T23:13:39.427516253Z" level=info msg="StartContainer for \"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" returns successfully" Jul 15 23:13:39.714805 containerd[1512]: time="2025-07-15T23:13:39.713731137Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 15 23:13:39.714805 containerd[1512]: time="2025-07-15T23:13:39.714288555Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:39.716069 containerd[1512]: time="2025-07-15T23:13:39.716018370Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 403.948758ms" Jul 15 23:13:39.716198 containerd[1512]: time="2025-07-15T23:13:39.716158575Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 15 23:13:39.717854 containerd[1512]: time="2025-07-15T23:13:39.717809147Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Jul 15 23:13:39.719863 containerd[1512]: time="2025-07-15T23:13:39.719811811Z" level=info msg="CreateContainer within sandbox \"1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 23:13:39.734313 containerd[1512]: time="2025-07-15T23:13:39.734263752Z" level=info msg="Container 685689fcfab286a62703a1bb807040c7b3f57d2a7ef39b0f99dfdb855cabd7cc: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:39.742963 
containerd[1512]: time="2025-07-15T23:13:39.742921748Z" level=info msg="CreateContainer within sandbox \"1824a44126633e1dd74388bea3d719ea128d1a4ede6316ab4c3b36a507a3b578\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"685689fcfab286a62703a1bb807040c7b3f57d2a7ef39b0f99dfdb855cabd7cc\"" Jul 15 23:13:39.745247 containerd[1512]: time="2025-07-15T23:13:39.744971453Z" level=info msg="StartContainer for \"685689fcfab286a62703a1bb807040c7b3f57d2a7ef39b0f99dfdb855cabd7cc\"" Jul 15 23:13:39.748926 containerd[1512]: time="2025-07-15T23:13:39.748876258Z" level=info msg="connecting to shim 685689fcfab286a62703a1bb807040c7b3f57d2a7ef39b0f99dfdb855cabd7cc" address="unix:///run/containerd/s/4a500903ec49dd2f9df1147e648a3cdb7ccbf86cbb412faf900867d826a37734" protocol=ttrpc version=3 Jul 15 23:13:39.779155 systemd[1]: Started cri-containerd-685689fcfab286a62703a1bb807040c7b3f57d2a7ef39b0f99dfdb855cabd7cc.scope - libcontainer container 685689fcfab286a62703a1bb807040c7b3f57d2a7ef39b0f99dfdb855cabd7cc. 
Jul 15 23:13:39.832770 containerd[1512]: time="2025-07-15T23:13:39.832725611Z" level=info msg="StartContainer for \"685689fcfab286a62703a1bb807040c7b3f57d2a7ef39b0f99dfdb855cabd7cc\" returns successfully" Jul 15 23:13:40.440867 kubelet[2668]: I0715 23:13:40.440182 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f5986dd7d-qm2lh" podStartSLOduration=34.086906756 podStartE2EDuration="41.440165194s" podCreationTimestamp="2025-07-15 23:12:59 +0000 UTC" firstStartedPulling="2025-07-15 23:13:28.562579494 +0000 UTC m=+45.597119404" lastFinishedPulling="2025-07-15 23:13:35.915837932 +0000 UTC m=+52.950377842" observedRunningTime="2025-07-15 23:13:36.409656541 +0000 UTC m=+53.444196451" watchObservedRunningTime="2025-07-15 23:13:40.440165194 +0000 UTC m=+57.474705104" Jul 15 23:13:40.440867 kubelet[2668]: I0715 23:13:40.440635 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-797fd56c96-7vvww" podStartSLOduration=27.533170117 podStartE2EDuration="38.440629208s" podCreationTimestamp="2025-07-15 23:13:02 +0000 UTC" firstStartedPulling="2025-07-15 23:13:28.81015281 +0000 UTC m=+45.844692720" lastFinishedPulling="2025-07-15 23:13:39.717611901 +0000 UTC m=+56.752151811" observedRunningTime="2025-07-15 23:13:40.437758798 +0000 UTC m=+57.472298828" watchObservedRunningTime="2025-07-15 23:13:40.440629208 +0000 UTC m=+57.475169118" Jul 15 23:13:40.469376 kubelet[2668]: I0715 23:13:40.469270 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-58fd7646b9-mz25w" podStartSLOduration=23.841965091 podStartE2EDuration="34.469246586s" podCreationTimestamp="2025-07-15 23:13:06 +0000 UTC" firstStartedPulling="2025-07-15 23:13:28.683424114 +0000 UTC m=+45.717964024" lastFinishedPulling="2025-07-15 23:13:39.310705569 +0000 UTC m=+56.345245519" observedRunningTime="2025-07-15 23:13:40.469200705 +0000 UTC m=+57.503740655" 
watchObservedRunningTime="2025-07-15 23:13:40.469246586 +0000 UTC m=+57.503786536" Jul 15 23:13:40.561916 containerd[1512]: time="2025-07-15T23:13:40.561866572Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"873aebc39f348c435dcdb3ec2967032b869494e6d16dabde5ab0289af7b230e1\" pid:4971 exit_status:1 exited_at:{seconds:1752621220 nanos:560800379}" Jul 15 23:13:41.424211 kubelet[2668]: I0715 23:13:41.424088 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:13:41.537243 containerd[1512]: time="2025-07-15T23:13:41.537183439Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"b5de65442546ccee4dab4c8b1f1fd0fdc004ed15c2da33f1fbfbbb12afabccac\" pid:5003 exit_status:1 exited_at:{seconds:1752621221 nanos:536635582}" Jul 15 23:13:42.521289 containerd[1512]: time="2025-07-15T23:13:42.521233658Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"723a43ba7052c4a51fa32493190049747fe7bc26afb1e82c03ec8effafb41799\" pid:5027 exit_status:1 exited_at:{seconds:1752621222 nanos:520758883}" Jul 15 23:13:43.362348 containerd[1512]: time="2025-07-15T23:13:43.362295046Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:43.365704 containerd[1512]: time="2025-07-15T23:13:43.365096370Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Jul 15 23:13:43.365962 containerd[1512]: time="2025-07-15T23:13:43.365915795Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 
23:13:43.368945 containerd[1512]: time="2025-07-15T23:13:43.368891924Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:43.371311 containerd[1512]: time="2025-07-15T23:13:43.371279996Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"49497545\" in 3.653431248s" Jul 15 23:13:43.371524 containerd[1512]: time="2025-07-15T23:13:43.371500162Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Jul 15 23:13:43.373837 containerd[1512]: time="2025-07-15T23:13:43.373710589Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Jul 15 23:13:43.392691 containerd[1512]: time="2025-07-15T23:13:43.392518674Z" level=info msg="CreateContainer within sandbox \"e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Jul 15 23:13:43.403052 containerd[1512]: time="2025-07-15T23:13:43.402984388Z" level=info msg="Container 05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:43.416902 containerd[1512]: time="2025-07-15T23:13:43.416650479Z" level=info msg="CreateContainer within sandbox \"e16e16206aff571ed494321aea870c02995398f30bc79e362c334428e9cc6e88\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id 
\"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\"" Jul 15 23:13:43.420875 containerd[1512]: time="2025-07-15T23:13:43.420618078Z" level=info msg="StartContainer for \"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\"" Jul 15 23:13:43.423367 containerd[1512]: time="2025-07-15T23:13:43.423328600Z" level=info msg="connecting to shim 05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce" address="unix:///run/containerd/s/2b772c7d26bf8649f62dcc65fc6d98f51c3d0f62cc020ab85b16fa352afcc884" protocol=ttrpc version=3 Jul 15 23:13:43.457060 systemd[1]: Started cri-containerd-05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce.scope - libcontainer container 05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce. Jul 15 23:13:43.515305 containerd[1512]: time="2025-07-15T23:13:43.515229121Z" level=info msg="StartContainer for \"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\" returns successfully" Jul 15 23:13:43.742675 containerd[1512]: time="2025-07-15T23:13:43.742043215Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:43.743238 containerd[1512]: time="2025-07-15T23:13:43.742816478Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Jul 15 23:13:43.747110 containerd[1512]: time="2025-07-15T23:13:43.746962323Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 372.981446ms" Jul 15 23:13:43.747110 containerd[1512]: time="2025-07-15T23:13:43.747009924Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Jul 15 23:13:43.749194 containerd[1512]: time="2025-07-15T23:13:43.749149269Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Jul 15 23:13:43.758757 containerd[1512]: time="2025-07-15T23:13:43.758678275Z" level=info msg="CreateContainer within sandbox \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 23:13:43.767382 containerd[1512]: time="2025-07-15T23:13:43.767330055Z" level=info msg="Container a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:43.783077 containerd[1512]: time="2025-07-15T23:13:43.783027806Z" level=info msg="CreateContainer within sandbox \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\"" Jul 15 23:13:43.784971 containerd[1512]: time="2025-07-15T23:13:43.784136960Z" level=info msg="StartContainer for \"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\"" Jul 15 23:13:43.786348 containerd[1512]: time="2025-07-15T23:13:43.786296625Z" level=info msg="connecting to shim a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9" address="unix:///run/containerd/s/2cab5eba2143d3f1a554469d3ef3d34fb772a9debb2bb007ba2e78fb7c00a342" protocol=ttrpc version=3 Jul 15 23:13:43.816113 systemd[1]: Started cri-containerd-a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9.scope - libcontainer container a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9. 
Jul 15 23:13:43.861030 containerd[1512]: time="2025-07-15T23:13:43.860919947Z" level=info msg="StartContainer for \"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" returns successfully" Jul 15 23:13:44.484649 kubelet[2668]: I0715 23:13:44.484544 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-f5986dd7d-tb5xs" podStartSLOduration=31.332302337 podStartE2EDuration="45.484526534s" podCreationTimestamp="2025-07-15 23:12:59 +0000 UTC" firstStartedPulling="2025-07-15 23:13:29.596129328 +0000 UTC m=+46.630669238" lastFinishedPulling="2025-07-15 23:13:43.748353485 +0000 UTC m=+60.782893435" observedRunningTime="2025-07-15 23:13:44.482982088 +0000 UTC m=+61.517521998" watchObservedRunningTime="2025-07-15 23:13:44.484526534 +0000 UTC m=+61.519066444" Jul 15 23:13:45.464735 kubelet[2668]: I0715 23:13:45.464587 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:13:45.547880 containerd[1512]: time="2025-07-15T23:13:45.547284330Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\" id:\"f01ce767f76523f665879fff8cc7db7114fa37d3c777e3dd50e2dc825eb0dcca\" pid:5138 exited_at:{seconds:1752621225 nanos:543786988}" Jul 15 23:13:45.587322 kubelet[2668]: I0715 23:13:45.587184 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-8bf995db6-rbt25" podStartSLOduration=25.075337636 podStartE2EDuration="39.587155538s" podCreationTimestamp="2025-07-15 23:13:06 +0000 UTC" firstStartedPulling="2025-07-15 23:13:28.861458194 +0000 UTC m=+45.895998104" lastFinishedPulling="2025-07-15 23:13:43.373276096 +0000 UTC m=+60.407816006" observedRunningTime="2025-07-15 23:13:44.501655322 +0000 UTC m=+61.536195232" watchObservedRunningTime="2025-07-15 23:13:45.587155538 +0000 UTC m=+62.621695448" Jul 15 23:13:45.728156 containerd[1512]: 
time="2025-07-15T23:13:45.726942633Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:45.729551 containerd[1512]: time="2025-07-15T23:13:45.729371104Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702" Jul 15 23:13:45.730764 containerd[1512]: time="2025-07-15T23:13:45.730522857Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:45.736082 containerd[1512]: time="2025-07-15T23:13:45.736027819Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:45.737151 containerd[1512]: time="2025-07-15T23:13:45.737085890Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 1.98787882s" Jul 15 23:13:45.737151 containerd[1512]: time="2025-07-15T23:13:45.737140811Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\"" Jul 15 23:13:45.740513 containerd[1512]: time="2025-07-15T23:13:45.740450028Z" level=info msg="CreateContainer within sandbox \"f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Jul 15 23:13:45.753036 containerd[1512]: time="2025-07-15T23:13:45.752998076Z" level=info msg="Container 
b956df34519a83b883356c79ffa2cc71eab4703ee0e9aa76e1b132b9f8ce14f6: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:45.762278 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3969280873.mount: Deactivated successfully. Jul 15 23:13:45.767664 containerd[1512]: time="2025-07-15T23:13:45.767604744Z" level=info msg="CreateContainer within sandbox \"f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"b956df34519a83b883356c79ffa2cc71eab4703ee0e9aa76e1b132b9f8ce14f6\"" Jul 15 23:13:45.768609 containerd[1512]: time="2025-07-15T23:13:45.768418967Z" level=info msg="StartContainer for \"b956df34519a83b883356c79ffa2cc71eab4703ee0e9aa76e1b132b9f8ce14f6\"" Jul 15 23:13:45.770360 containerd[1512]: time="2025-07-15T23:13:45.770320063Z" level=info msg="connecting to shim b956df34519a83b883356c79ffa2cc71eab4703ee0e9aa76e1b132b9f8ce14f6" address="unix:///run/containerd/s/91d3871668df5dce09ae19ebfe4e47511eadd73a3faf16ae4b553c0ca8f7f510" protocol=ttrpc version=3 Jul 15 23:13:45.800019 systemd[1]: Started cri-containerd-b956df34519a83b883356c79ffa2cc71eab4703ee0e9aa76e1b132b9f8ce14f6.scope - libcontainer container b956df34519a83b883356c79ffa2cc71eab4703ee0e9aa76e1b132b9f8ce14f6. 
Jul 15 23:13:45.859885 containerd[1512]: time="2025-07-15T23:13:45.859803004Z" level=info msg="StartContainer for \"b956df34519a83b883356c79ffa2cc71eab4703ee0e9aa76e1b132b9f8ce14f6\" returns successfully" Jul 15 23:13:45.863086 containerd[1512]: time="2025-07-15T23:13:45.863030299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Jul 15 23:13:46.062010 containerd[1512]: time="2025-07-15T23:13:46.061760218Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\" id:\"37c3a7c3f164a6ed8b097c0821aafc73c1f4ebee958f8e23e6bd5ddc0565bb4a\" pid:5212 exited_at:{seconds:1752621226 nanos:60795391}" Jul 15 23:13:46.103226 containerd[1512]: time="2025-07-15T23:13:46.103163697Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"be65dce8480b669a26ee57e278f96391caa2b11afeb26e413ad5acbd61468ba5\" pid:5204 exited_at:{seconds:1752621226 nanos:102827807}" Jul 15 23:13:47.696148 containerd[1512]: time="2025-07-15T23:13:47.696029104Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:47.698027 containerd[1512]: time="2025-07-15T23:13:47.697973759Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366" Jul 15 23:13:47.699815 containerd[1512]: time="2025-07-15T23:13:47.699748410Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:47.703442 containerd[1512]: time="2025-07-15T23:13:47.703354873Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jul 15 23:13:47.704429 containerd[1512]: time="2025-07-15T23:13:47.704253219Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 1.84118108s" Jul 15 23:13:47.704429 containerd[1512]: time="2025-07-15T23:13:47.704300380Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\"" Jul 15 23:13:47.715731 containerd[1512]: time="2025-07-15T23:13:47.715671546Z" level=info msg="CreateContainer within sandbox \"f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Jul 15 23:13:47.732045 containerd[1512]: time="2025-07-15T23:13:47.730143000Z" level=info msg="Container c17f50ef2254cd1a5ebfbce6c086573b68ce5c2bccbdc0aaebfdecb1255a830b: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:47.739358 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3509118008.mount: Deactivated successfully. 
Jul 15 23:13:47.750571 containerd[1512]: time="2025-07-15T23:13:47.750512223Z" level=info msg="CreateContainer within sandbox \"f0915037475d918b573e42c342be9ec3f1b20977b45f5c06b23da3ee1428721d\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c17f50ef2254cd1a5ebfbce6c086573b68ce5c2bccbdc0aaebfdecb1255a830b\"" Jul 15 23:13:47.754927 containerd[1512]: time="2025-07-15T23:13:47.754882468Z" level=info msg="StartContainer for \"c17f50ef2254cd1a5ebfbce6c086573b68ce5c2bccbdc0aaebfdecb1255a830b\"" Jul 15 23:13:47.756802 containerd[1512]: time="2025-07-15T23:13:47.756757362Z" level=info msg="connecting to shim c17f50ef2254cd1a5ebfbce6c086573b68ce5c2bccbdc0aaebfdecb1255a830b" address="unix:///run/containerd/s/91d3871668df5dce09ae19ebfe4e47511eadd73a3faf16ae4b553c0ca8f7f510" protocol=ttrpc version=3 Jul 15 23:13:47.783082 systemd[1]: Started cri-containerd-c17f50ef2254cd1a5ebfbce6c086573b68ce5c2bccbdc0aaebfdecb1255a830b.scope - libcontainer container c17f50ef2254cd1a5ebfbce6c086573b68ce5c2bccbdc0aaebfdecb1255a830b. 
Jul 15 23:13:47.833626 containerd[1512]: time="2025-07-15T23:13:47.833499599Z" level=info msg="StartContainer for \"c17f50ef2254cd1a5ebfbce6c086573b68ce5c2bccbdc0aaebfdecb1255a830b\" returns successfully" Jul 15 23:13:48.219315 kubelet[2668]: I0715 23:13:48.219188 2668 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Jul 15 23:13:48.225399 kubelet[2668]: I0715 23:13:48.225366 2668 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Jul 15 23:13:48.754464 containerd[1512]: time="2025-07-15T23:13:48.754403135Z" level=info msg="TaskExit event in podsandbox handler container_id:\"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\" id:\"e6ef5cd16f23d7508979a9d7e7a1c521085e1202439b2f7a7cbf14d52249be0f\" pid:5284 exited_at:{seconds:1752621228 nanos:753695275}" Jul 15 23:13:48.792616 kubelet[2668]: I0715 23:13:48.791913 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-hddsp" podStartSLOduration=26.493534236 podStartE2EDuration="42.790831807s" podCreationTimestamp="2025-07-15 23:13:06 +0000 UTC" firstStartedPulling="2025-07-15 23:13:31.415230405 +0000 UTC m=+48.449770315" lastFinishedPulling="2025-07-15 23:13:47.712527936 +0000 UTC m=+64.747067886" observedRunningTime="2025-07-15 23:13:48.509665883 +0000 UTC m=+65.544205793" watchObservedRunningTime="2025-07-15 23:13:48.790831807 +0000 UTC m=+65.825371757" Jul 15 23:13:54.233861 kubelet[2668]: I0715 23:13:54.233220 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:13:54.365445 kubelet[2668]: I0715 23:13:54.365409 2668 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jul 15 23:13:54.436526 kubelet[2668]: I0715 23:13:54.436484 2668 prober_manager.go:312] "Failed to trigger a 
manual run" probe="Readiness" Jul 15 23:13:54.438518 containerd[1512]: time="2025-07-15T23:13:54.438470351Z" level=info msg="StopContainer for \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" with timeout 30 (s)" Jul 15 23:13:54.440136 containerd[1512]: time="2025-07-15T23:13:54.440090938Z" level=info msg="Stop container \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" with signal terminated" Jul 15 23:13:54.507903 systemd[1]: Created slice kubepods-besteffort-podeb02fdd1_380d_41c9_9ac4_05ee9a338e63.slice - libcontainer container kubepods-besteffort-podeb02fdd1_380d_41c9_9ac4_05ee9a338e63.slice. Jul 15 23:13:54.540040 systemd[1]: cri-containerd-c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208.scope: Deactivated successfully. Jul 15 23:13:54.540587 systemd[1]: cri-containerd-c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208.scope: Consumed 1.137s CPU time, 43.8M memory peak, 4K read from disk. Jul 15 23:13:54.547362 containerd[1512]: time="2025-07-15T23:13:54.547319455Z" level=info msg="received exit event container_id:\"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" id:\"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" pid:4848 exit_status:1 exited_at:{seconds:1752621234 nanos:545908951}" Jul 15 23:13:54.548929 containerd[1512]: time="2025-07-15T23:13:54.547637580Z" level=info msg="TaskExit event in podsandbox handler container_id:\"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" id:\"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" pid:4848 exit_status:1 exited_at:{seconds:1752621234 nanos:545908951}" Jul 15 23:13:54.582507 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208-rootfs.mount: Deactivated successfully. 
Jul 15 23:13:54.598219 kubelet[2668]: I0715 23:13:54.598078 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/eb02fdd1-380d-41c9-9ac4-05ee9a338e63-calico-apiserver-certs\") pod \"calico-apiserver-797fd56c96-j447m\" (UID: \"eb02fdd1-380d-41c9-9ac4-05ee9a338e63\") " pod="calico-apiserver/calico-apiserver-797fd56c96-j447m" Jul 15 23:13:54.598785 kubelet[2668]: I0715 23:13:54.598694 2668 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdk8x\" (UniqueName: \"kubernetes.io/projected/eb02fdd1-380d-41c9-9ac4-05ee9a338e63-kube-api-access-mdk8x\") pod \"calico-apiserver-797fd56c96-j447m\" (UID: \"eb02fdd1-380d-41c9-9ac4-05ee9a338e63\") " pod="calico-apiserver/calico-apiserver-797fd56c96-j447m" Jul 15 23:13:54.709385 containerd[1512]: time="2025-07-15T23:13:54.709297309Z" level=info msg="StopContainer for \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" returns successfully" Jul 15 23:13:54.725925 containerd[1512]: time="2025-07-15T23:13:54.722024967Z" level=info msg="StopPodSandbox for \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\"" Jul 15 23:13:54.725925 containerd[1512]: time="2025-07-15T23:13:54.722141049Z" level=info msg="Container to stop \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 15 23:13:54.751234 systemd[1]: cri-containerd-4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d.scope: Deactivated successfully. 
Jul 15 23:13:54.754083 containerd[1512]: time="2025-07-15T23:13:54.754044835Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" id:\"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" pid:4226 exit_status:137 exited_at:{seconds:1752621234 nanos:753429945}" Jul 15 23:13:54.794219 containerd[1512]: time="2025-07-15T23:13:54.794103281Z" level=info msg="received exit event sandbox_id:\"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" exit_status:137 exited_at:{seconds:1752621234 nanos:753429945}" Jul 15 23:13:54.796782 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d-rootfs.mount: Deactivated successfully. Jul 15 23:13:54.800576 containerd[1512]: time="2025-07-15T23:13:54.800496191Z" level=info msg="shim disconnected" id=4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d namespace=k8s.io Jul 15 23:13:54.800576 containerd[1512]: time="2025-07-15T23:13:54.800546472Z" level=warning msg="cleaning up after shim disconnected" id=4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d namespace=k8s.io Jul 15 23:13:54.800576 containerd[1512]: time="2025-07-15T23:13:54.800580072Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 15 23:13:54.803926 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d-shm.mount: Deactivated successfully. 
Jul 15 23:13:54.817521 containerd[1512]: time="2025-07-15T23:13:54.817473682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797fd56c96-j447m,Uid:eb02fdd1-380d-41c9-9ac4-05ee9a338e63,Namespace:calico-apiserver,Attempt:0,}" Jul 15 23:13:54.904025 systemd-networkd[1424]: cali806be1891fc: Link DOWN Jul 15 23:13:54.904880 systemd-networkd[1424]: cali806be1891fc: Lost carrier Jul 15 23:13:55.059052 systemd-networkd[1424]: cali938f4026974: Link UP Jul 15 23:13:55.059187 systemd-networkd[1424]: cali938f4026974: Gained carrier Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:54.890 [INFO][5376] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0 calico-apiserver-797fd56c96- calico-apiserver eb02fdd1-380d-41c9-9ac4-05ee9a338e63 1155 0 2025-07-15 23:13:54 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:797fd56c96 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4372-0-1-n-91aeaf5bee calico-apiserver-797fd56c96-j447m eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali938f4026974 [] [] }} ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-j447m" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:54.891 [INFO][5376] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-j447m" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" Jul 15 
23:13:55.086392 containerd[1512]: 2025-07-15 23:13:54.957 [INFO][5390] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" HandleID="k8s-pod-network.6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:54.957 [INFO][5390] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" HandleID="k8s-pod-network.6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3610), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4372-0-1-n-91aeaf5bee", "pod":"calico-apiserver-797fd56c96-j447m", "timestamp":"2025-07-15 23:13:54.957285516 +0000 UTC"}, Hostname:"ci-4372-0-1-n-91aeaf5bee", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:54.957 [INFO][5390] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:54.957 [INFO][5390] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:54.957 [INFO][5390] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4372-0-1-n-91aeaf5bee' Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:54.979 [INFO][5390] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:54.992 [INFO][5390] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.006 [INFO][5390] ipam/ipam.go 511: Trying affinity for 192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.008 [INFO][5390] ipam/ipam.go 158: Attempting to load block cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.013 [INFO][5390] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.56.64/26 host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.014 [INFO][5390] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.56.64/26 handle="k8s-pod-network.6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.023 [INFO][5390] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3 Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.029 [INFO][5390] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.56.64/26 handle="k8s-pod-network.6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.050 [INFO][5390] ipam/ipam.go 1256: 
Successfully claimed IPs: [192.168.56.74/26] block=192.168.56.64/26 handle="k8s-pod-network.6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.050 [INFO][5390] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.56.74/26] handle="k8s-pod-network.6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" host="ci-4372-0-1-n-91aeaf5bee" Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.050 [INFO][5390] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:55.086392 containerd[1512]: 2025-07-15 23:13:55.050 [INFO][5390] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.56.74/26] IPv6=[] ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" HandleID="k8s-pod-network.6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" Jul 15 23:13:55.090028 containerd[1512]: 2025-07-15 23:13:55.054 [INFO][5376] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-j447m" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0", GenerateName:"calico-apiserver-797fd56c96-", Namespace:"calico-apiserver", SelfLink:"", UID:"eb02fdd1-380d-41c9-9ac4-05ee9a338e63", ResourceVersion:"1155", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"797fd56c96", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"", Pod:"calico-apiserver-797fd56c96-j447m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali938f4026974", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:55.090028 containerd[1512]: 2025-07-15 23:13:55.055 [INFO][5376] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.56.74/32] ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-j447m" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" Jul 15 23:13:55.090028 containerd[1512]: 2025-07-15 23:13:55.055 [INFO][5376] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali938f4026974 ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-j447m" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" Jul 15 23:13:55.090028 containerd[1512]: 2025-07-15 23:13:55.058 [INFO][5376] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Namespace="calico-apiserver" 
Pod="calico-apiserver-797fd56c96-j447m" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" Jul 15 23:13:55.090028 containerd[1512]: 2025-07-15 23:13:55.061 [INFO][5376] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-j447m" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0", GenerateName:"calico-apiserver-797fd56c96-", Namespace:"calico-apiserver", SelfLink:"", UID:"eb02fdd1-380d-41c9-9ac4-05ee9a338e63", ResourceVersion:"1155", Generation:0, CreationTimestamp:time.Date(2025, time.July, 15, 23, 13, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"797fd56c96", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4372-0-1-n-91aeaf5bee", ContainerID:"6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3", Pod:"calico-apiserver-797fd56c96-j447m", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.56.74/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali938f4026974", MAC:"6e:2e:fd:90:dc:28", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jul 15 23:13:55.090028 containerd[1512]: 2025-07-15 23:13:55.078 [INFO][5376] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" Namespace="calico-apiserver" Pod="calico-apiserver-797fd56c96-j447m" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--797fd56c96--j447m-eth0" Jul 15 23:13:55.128881 containerd[1512]: time="2025-07-15T23:13:55.127522655Z" level=info msg="connecting to shim 6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3" address="unix:///run/containerd/s/70ab70023e07f07e2556f816e0c870a36fcfa42b9c1453212e1ae4f9bbd37da3" namespace=k8s.io protocol=ttrpc version=3 Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:54.899 [INFO][5371] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:54.899 [INFO][5371] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" iface="eth0" netns="/var/run/netns/cni-9433207a-4a95-b3ac-a282-ead68091eb83" Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:54.901 [INFO][5371] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" iface="eth0" netns="/var/run/netns/cni-9433207a-4a95-b3ac-a282-ead68091eb83" Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:54.917 [INFO][5371] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" after=17.375378ms iface="eth0" netns="/var/run/netns/cni-9433207a-4a95-b3ac-a282-ead68091eb83" Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:54.918 [INFO][5371] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:54.918 [INFO][5371] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:54.964 [INFO][5397] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:54.964 [INFO][5397] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:55.050 [INFO][5397] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:55.147 [INFO][5397] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:55.148 [INFO][5397] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:55.155 [INFO][5397] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:55.162461 containerd[1512]: 2025-07-15 23:13:55.158 [INFO][5371] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Jul 15 23:13:55.164508 containerd[1512]: time="2025-07-15T23:13:55.164470295Z" level=info msg="TearDown network for sandbox \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" successfully" Jul 15 23:13:55.164717 containerd[1512]: time="2025-07-15T23:13:55.164697859Z" level=info msg="StopPodSandbox for \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" returns successfully" Jul 15 23:13:55.207043 systemd[1]: Started cri-containerd-6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3.scope - libcontainer container 6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3. 
Jul 15 23:13:55.295137 containerd[1512]: time="2025-07-15T23:13:55.295068956Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-797fd56c96-j447m,Uid:eb02fdd1-380d-41c9-9ac4-05ee9a338e63,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3\"" Jul 15 23:13:55.300040 containerd[1512]: time="2025-07-15T23:13:55.299939040Z" level=info msg="CreateContainer within sandbox \"6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Jul 15 23:13:55.307313 kubelet[2668]: I0715 23:13:55.307273 2668 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-knfns\" (UniqueName: \"kubernetes.io/projected/1a4cf393-394d-46d2-b039-ad46900b55f7-kube-api-access-knfns\") pod \"1a4cf393-394d-46d2-b039-ad46900b55f7\" (UID: \"1a4cf393-394d-46d2-b039-ad46900b55f7\") " Jul 15 23:13:55.308687 kubelet[2668]: I0715 23:13:55.308327 2668 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1a4cf393-394d-46d2-b039-ad46900b55f7-calico-apiserver-certs\") pod \"1a4cf393-394d-46d2-b039-ad46900b55f7\" (UID: \"1a4cf393-394d-46d2-b039-ad46900b55f7\") " Jul 15 23:13:55.313273 containerd[1512]: time="2025-07-15T23:13:55.311023032Z" level=info msg="Container b769666e7efe3621fedb249dd6f8d0da00f8788cdab139369f12fc5ef48da9fb: CDI devices from CRI Config.CDIDevices: []" Jul 15 23:13:55.317053 kubelet[2668]: I0715 23:13:55.317012 2668 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a4cf393-394d-46d2-b039-ad46900b55f7-kube-api-access-knfns" (OuterVolumeSpecName: "kube-api-access-knfns") pod "1a4cf393-394d-46d2-b039-ad46900b55f7" (UID: "1a4cf393-394d-46d2-b039-ad46900b55f7"). InnerVolumeSpecName "kube-api-access-knfns". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 23:13:55.318102 kubelet[2668]: I0715 23:13:55.318072 2668 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a4cf393-394d-46d2-b039-ad46900b55f7-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "1a4cf393-394d-46d2-b039-ad46900b55f7" (UID: "1a4cf393-394d-46d2-b039-ad46900b55f7"). InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 23:13:55.325473 containerd[1512]: time="2025-07-15T23:13:55.325386001Z" level=info msg="CreateContainer within sandbox \"6a8c21cbadff261233a7cb687aca34ca7ea81fb31dd56b08580428155b612db3\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"b769666e7efe3621fedb249dd6f8d0da00f8788cdab139369f12fc5ef48da9fb\"" Jul 15 23:13:55.327529 containerd[1512]: time="2025-07-15T23:13:55.327485557Z" level=info msg="StartContainer for \"b769666e7efe3621fedb249dd6f8d0da00f8788cdab139369f12fc5ef48da9fb\"" Jul 15 23:13:55.330629 containerd[1512]: time="2025-07-15T23:13:55.330521329Z" level=info msg="connecting to shim b769666e7efe3621fedb249dd6f8d0da00f8788cdab139369f12fc5ef48da9fb" address="unix:///run/containerd/s/70ab70023e07f07e2556f816e0c870a36fcfa42b9c1453212e1ae4f9bbd37da3" protocol=ttrpc version=3 Jul 15 23:13:55.354024 systemd[1]: Started cri-containerd-b769666e7efe3621fedb249dd6f8d0da00f8788cdab139369f12fc5ef48da9fb.scope - libcontainer container b769666e7efe3621fedb249dd6f8d0da00f8788cdab139369f12fc5ef48da9fb. 
Jul 15 23:13:55.399418 containerd[1512]: time="2025-07-15T23:13:55.399385882Z" level=info msg="StartContainer for \"b769666e7efe3621fedb249dd6f8d0da00f8788cdab139369f12fc5ef48da9fb\" returns successfully" Jul 15 23:13:55.409095 kubelet[2668]: I0715 23:13:55.409038 2668 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-knfns\" (UniqueName: \"kubernetes.io/projected/1a4cf393-394d-46d2-b039-ad46900b55f7-kube-api-access-knfns\") on node \"ci-4372-0-1-n-91aeaf5bee\" DevicePath \"\"" Jul 15 23:13:55.409095 kubelet[2668]: I0715 23:13:55.409071 2668 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/1a4cf393-394d-46d2-b039-ad46900b55f7-calico-apiserver-certs\") on node \"ci-4372-0-1-n-91aeaf5bee\" DevicePath \"\"" Jul 15 23:13:55.529977 kubelet[2668]: I0715 23:13:55.529001 2668 scope.go:117] "RemoveContainer" containerID="c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208" Jul 15 23:13:55.534081 containerd[1512]: time="2025-07-15T23:13:55.533983812Z" level=info msg="RemoveContainer for \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\"" Jul 15 23:13:55.545434 containerd[1512]: time="2025-07-15T23:13:55.544902481Z" level=info msg="RemoveContainer for \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" returns successfully" Jul 15 23:13:55.545629 kubelet[2668]: I0715 23:13:55.545172 2668 scope.go:117] "RemoveContainer" containerID="c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208" Jul 15 23:13:55.546037 containerd[1512]: time="2025-07-15T23:13:55.545977860Z" level=error msg="ContainerStatus for \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\" failed" error="rpc error: code = NotFound desc = an error occurred when try to find container \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\": not found" Jul 15 23:13:55.546817 systemd[1]: Removed slice 
kubepods-besteffort-pod1a4cf393_394d_46d2_b039_ad46900b55f7.slice - libcontainer container kubepods-besteffort-pod1a4cf393_394d_46d2_b039_ad46900b55f7.slice. Jul 15 23:13:55.546936 systemd[1]: kubepods-besteffort-pod1a4cf393_394d_46d2_b039_ad46900b55f7.slice: Consumed 1.158s CPU time, 44.1M memory peak, 4K read from disk. Jul 15 23:13:55.549206 kubelet[2668]: E0715 23:13:55.549075 2668 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = an error occurred when try to find container \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\": not found" containerID="c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208" Jul 15 23:13:55.550011 kubelet[2668]: I0715 23:13:55.549901 2668 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"containerd","ID":"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208"} err="failed to get container status \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\": rpc error: code = NotFound desc = an error occurred when try to find container \"c55c15cef4f789e00ed248f80b2b9b8a1495279aa8598d9736a20e13a33c7208\": not found" Jul 15 23:13:55.593197 systemd[1]: run-netns-cni\x2d9433207a\x2d4a95\x2db3ac\x2da282\x2dead68091eb83.mount: Deactivated successfully. Jul 15 23:13:55.593307 systemd[1]: var-lib-kubelet-pods-1a4cf393\x2d394d\x2d46d2\x2db039\x2dad46900b55f7-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dknfns.mount: Deactivated successfully. Jul 15 23:13:55.593372 systemd[1]: var-lib-kubelet-pods-1a4cf393\x2d394d\x2d46d2\x2db039\x2dad46900b55f7-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
Jul 15 23:13:55.614391 kubelet[2668]: I0715 23:13:55.614335 2668 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-797fd56c96-j447m" podStartSLOduration=1.614308643 podStartE2EDuration="1.614308643s" podCreationTimestamp="2025-07-15 23:13:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-07-15 23:13:55.574351911 +0000 UTC m=+72.608891821" watchObservedRunningTime="2025-07-15 23:13:55.614308643 +0000 UTC m=+72.648848513" Jul 15 23:13:57.037145 systemd-networkd[1424]: cali938f4026974: Gained IPv6LL Jul 15 23:13:57.091616 kubelet[2668]: I0715 23:13:57.091515 2668 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a4cf393-394d-46d2-b039-ad46900b55f7" path="/var/lib/kubelet/pods/1a4cf393-394d-46d2-b039-ad46900b55f7/volumes" Jul 15 23:13:58.313637 containerd[1512]: time="2025-07-15T23:13:58.313596432Z" level=info msg="StopContainer for \"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" with timeout 30 (s)" Jul 15 23:13:58.315120 containerd[1512]: time="2025-07-15T23:13:58.315077818Z" level=info msg="Stop container \"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" with signal terminated" Jul 15 23:13:58.346860 systemd[1]: cri-containerd-a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9.scope: Deactivated successfully. Jul 15 23:13:58.347910 systemd[1]: cri-containerd-a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9.scope: Consumed 1.423s CPU time, 49.2M memory peak. 
Jul 15 23:13:58.353207 containerd[1512]: time="2025-07-15T23:13:58.353168338Z" level=info msg="received exit event container_id:\"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" id:\"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" pid:5101 exit_status:1 exited_at:{seconds:1752621238 nanos:352403404}" Jul 15 23:13:58.353434 containerd[1512]: time="2025-07-15T23:13:58.353372101Z" level=info msg="TaskExit event in podsandbox handler container_id:\"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" id:\"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" pid:5101 exit_status:1 exited_at:{seconds:1752621238 nanos:352403404}" Jul 15 23:13:58.394298 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9-rootfs.mount: Deactivated successfully. Jul 15 23:13:58.417993 containerd[1512]: time="2025-07-15T23:13:58.417943453Z" level=info msg="StopContainer for \"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" returns successfully" Jul 15 23:13:58.419039 containerd[1512]: time="2025-07-15T23:13:58.418965311Z" level=info msg="StopPodSandbox for \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\"" Jul 15 23:13:58.419127 containerd[1512]: time="2025-07-15T23:13:58.419097554Z" level=info msg="Container to stop \"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" must be in running or unknown state, current state \"CONTAINER_EXITED\"" Jul 15 23:13:58.432019 systemd[1]: cri-containerd-5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635.scope: Deactivated successfully. 
Jul 15 23:13:58.437077 containerd[1512]: time="2025-07-15T23:13:58.436789069Z" level=info msg="TaskExit event in podsandbox handler container_id:\"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" id:\"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" pid:4549 exit_status:137 exited_at:{seconds:1752621238 nanos:435902173}" Jul 15 23:13:58.477703 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635-rootfs.mount: Deactivated successfully. Jul 15 23:13:58.478758 containerd[1512]: time="2025-07-15T23:13:58.478617575Z" level=info msg="shim disconnected" id=5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635 namespace=k8s.io Jul 15 23:13:58.478758 containerd[1512]: time="2025-07-15T23:13:58.478672496Z" level=warning msg="cleaning up after shim disconnected" id=5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635 namespace=k8s.io Jul 15 23:13:58.478758 containerd[1512]: time="2025-07-15T23:13:58.478704897Z" level=info msg="cleaning up dead shim" namespace=k8s.io Jul 15 23:13:58.500455 containerd[1512]: time="2025-07-15T23:13:58.500357203Z" level=info msg="received exit event sandbox_id:\"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" exit_status:137 exited_at:{seconds:1752621238 nanos:435902173}" Jul 15 23:13:58.506386 systemd[1]: run-containerd-io.containerd.grpc.v1.cri-sandboxes-5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635-shm.mount: Deactivated successfully. 
Jul 15 23:13:58.556698 kubelet[2668]: I0715 23:13:58.556526 2668 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:13:58.573669 systemd-networkd[1424]: cali74b548697b0: Link DOWN Jul 15 23:13:58.573675 systemd-networkd[1424]: cali74b548697b0: Lost carrier Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.571 [INFO][5578] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.571 [INFO][5578] cni-plugin/dataplane_linux.go 559: Deleting workload's device in netns. ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" iface="eth0" netns="/var/run/netns/cni-7a9e287e-411b-5b2b-f76f-2655d823a0ef" Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.572 [INFO][5578] cni-plugin/dataplane_linux.go 570: Entered netns, deleting veth. ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" iface="eth0" netns="/var/run/netns/cni-7a9e287e-411b-5b2b-f76f-2655d823a0ef" Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.581 [INFO][5578] cni-plugin/dataplane_linux.go 604: Deleted device in netns. 
ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" after=8.97504ms iface="eth0" netns="/var/run/netns/cni-7a9e287e-411b-5b2b-f76f-2655d823a0ef" Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.581 [INFO][5578] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.581 [INFO][5578] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.629 [INFO][5586] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.629 [INFO][5586] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.629 [INFO][5586] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.708 [INFO][5586] ipam/ipam_plugin.go 431: Released address using handleID ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.708 [INFO][5586] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.711 [INFO][5586] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:13:58.716384 containerd[1512]: 2025-07-15 23:13:58.714 [INFO][5578] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:13:58.719725 systemd[1]: run-netns-cni\x2d7a9e287e\x2d411b\x2d5b2b\x2df76f\x2d2655d823a0ef.mount: Deactivated successfully. 
Jul 15 23:13:58.720761 containerd[1512]: time="2025-07-15T23:13:58.720699573Z" level=info msg="TearDown network for sandbox \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" successfully" Jul 15 23:13:58.720761 containerd[1512]: time="2025-07-15T23:13:58.720735254Z" level=info msg="StopPodSandbox for \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" returns successfully" Jul 15 23:13:58.828711 kubelet[2668]: I0715 23:13:58.828583 2668 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kkq7\" (UniqueName: \"kubernetes.io/projected/29bdaeaf-6929-486a-a35b-f522167f96fa-kube-api-access-6kkq7\") pod \"29bdaeaf-6929-486a-a35b-f522167f96fa\" (UID: \"29bdaeaf-6929-486a-a35b-f522167f96fa\") " Jul 15 23:13:58.828952 kubelet[2668]: I0715 23:13:58.828636 2668 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/29bdaeaf-6929-486a-a35b-f522167f96fa-calico-apiserver-certs\") pod \"29bdaeaf-6929-486a-a35b-f522167f96fa\" (UID: \"29bdaeaf-6929-486a-a35b-f522167f96fa\") " Jul 15 23:13:58.836900 kubelet[2668]: I0715 23:13:58.836815 2668 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29bdaeaf-6929-486a-a35b-f522167f96fa-kube-api-access-6kkq7" (OuterVolumeSpecName: "kube-api-access-6kkq7") pod "29bdaeaf-6929-486a-a35b-f522167f96fa" (UID: "29bdaeaf-6929-486a-a35b-f522167f96fa"). InnerVolumeSpecName "kube-api-access-6kkq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jul 15 23:13:58.837466 kubelet[2668]: I0715 23:13:58.837426 2668 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29bdaeaf-6929-486a-a35b-f522167f96fa-calico-apiserver-certs" (OuterVolumeSpecName: "calico-apiserver-certs") pod "29bdaeaf-6929-486a-a35b-f522167f96fa" (UID: "29bdaeaf-6929-486a-a35b-f522167f96fa"). 
InnerVolumeSpecName "calico-apiserver-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jul 15 23:13:58.837837 systemd[1]: var-lib-kubelet-pods-29bdaeaf\x2d6929\x2d486a\x2da35b\x2df522167f96fa-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d6kkq7.mount: Deactivated successfully. Jul 15 23:13:58.930236 kubelet[2668]: I0715 23:13:58.930152 2668 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kkq7\" (UniqueName: \"kubernetes.io/projected/29bdaeaf-6929-486a-a35b-f522167f96fa-kube-api-access-6kkq7\") on node \"ci-4372-0-1-n-91aeaf5bee\" DevicePath \"\"" Jul 15 23:13:58.930791 kubelet[2668]: I0715 23:13:58.930748 2668 reconciler_common.go:293] "Volume detached for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/29bdaeaf-6929-486a-a35b-f522167f96fa-calico-apiserver-certs\") on node \"ci-4372-0-1-n-91aeaf5bee\" DevicePath \"\"" Jul 15 23:13:59.102757 systemd[1]: Removed slice kubepods-besteffort-pod29bdaeaf_6929_486a_a35b_f522167f96fa.slice - libcontainer container kubepods-besteffort-pod29bdaeaf_6929_486a_a35b_f522167f96fa.slice. Jul 15 23:13:59.102910 systemd[1]: kubepods-besteffort-pod29bdaeaf_6929_486a_a35b_f522167f96fa.slice: Consumed 1.443s CPU time, 49.4M memory peak. Jul 15 23:13:59.393472 systemd[1]: var-lib-kubelet-pods-29bdaeaf\x2d6929\x2d486a\x2da35b\x2df522167f96fa-volumes-kubernetes.io\x7esecret-calico\x2dapiserver\x2dcerts.mount: Deactivated successfully. 
Jul 15 23:14:01.093184 kubelet[2668]: I0715 23:14:01.093130 2668 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29bdaeaf-6929-486a-a35b-f522167f96fa" path="/var/lib/kubelet/pods/29bdaeaf-6929-486a-a35b-f522167f96fa/volumes" Jul 15 23:14:16.105349 containerd[1512]: time="2025-07-15T23:14:16.105295751Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\" id:\"503b43f8c1197497fd239c2cdf33642d5b37efd0abe464b70be79c56fe803fd3\" pid:5647 exited_at:{seconds:1752621256 nanos:102800061}" Jul 15 23:14:16.120930 containerd[1512]: time="2025-07-15T23:14:16.120189771Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"c4c65b9a29ebe7af82f92dfa1a8a7d6ae0879bacb9da7e3ce699b9590f6aa8a3\" pid:5636 exited_at:{seconds:1752621256 nanos:119603119}" Jul 15 23:14:16.296274 update_engine[1489]: I20250715 23:14:16.295604 1489 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Jul 15 23:14:16.296274 update_engine[1489]: I20250715 23:14:16.295651 1489 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Jul 15 23:14:16.296274 update_engine[1489]: I20250715 23:14:16.295945 1489 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Jul 15 23:14:16.296887 update_engine[1489]: I20250715 23:14:16.296860 1489 omaha_request_params.cc:62] Current group set to alpha Jul 15 23:14:16.298865 update_engine[1489]: I20250715 23:14:16.298193 1489 update_attempter.cc:499] Already updated boot flags. Skipping. Jul 15 23:14:16.298865 update_engine[1489]: I20250715 23:14:16.298691 1489 update_attempter.cc:643] Scheduling an action processor start. 
Jul 15 23:14:16.298865 update_engine[1489]: I20250715 23:14:16.298722 1489 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Jul 15 23:14:16.302112 update_engine[1489]: I20250715 23:14:16.301810 1489 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Jul 15 23:14:16.303906 update_engine[1489]: I20250715 23:14:16.302619 1489 omaha_request_action.cc:271] Posting an Omaha request to disabled Jul 15 23:14:16.303906 update_engine[1489]: I20250715 23:14:16.302649 1489 omaha_request_action.cc:272] Request: Jul 15 23:14:16.303906 update_engine[1489]: Jul 15 23:14:16.303906 update_engine[1489]: Jul 15 23:14:16.303906 update_engine[1489]: Jul 15 23:14:16.303906 update_engine[1489]: Jul 15 23:14:16.303906 update_engine[1489]: Jul 15 23:14:16.303906 update_engine[1489]: Jul 15 23:14:16.303906 update_engine[1489]: Jul 15 23:14:16.303906 update_engine[1489]: Jul 15 23:14:16.303906 update_engine[1489]: I20250715 23:14:16.302659 1489 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 15 23:14:16.308880 update_engine[1489]: I20250715 23:14:16.308322 1489 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 15 23:14:16.308981 update_engine[1489]: I20250715 23:14:16.308815 1489 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jul 15 23:14:16.309657 update_engine[1489]: E20250715 23:14:16.309624 1489 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 15 23:14:16.309800 update_engine[1489]: I20250715 23:14:16.309782 1489 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Jul 15 23:14:16.314254 locksmithd[1533]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Jul 15 23:14:18.767884 containerd[1512]: time="2025-07-15T23:14:18.766872481Z" level=info msg="TaskExit event in podsandbox handler container_id:\"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\" id:\"218a134e513c6a6f06ed4745d8580870bf59b7140793ef91161db219fa7664d4\" pid:5677 exited_at:{seconds:1752621258 nanos:766582595}" Jul 15 23:14:26.278883 update_engine[1489]: I20250715 23:14:26.278681 1489 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 15 23:14:26.279306 update_engine[1489]: I20250715 23:14:26.279011 1489 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 15 23:14:26.279336 update_engine[1489]: I20250715 23:14:26.279306 1489 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Jul 15 23:14:26.279860 update_engine[1489]: E20250715 23:14:26.279791 1489 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 15 23:14:26.279941 update_engine[1489]: I20250715 23:14:26.279922 1489 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Jul 15 23:14:31.168043 containerd[1512]: time="2025-07-15T23:14:31.167961590Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"c8e2fc06d62ad6da0cf09886fd01072f0c74b56bab748be3e511430873c9c597\" pid:5704 exited_at:{seconds:1752621271 nanos:167581622}" Jul 15 23:14:36.287050 update_engine[1489]: I20250715 23:14:36.286891 1489 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Jul 15 23:14:36.287419 update_engine[1489]: I20250715 23:14:36.287200 1489 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Jul 15 23:14:36.287742 update_engine[1489]: I20250715 23:14:36.287518 1489 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Jul 15 23:14:36.288128 update_engine[1489]: E20250715 23:14:36.288079 1489 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Jul 15 23:14:36.288207 update_engine[1489]: I20250715 23:14:36.288163 1489 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Jul 15 23:14:41.109732 systemd[1]: Started sshd@7-91.99.216.80:22-139.178.68.195:56538.service - OpenSSH per-connection server daemon (139.178.68.195:56538). Jul 15 23:14:42.124765 sshd[5720]: Accepted publickey for core from 139.178.68.195 port 56538 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk Jul 15 23:14:42.127790 sshd-session[5720]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jul 15 23:14:42.134893 systemd-logind[1488]: New session 8 of user core. Jul 15 23:14:42.141203 systemd[1]: Started session-8.scope - Session 8 of User core. 
Jul 15 23:14:42.906879 sshd[5722]: Connection closed by 139.178.68.195 port 56538 Jul 15 23:14:42.905769 sshd-session[5720]: pam_unix(sshd:session): session closed for user core Jul 15 23:14:42.912222 systemd[1]: sshd@7-91.99.216.80:22-139.178.68.195:56538.service: Deactivated successfully. Jul 15 23:14:42.915308 systemd[1]: session-8.scope: Deactivated successfully. Jul 15 23:14:42.918063 systemd-logind[1488]: Session 8 logged out. Waiting for processes to exit. Jul 15 23:14:42.919823 systemd-logind[1488]: Removed session 8. Jul 15 23:14:43.129816 kubelet[2668]: I0715 23:14:43.129744 2668 scope.go:117] "RemoveContainer" containerID="a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9" Jul 15 23:14:43.133143 containerd[1512]: time="2025-07-15T23:14:43.133091812Z" level=info msg="RemoveContainer for \"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\"" Jul 15 23:14:43.138288 containerd[1512]: time="2025-07-15T23:14:43.138241564Z" level=info msg="RemoveContainer for \"a82d1c4bcd44115d8fa4f4e8fd78bc69b97e80de5cf8fbd66ebdc9d21669f3f9\" returns successfully" Jul 15 23:14:43.140890 containerd[1512]: time="2025-07-15T23:14:43.140307809Z" level=info msg="StopPodSandbox for \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\"" Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.190 [WARNING][5743] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.190 [INFO][5743] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.190 [INFO][5743] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, 
ignoring. ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" iface="eth0" netns="" Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.190 [INFO][5743] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.190 [INFO][5743] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.239 [INFO][5750] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.239 [INFO][5750] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.239 [INFO][5750] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.251 [WARNING][5750] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.251 [INFO][5750] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.254 [INFO][5750] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:14:43.261177 containerd[1512]: 2025-07-15 23:14:43.257 [INFO][5743] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:14:43.261815 containerd[1512]: time="2025-07-15T23:14:43.261144206Z" level=info msg="TearDown network for sandbox \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" successfully" Jul 15 23:14:43.261815 containerd[1512]: time="2025-07-15T23:14:43.261196687Z" level=info msg="StopPodSandbox for \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" returns successfully" Jul 15 23:14:43.262816 containerd[1512]: time="2025-07-15T23:14:43.262730080Z" level=info msg="RemovePodSandbox for \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\"" Jul 15 23:14:43.263326 containerd[1512]: time="2025-07-15T23:14:43.263274092Z" level=info msg="Forcibly stopping sandbox \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\"" Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.308 [WARNING][5764] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up 
ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.309 [INFO][5764] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.309 [INFO][5764] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" iface="eth0" netns="" Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.309 [INFO][5764] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.309 [INFO][5764] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.334 [INFO][5772] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.335 [INFO][5772] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.335 [INFO][5772] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.348 [WARNING][5772] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.348 [INFO][5772] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" HandleID="k8s-pod-network.5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--tb5xs-eth0" Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.350 [INFO][5772] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Jul 15 23:14:43.355726 containerd[1512]: 2025-07-15 23:14:43.352 [INFO][5764] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635" Jul 15 23:14:43.355726 containerd[1512]: time="2025-07-15T23:14:43.355587546Z" level=info msg="TearDown network for sandbox \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" successfully" Jul 15 23:14:43.358950 containerd[1512]: time="2025-07-15T23:14:43.358829457Z" level=info msg="Ensure that sandbox 5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635 in task-service has been cleanup successfully" Jul 15 23:14:43.362793 containerd[1512]: time="2025-07-15T23:14:43.362593419Z" level=info msg="RemovePodSandbox \"5889586088c88cb38246ea8cf3713448386d0a5243f4144858b24cc5c28c7635\" returns successfully" Jul 15 23:14:43.364161 containerd[1512]: time="2025-07-15T23:14:43.363653202Z" level=info msg="StopPodSandbox for \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\"" Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.409 [WARNING][5786] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with 
the clean up ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.409 [INFO][5786] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.409 [INFO][5786] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" iface="eth0" netns="" Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.409 [INFO][5786] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.409 [INFO][5786] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.434 [INFO][5793] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0" Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.434 [INFO][5793] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.434 [INFO][5793] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.449 [WARNING][5793] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. 
Ignoring ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0"
Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.449 [INFO][5793] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0"
Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.452 [INFO][5793] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jul 15 23:14:43.457891 containerd[1512]: 2025-07-15 23:14:43.456 [INFO][5786] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d"
Jul 15 23:14:43.458417 containerd[1512]: time="2025-07-15T23:14:43.457960940Z" level=info msg="TearDown network for sandbox \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" successfully"
Jul 15 23:14:43.458417 containerd[1512]: time="2025-07-15T23:14:43.457994700Z" level=info msg="StopPodSandbox for \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" returns successfully"
Jul 15 23:14:43.459374 containerd[1512]: time="2025-07-15T23:14:43.458923681Z" level=info msg="RemovePodSandbox for \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\""
Jul 15 23:14:43.459374 containerd[1512]: time="2025-07-15T23:14:43.458980042Z" level=info msg="Forcibly stopping sandbox \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\""
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.543 [WARNING][5807] cni-plugin/k8s.go 598: WorkloadEndpoint does not exist in the datastore, moving forward with the clean up ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" WorkloadEndpoint="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0"
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.543 [INFO][5807] cni-plugin/k8s.go 640: Cleaning up netns ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d"
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.543 [INFO][5807] cni-plugin/dataplane_linux.go 555: CleanUpNamespace called with no netns name, ignoring. ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" iface="eth0" netns=""
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.543 [INFO][5807] cni-plugin/k8s.go 647: Releasing IP address(es) ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d"
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.543 [INFO][5807] cni-plugin/utils.go 188: Calico CNI releasing IP address ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d"
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.586 [INFO][5815] ipam/ipam_plugin.go 412: Releasing address using handleID ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0"
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.586 [INFO][5815] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.586 [INFO][5815] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.602 [WARNING][5815] ipam/ipam_plugin.go 429: Asked to release address but it doesn't exist. Ignoring ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0"
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.602 [INFO][5815] ipam/ipam_plugin.go 440: Releasing address using workloadID ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" HandleID="k8s-pod-network.4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d" Workload="ci--4372--0--1--n--91aeaf5bee-k8s-calico--apiserver--f5986dd7d--qm2lh-eth0"
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.605 [INFO][5815] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
Jul 15 23:14:43.609812 containerd[1512]: 2025-07-15 23:14:43.607 [INFO][5807] cni-plugin/k8s.go 653: Teardown processing complete. ContainerID="4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d"
Jul 15 23:14:43.611952 containerd[1512]: time="2025-07-15T23:14:43.610675191Z" level=info msg="TearDown network for sandbox \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" successfully"
Jul 15 23:14:43.613089 containerd[1512]: time="2025-07-15T23:14:43.613015443Z" level=info msg="Ensure that sandbox 4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d in task-service has been cleanup successfully"
Jul 15 23:14:43.617467 containerd[1512]: time="2025-07-15T23:14:43.617407178Z" level=info msg="RemovePodSandbox \"4a688397883f0a159d31c3f8c431f20c90a8f316b0ef5d31b7d68cea9b644b4d\" returns successfully"
Jul 15 23:14:46.125256 containerd[1512]: time="2025-07-15T23:14:46.125203342Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\" id:\"a9ddf435360553cce88d0929f960c5a1fe59993dac274a39eee9caf3b9c3e741\" pid:5852 exited_at:{seconds:1752621286 nanos:124895736}"
Jul 15 23:14:46.205783 containerd[1512]: time="2025-07-15T23:14:46.204881970Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"ccc377f08689ff629bd51e6b596e3a83e8ab8e681928c05cf13d6c0c9cbb8b9a\" pid:5839 exited_at:{seconds:1752621286 nanos:204278316}"
Jul 15 23:14:46.281195 update_engine[1489]: I20250715 23:14:46.281107 1489 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jul 15 23:14:46.281908 update_engine[1489]: I20250715 23:14:46.281347 1489 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jul 15 23:14:46.281908 update_engine[1489]: I20250715 23:14:46.281580 1489 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jul 15 23:14:46.282115 update_engine[1489]: E20250715 23:14:46.282003 1489 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jul 15 23:14:46.282115 update_engine[1489]: I20250715 23:14:46.282068 1489 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Jul 15 23:14:46.282115 update_engine[1489]: I20250715 23:14:46.282078 1489 omaha_request_action.cc:617] Omaha request response:
Jul 15 23:14:46.282373 update_engine[1489]: E20250715 23:14:46.282162 1489 omaha_request_action.cc:636] Omaha request network transfer failed.
Jul 15 23:14:46.282373 update_engine[1489]: I20250715 23:14:46.282178 1489 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Jul 15 23:14:46.282373 update_engine[1489]: I20250715 23:14:46.282183 1489 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jul 15 23:14:46.282373 update_engine[1489]: I20250715 23:14:46.282192 1489 update_attempter.cc:306] Processing Done.
Jul 15 23:14:46.282373 update_engine[1489]: E20250715 23:14:46.282207 1489 update_attempter.cc:619] Update failed.
Jul 15 23:14:46.282373 update_engine[1489]: I20250715 23:14:46.282212 1489 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Jul 15 23:14:46.282373 update_engine[1489]: I20250715 23:14:46.282218 1489 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Jul 15 23:14:46.282373 update_engine[1489]: I20250715 23:14:46.282223 1489 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Jul 15 23:14:46.283666 update_engine[1489]: I20250715 23:14:46.282925 1489 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Jul 15 23:14:46.283666 update_engine[1489]: I20250715 23:14:46.282991 1489 omaha_request_action.cc:271] Posting an Omaha request to disabled
Jul 15 23:14:46.283666 update_engine[1489]: I20250715 23:14:46.283000 1489 omaha_request_action.cc:272] Request:
Jul 15 23:14:46.283666 update_engine[1489]:
Jul 15 23:14:46.283666 update_engine[1489]:
Jul 15 23:14:46.283666 update_engine[1489]:
Jul 15 23:14:46.283666 update_engine[1489]:
Jul 15 23:14:46.283666 update_engine[1489]:
Jul 15 23:14:46.283666 update_engine[1489]:
Jul 15 23:14:46.283666 update_engine[1489]: I20250715 23:14:46.283010 1489 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Jul 15 23:14:46.283666 update_engine[1489]: I20250715 23:14:46.283254 1489 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Jul 15 23:14:46.283666 update_engine[1489]: I20250715 23:14:46.283577 1489 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Jul 15 23:14:46.284141 locksmithd[1533]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Jul 15 23:14:46.284761 update_engine[1489]: E20250715 23:14:46.284559 1489 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled
Jul 15 23:14:46.284761 update_engine[1489]: I20250715 23:14:46.284622 1489 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Jul 15 23:14:46.284761 update_engine[1489]: I20250715 23:14:46.284636 1489 omaha_request_action.cc:617] Omaha request response:
Jul 15 23:14:46.284761 update_engine[1489]: I20250715 23:14:46.284664 1489 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jul 15 23:14:46.284761 update_engine[1489]: I20250715 23:14:46.284674 1489 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Jul 15 23:14:46.285231 update_engine[1489]: I20250715 23:14:46.284878 1489 update_attempter.cc:306] Processing Done.
Jul 15 23:14:46.285231 update_engine[1489]: I20250715 23:14:46.284898 1489 update_attempter.cc:310] Error event sent.
Jul 15 23:14:46.285231 update_engine[1489]: I20250715 23:14:46.284911 1489 update_check_scheduler.cc:74] Next update check in 49m59s
Jul 15 23:14:46.285384 locksmithd[1533]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Jul 15 23:14:48.084097 systemd[1]: Started sshd@8-91.99.216.80:22-139.178.68.195:56544.service - OpenSSH per-connection server daemon (139.178.68.195:56544).
Jul 15 23:14:48.906897 containerd[1512]: time="2025-07-15T23:14:48.906788885Z" level=info msg="TaskExit event in podsandbox handler container_id:\"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\" id:\"807fa3667285b3f81cdafa8166d7a31a1921afbec23c69ae177c275a34a47bed\" pid:5888 exited_at:{seconds:1752621288 nanos:903795539}"
Jul 15 23:14:49.103678 sshd[5872]: Accepted publickey for core from 139.178.68.195 port 56544 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:14:49.105814 sshd-session[5872]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:14:49.115270 systemd-logind[1488]: New session 9 of user core.
Jul 15 23:14:49.121048 systemd[1]: Started session-9.scope - Session 9 of User core.
Jul 15 23:14:49.911248 sshd[5899]: Connection closed by 139.178.68.195 port 56544
Jul 15 23:14:49.912225 sshd-session[5872]: pam_unix(sshd:session): session closed for user core
Jul 15 23:14:49.918443 systemd-logind[1488]: Session 9 logged out. Waiting for processes to exit.
Jul 15 23:14:49.918572 systemd[1]: sshd@8-91.99.216.80:22-139.178.68.195:56544.service: Deactivated successfully.
Jul 15 23:14:49.922705 systemd[1]: session-9.scope: Deactivated successfully.
Jul 15 23:14:49.927573 systemd-logind[1488]: Removed session 9.
Jul 15 23:14:52.198794 containerd[1512]: time="2025-07-15T23:14:52.198730455Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\" id:\"6faac65813414abf87ac16c79c868061748d4267b0551dd20c8c290bf496c67c\" pid:5924 exited_at:{seconds:1752621292 nanos:198102041}"
Jul 15 23:14:55.092214 systemd[1]: Started sshd@9-91.99.216.80:22-139.178.68.195:58142.service - OpenSSH per-connection server daemon (139.178.68.195:58142).
Jul 15 23:14:56.124384 sshd[5939]: Accepted publickey for core from 139.178.68.195 port 58142 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:14:56.127035 sshd-session[5939]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:14:56.134931 systemd-logind[1488]: New session 10 of user core.
Jul 15 23:14:56.141107 systemd[1]: Started session-10.scope - Session 10 of User core.
Jul 15 23:14:56.908304 sshd[5942]: Connection closed by 139.178.68.195 port 58142
Jul 15 23:14:56.909201 sshd-session[5939]: pam_unix(sshd:session): session closed for user core
Jul 15 23:14:56.916400 systemd-logind[1488]: Session 10 logged out. Waiting for processes to exit.
Jul 15 23:14:56.917335 systemd[1]: sshd@9-91.99.216.80:22-139.178.68.195:58142.service: Deactivated successfully.
Jul 15 23:14:56.921407 systemd[1]: session-10.scope: Deactivated successfully.
Jul 15 23:14:56.923443 systemd-logind[1488]: Removed session 10.
Jul 15 23:14:57.112117 systemd[1]: Started sshd@10-91.99.216.80:22-139.178.68.195:58146.service - OpenSSH per-connection server daemon (139.178.68.195:58146).
Jul 15 23:14:58.210502 sshd[5955]: Accepted publickey for core from 139.178.68.195 port 58146 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:14:58.213309 sshd-session[5955]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:14:58.224379 systemd-logind[1488]: New session 11 of user core.
Jul 15 23:14:58.232117 systemd[1]: Started session-11.scope - Session 11 of User core.
Jul 15 23:14:59.078525 sshd[5957]: Connection closed by 139.178.68.195 port 58146
Jul 15 23:14:59.078176 sshd-session[5955]: pam_unix(sshd:session): session closed for user core
Jul 15 23:14:59.083409 systemd-logind[1488]: Session 11 logged out. Waiting for processes to exit.
Jul 15 23:14:59.085606 systemd[1]: sshd@10-91.99.216.80:22-139.178.68.195:58146.service: Deactivated successfully.
Jul 15 23:14:59.092467 systemd[1]: session-11.scope: Deactivated successfully.
Jul 15 23:14:59.095800 systemd-logind[1488]: Removed session 11.
Jul 15 23:14:59.231539 systemd[1]: Started sshd@11-91.99.216.80:22-139.178.68.195:58162.service - OpenSSH per-connection server daemon (139.178.68.195:58162).
Jul 15 23:15:00.226505 sshd[5967]: Accepted publickey for core from 139.178.68.195 port 58162 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:15:00.228825 sshd-session[5967]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:15:00.237245 systemd-logind[1488]: New session 12 of user core.
Jul 15 23:15:00.243072 systemd[1]: Started session-12.scope - Session 12 of User core.
Jul 15 23:15:00.982820 sshd[5969]: Connection closed by 139.178.68.195 port 58162
Jul 15 23:15:00.982659 sshd-session[5967]: pam_unix(sshd:session): session closed for user core
Jul 15 23:15:00.989597 systemd-logind[1488]: Session 12 logged out. Waiting for processes to exit.
Jul 15 23:15:00.990429 systemd[1]: sshd@11-91.99.216.80:22-139.178.68.195:58162.service: Deactivated successfully.
Jul 15 23:15:00.993786 systemd[1]: session-12.scope: Deactivated successfully.
Jul 15 23:15:00.998691 systemd-logind[1488]: Removed session 12.
Jul 15 23:15:06.155274 systemd[1]: Started sshd@12-91.99.216.80:22-139.178.68.195:60846.service - OpenSSH per-connection server daemon (139.178.68.195:60846).
Jul 15 23:15:07.157228 sshd[6001]: Accepted publickey for core from 139.178.68.195 port 60846 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:15:07.159282 sshd-session[6001]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:15:07.166754 systemd-logind[1488]: New session 13 of user core.
Jul 15 23:15:07.173067 systemd[1]: Started session-13.scope - Session 13 of User core.
Jul 15 23:15:07.920933 sshd[6003]: Connection closed by 139.178.68.195 port 60846
Jul 15 23:15:07.921781 sshd-session[6001]: pam_unix(sshd:session): session closed for user core
Jul 15 23:15:07.926759 systemd-logind[1488]: Session 13 logged out. Waiting for processes to exit.
Jul 15 23:15:07.927711 systemd[1]: sshd@12-91.99.216.80:22-139.178.68.195:60846.service: Deactivated successfully.
Jul 15 23:15:07.931987 systemd[1]: session-13.scope: Deactivated successfully.
Jul 15 23:15:07.934682 systemd-logind[1488]: Removed session 13.
Jul 15 23:15:08.130546 systemd[1]: Started sshd@13-91.99.216.80:22-139.178.68.195:60854.service - OpenSSH per-connection server daemon (139.178.68.195:60854).
Jul 15 23:15:09.228268 sshd[6015]: Accepted publickey for core from 139.178.68.195 port 60854 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:15:09.230598 sshd-session[6015]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:15:09.237443 systemd-logind[1488]: New session 14 of user core.
Jul 15 23:15:09.243158 systemd[1]: Started session-14.scope - Session 14 of User core.
Jul 15 23:15:10.207984 sshd[6017]: Connection closed by 139.178.68.195 port 60854
Jul 15 23:15:10.208630 sshd-session[6015]: pam_unix(sshd:session): session closed for user core
Jul 15 23:15:10.216333 systemd[1]: sshd@13-91.99.216.80:22-139.178.68.195:60854.service: Deactivated successfully.
Jul 15 23:15:10.222958 systemd[1]: session-14.scope: Deactivated successfully.
Jul 15 23:15:10.225914 systemd-logind[1488]: Session 14 logged out. Waiting for processes to exit.
Jul 15 23:15:10.229837 systemd-logind[1488]: Removed session 14.
Jul 15 23:15:10.401034 systemd[1]: Started sshd@14-91.99.216.80:22-139.178.68.195:45782.service - OpenSSH per-connection server daemon (139.178.68.195:45782).
Jul 15 23:15:11.499827 sshd[6027]: Accepted publickey for core from 139.178.68.195 port 45782 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:15:11.502006 sshd-session[6027]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:15:11.508647 systemd-logind[1488]: New session 15 of user core.
Jul 15 23:15:11.517148 systemd[1]: Started session-15.scope - Session 15 of User core.
Jul 15 23:15:14.252653 sshd[6029]: Connection closed by 139.178.68.195 port 45782
Jul 15 23:15:14.254492 sshd-session[6027]: pam_unix(sshd:session): session closed for user core
Jul 15 23:15:14.262136 systemd[1]: sshd@14-91.99.216.80:22-139.178.68.195:45782.service: Deactivated successfully.
Jul 15 23:15:14.265692 systemd[1]: session-15.scope: Deactivated successfully.
Jul 15 23:15:14.266224 systemd[1]: session-15.scope: Consumed 614ms CPU time, 75.6M memory peak.
Jul 15 23:15:14.268427 systemd-logind[1488]: Session 15 logged out. Waiting for processes to exit.
Jul 15 23:15:14.270360 systemd-logind[1488]: Removed session 15.
Jul 15 23:15:14.447520 systemd[1]: Started sshd@15-91.99.216.80:22-139.178.68.195:45788.service - OpenSSH per-connection server daemon (139.178.68.195:45788).
Jul 15 23:15:15.547116 sshd[6046]: Accepted publickey for core from 139.178.68.195 port 45788 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:15:15.549631 sshd-session[6046]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:15:15.555533 systemd-logind[1488]: New session 16 of user core.
Jul 15 23:15:15.562081 systemd[1]: Started session-16.scope - Session 16 of User core.
Jul 15 23:15:16.089236 containerd[1512]: time="2025-07-15T23:15:16.088617542Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\" id:\"a9e533f0e1e18405127a2f6dd42d19f014d6145d2e4f8ef634d234f70d35292e\" pid:6080 exited_at:{seconds:1752621316 nanos:87407715}"
Jul 15 23:15:16.128009 containerd[1512]: time="2025-07-15T23:15:16.127951672Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"9d14f4e19fc44a100f45d1e37c1af6e378f8b5f21d5d12f48730b8dc4c7e7c91\" pid:6068 exited_at:{seconds:1752621316 nanos:126959010}"
Jul 15 23:15:16.499980 sshd[6048]: Connection closed by 139.178.68.195 port 45788
Jul 15 23:15:16.500996 sshd-session[6046]: pam_unix(sshd:session): session closed for user core
Jul 15 23:15:16.506452 systemd[1]: sshd@15-91.99.216.80:22-139.178.68.195:45788.service: Deactivated successfully.
Jul 15 23:15:16.510091 systemd[1]: session-16.scope: Deactivated successfully.
Jul 15 23:15:16.511259 systemd-logind[1488]: Session 16 logged out. Waiting for processes to exit.
Jul 15 23:15:16.514480 systemd-logind[1488]: Removed session 16.
Jul 15 23:15:16.667097 systemd[1]: Started sshd@16-91.99.216.80:22-139.178.68.195:45794.service - OpenSSH per-connection server daemon (139.178.68.195:45794).
Jul 15 23:15:17.689690 sshd[6101]: Accepted publickey for core from 139.178.68.195 port 45794 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:15:17.691191 sshd-session[6101]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:15:17.702078 systemd-logind[1488]: New session 17 of user core.
Jul 15 23:15:17.708987 systemd[1]: Started session-17.scope - Session 17 of User core.
Jul 15 23:15:18.497432 sshd[6103]: Connection closed by 139.178.68.195 port 45794
Jul 15 23:15:18.496490 sshd-session[6101]: pam_unix(sshd:session): session closed for user core
Jul 15 23:15:18.503641 systemd-logind[1488]: Session 17 logged out. Waiting for processes to exit.
Jul 15 23:15:18.504259 systemd[1]: sshd@16-91.99.216.80:22-139.178.68.195:45794.service: Deactivated successfully.
Jul 15 23:15:18.507674 systemd[1]: session-17.scope: Deactivated successfully.
Jul 15 23:15:18.511953 systemd-logind[1488]: Removed session 17.
Jul 15 23:15:18.792419 containerd[1512]: time="2025-07-15T23:15:18.792281726Z" level=info msg="TaskExit event in podsandbox handler container_id:\"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\" id:\"e896f5b61f2dfec9ca8fddecfd79deff3eacc022a3b53f010d19332d9199613a\" pid:6132 exited_at:{seconds:1752621318 nanos:791609311}"
Jul 15 23:15:23.677588 systemd[1]: Started sshd@17-91.99.216.80:22-139.178.68.195:34172.service - OpenSSH per-connection server daemon (139.178.68.195:34172).
Jul 15 23:15:24.710392 sshd[6147]: Accepted publickey for core from 139.178.68.195 port 34172 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:15:24.713745 sshd-session[6147]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:15:24.724421 systemd-logind[1488]: New session 18 of user core.
Jul 15 23:15:24.732213 systemd[1]: Started session-18.scope - Session 18 of User core.
Jul 15 23:15:25.520958 sshd[6149]: Connection closed by 139.178.68.195 port 34172
Jul 15 23:15:25.521506 sshd-session[6147]: pam_unix(sshd:session): session closed for user core
Jul 15 23:15:25.526467 systemd-logind[1488]: Session 18 logged out. Waiting for processes to exit.
Jul 15 23:15:25.527949 systemd[1]: sshd@17-91.99.216.80:22-139.178.68.195:34172.service: Deactivated successfully.
Jul 15 23:15:25.533141 systemd[1]: session-18.scope: Deactivated successfully.
Jul 15 23:15:25.536278 systemd-logind[1488]: Removed session 18.
Jul 15 23:15:30.696479 systemd[1]: Started sshd@18-91.99.216.80:22-139.178.68.195:46562.service - OpenSSH per-connection server daemon (139.178.68.195:46562).
Jul 15 23:15:31.141827 containerd[1512]: time="2025-07-15T23:15:31.141766793Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"52f368a856d06ff9cf9533a07824a295ca139ef1ad59054149672e132be88d0e\" pid:6175 exited_at:{seconds:1752621331 nanos:140975055}"
Jul 15 23:15:31.712635 sshd[6160]: Accepted publickey for core from 139.178.68.195 port 46562 ssh2: RSA SHA256:+cMC7rDY11ooX0rGk8xTzTdhKmHBDbuiScEsywsTdAk
Jul 15 23:15:31.715736 sshd-session[6160]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Jul 15 23:15:31.722036 systemd-logind[1488]: New session 19 of user core.
Jul 15 23:15:31.727028 systemd[1]: Started session-19.scope - Session 19 of User core.
Jul 15 23:15:32.477442 sshd[6186]: Connection closed by 139.178.68.195 port 46562
Jul 15 23:15:32.478422 sshd-session[6160]: pam_unix(sshd:session): session closed for user core
Jul 15 23:15:32.484626 systemd[1]: sshd@18-91.99.216.80:22-139.178.68.195:46562.service: Deactivated successfully.
Jul 15 23:15:32.489363 systemd[1]: session-19.scope: Deactivated successfully.
Jul 15 23:15:32.490956 systemd-logind[1488]: Session 19 logged out. Waiting for processes to exit.
Jul 15 23:15:32.493097 systemd-logind[1488]: Removed session 19.
Jul 15 23:15:46.046392 containerd[1512]: time="2025-07-15T23:15:46.046340189Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\" id:\"e18690607f035b00f3b2a84c9a5a181bea0ec07ed5aea796d7287c692a665089\" pid:6229 exited_at:{seconds:1752621346 nanos:46071583}"
Jul 15 23:15:46.072043 containerd[1512]: time="2025-07-15T23:15:46.071968136Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97713e2a680c60771d2d8f2756803d2f57f9c70eadbd0faf46f5160b72c84a01\" id:\"1eb44766c3eefbdc9eb114e38d468b186a35fe22bd173494ccb7a35994edeb0b\" pid:6212 exited_at:{seconds:1752621346 nanos:71510726}"
Jul 15 23:15:46.859369 systemd[1]: cri-containerd-fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5.scope: Deactivated successfully.
Jul 15 23:15:46.861597 systemd[1]: cri-containerd-fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5.scope: Consumed 3.874s CPU time, 60M memory peak, 2.6M read from disk.
Jul 15 23:15:46.863078 containerd[1512]: time="2025-07-15T23:15:46.862956532Z" level=info msg="received exit event container_id:\"fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5\" id:\"fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5\" pid:2513 exit_status:1 exited_at:{seconds:1752621346 nanos:861303774}"
Jul 15 23:15:46.863167 containerd[1512]: time="2025-07-15T23:15:46.863120015Z" level=info msg="TaskExit event in podsandbox handler container_id:\"fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5\" id:\"fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5\" pid:2513 exit_status:1 exited_at:{seconds:1752621346 nanos:861303774}"
Jul 15 23:15:46.895646 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5-rootfs.mount: Deactivated successfully.
Jul 15 23:15:46.920159 kubelet[2668]: I0715 23:15:46.919833 2668 scope.go:117] "RemoveContainer" containerID="fdd8bb0f8d968e8039cb84bc62df3a46075efb2c59589e21bad885c966c53db5"
Jul 15 23:15:46.925530 containerd[1512]: time="2025-07-15T23:15:46.925476683Z" level=info msg="CreateContainer within sandbox \"4f0e01a9fa54188028a3565c8024d3635cc15aeaf6792066e2d97731de1fc24a\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}"
Jul 15 23:15:46.938334 containerd[1512]: time="2025-07-15T23:15:46.938068252Z" level=info msg="Container f0455184177f900fe064abb2d136588ba409079d45eaf125b23e894fcab01475: CDI devices from CRI Config.CDIDevices: []"
Jul 15 23:15:46.943172 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount43503951.mount: Deactivated successfully.
Jul 15 23:15:46.953111 containerd[1512]: time="2025-07-15T23:15:46.953047555Z" level=info msg="CreateContainer within sandbox \"4f0e01a9fa54188028a3565c8024d3635cc15aeaf6792066e2d97731de1fc24a\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"f0455184177f900fe064abb2d136588ba409079d45eaf125b23e894fcab01475\""
Jul 15 23:15:46.953880 containerd[1512]: time="2025-07-15T23:15:46.953811612Z" level=info msg="StartContainer for \"f0455184177f900fe064abb2d136588ba409079d45eaf125b23e894fcab01475\""
Jul 15 23:15:46.955986 containerd[1512]: time="2025-07-15T23:15:46.955874740Z" level=info msg="connecting to shim f0455184177f900fe064abb2d136588ba409079d45eaf125b23e894fcab01475" address="unix:///run/containerd/s/98ec46a433129f8ba016994d760164ab7e3d5df326d4a6c8560004079f856c5c" protocol=ttrpc version=3
Jul 15 23:15:46.983035 systemd[1]: Started cri-containerd-f0455184177f900fe064abb2d136588ba409079d45eaf125b23e894fcab01475.scope - libcontainer container f0455184177f900fe064abb2d136588ba409079d45eaf125b23e894fcab01475.
Jul 15 23:15:47.035025 containerd[1512]: time="2025-07-15T23:15:47.034872629Z" level=info msg="StartContainer for \"f0455184177f900fe064abb2d136588ba409079d45eaf125b23e894fcab01475\" returns successfully"
Jul 15 23:15:47.314012 kubelet[2668]: E0715 23:15:47.313897 2668 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:60534->10.0.0.2:2379: read: connection timed out"
Jul 15 23:15:48.470132 systemd[1]: cri-containerd-57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586.scope: Deactivated successfully.
Jul 15 23:15:48.470501 systemd[1]: cri-containerd-57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586.scope: Consumed 24.269s CPU time, 113.8M memory peak, 3.7M read from disk.
Jul 15 23:15:48.475379 containerd[1512]: time="2025-07-15T23:15:48.474620813Z" level=info msg="received exit event container_id:\"57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586\" id:\"57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586\" pid:2999 exit_status:1 exited_at:{seconds:1752621348 nanos:473478666}"
Jul 15 23:15:48.476400 containerd[1512]: time="2025-07-15T23:15:48.476348532Z" level=info msg="TaskExit event in podsandbox handler container_id:\"57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586\" id:\"57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586\" pid:2999 exit_status:1 exited_at:{seconds:1752621348 nanos:473478666}"
Jul 15 23:15:48.506414 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586-rootfs.mount: Deactivated successfully.
Jul 15 23:15:48.757798 containerd[1512]: time="2025-07-15T23:15:48.757658818Z" level=info msg="TaskExit event in podsandbox handler container_id:\"461f62f940208a4a693f40412bb1dbb2b586fe20a81031725316d3117987e7de\" id:\"3688755c094d9516569cb7772be7fe0d1ce7f76501a7704d9efb12cac895e4c0\" pid:6314 exited_at:{seconds:1752621348 nanos:757229688}"
Jul 15 23:15:48.936399 kubelet[2668]: I0715 23:15:48.936320 2668 scope.go:117] "RemoveContainer" containerID="57b1b8857d14cf0afa359c238ec19ba354424b88b35ff52a8f62a8a56d63c586"
Jul 15 23:15:48.941478 containerd[1512]: time="2025-07-15T23:15:48.941423028Z" level=info msg="CreateContainer within sandbox \"d2e457d410b36301902c681c4607f7ce7b71f4b89493d6649bb0d3a54f3811a1\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jul 15 23:15:48.956352 containerd[1512]: time="2025-07-15T23:15:48.955451950Z" level=info msg="Container 01c6f7206ed5a767055da639b089f64e4caaa89a4e3b472c7b43a53f3686ce31: CDI devices from CRI Config.CDIDevices: []"
Jul 15 23:15:48.965522 containerd[1512]: time="2025-07-15T23:15:48.965482460Z" level=info msg="CreateContainer within sandbox \"d2e457d410b36301902c681c4607f7ce7b71f4b89493d6649bb0d3a54f3811a1\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"01c6f7206ed5a767055da639b089f64e4caaa89a4e3b472c7b43a53f3686ce31\""
Jul 15 23:15:48.967061 containerd[1512]: time="2025-07-15T23:15:48.967026135Z" level=info msg="StartContainer for \"01c6f7206ed5a767055da639b089f64e4caaa89a4e3b472c7b43a53f3686ce31\""
Jul 15 23:15:48.968343 containerd[1512]: time="2025-07-15T23:15:48.968310124Z" level=info msg="connecting to shim 01c6f7206ed5a767055da639b089f64e4caaa89a4e3b472c7b43a53f3686ce31" address="unix:///run/containerd/s/3fbc25577f74d3ccbb50896ec87d174916f1543fa1a673a86620f35ea84c88a2" protocol=ttrpc version=3
Jul 15 23:15:48.993032 systemd[1]: Started cri-containerd-01c6f7206ed5a767055da639b089f64e4caaa89a4e3b472c7b43a53f3686ce31.scope - libcontainer container 01c6f7206ed5a767055da639b089f64e4caaa89a4e3b472c7b43a53f3686ce31.
Jul 15 23:15:49.039763 containerd[1512]: time="2025-07-15T23:15:49.039389073Z" level=info msg="StartContainer for \"01c6f7206ed5a767055da639b089f64e4caaa89a4e3b472c7b43a53f3686ce31\" returns successfully"
Jul 15 23:15:51.668970 kubelet[2668]: E0715 23:15:51.668632 2668 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:60378->10.0.0.2:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4372-0-1-n-91aeaf5bee.18528fd2f5cb8288 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4372-0-1-n-91aeaf5bee,UID:0fe268a4ee8fc52243b93ea09c6ed498,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4372-0-1-n-91aeaf5bee,},FirstTimestamp:2025-07-15 23:15:41.230203528 +0000 UTC m=+178.264743438,LastTimestamp:2025-07-15 23:15:41.230203528 +0000 UTC m=+178.264743438,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4372-0-1-n-91aeaf5bee,}"
Jul 15 23:15:52.198029 containerd[1512]: time="2025-07-15T23:15:52.197974278Z" level=info msg="TaskExit event in podsandbox handler container_id:\"05cb205b1f2ddebc03f66d10c73f2739d28860348da898b6f30161737af8a2ce\" id:\"f0129701c5df8089862b06606adc3f58399824634c55831d02fd832582282d70\" pid:6367 exit_status:1 exited_at:{seconds:1752621352 nanos:197374344}"
Jul 15 23:15:52.362832 systemd[1]: cri-containerd-55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd.scope: Deactivated successfully.
Jul 15 23:15:52.363936 systemd[1]: cri-containerd-55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd.scope: Consumed 2.898s CPU time, 23.1M memory peak, 2.8M read from disk.
Jul 15 23:15:52.368002 containerd[1512]: time="2025-07-15T23:15:52.367960456Z" level=info msg="received exit event container_id:\"55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd\" id:\"55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd\" pid:2531 exit_status:1 exited_at:{seconds:1752621352 nanos:367554447}"
Jul 15 23:15:52.368278 containerd[1512]: time="2025-07-15T23:15:52.368223422Z" level=info msg="TaskExit event in podsandbox handler container_id:\"55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd\" id:\"55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd\" pid:2531 exit_status:1 exited_at:{seconds:1752621352 nanos:367554447}"
Jul 15 23:15:52.397585 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd-rootfs.mount: Deactivated successfully.
Jul 15 23:15:52.955082 kubelet[2668]: I0715 23:15:52.955046 2668 scope.go:117] "RemoveContainer" containerID="55d5d08c147fb024ec1c882caf7d8d14ef7e7eb3ad86aacdce09aaab9934ebbd"
Jul 15 23:15:52.958497 containerd[1512]: time="2025-07-15T23:15:52.958462717Z" level=info msg="CreateContainer within sandbox \"9f2b0c232ec9a91076fdbc2131dadedc332ff6c0ca74bb53a0cd4ba374312575\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Jul 15 23:15:52.969726 containerd[1512]: time="2025-07-15T23:15:52.968921397Z" level=info msg="Container 49f50e2564c6cc4ecd5f7790cf3cf57484987ecf75ad4e841c033d8f5b172242: CDI devices from CRI Config.CDIDevices: []"
Jul 15 23:15:52.985322 containerd[1512]: time="2025-07-15T23:15:52.985271812Z" level=info msg="CreateContainer within sandbox \"9f2b0c232ec9a91076fdbc2131dadedc332ff6c0ca74bb53a0cd4ba374312575\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"49f50e2564c6cc4ecd5f7790cf3cf57484987ecf75ad4e841c033d8f5b172242\""
Jul 15 23:15:52.986006 containerd[1512]: time="2025-07-15T23:15:52.985978868Z" level=info msg="StartContainer for \"49f50e2564c6cc4ecd5f7790cf3cf57484987ecf75ad4e841c033d8f5b172242\""
Jul 15 23:15:52.987391 containerd[1512]: time="2025-07-15T23:15:52.987359540Z" level=info msg="connecting to shim 49f50e2564c6cc4ecd5f7790cf3cf57484987ecf75ad4e841c033d8f5b172242" address="unix:///run/containerd/s/69683dcbbe27a3963d3389cdbff78471cbeba379606dfa1d1cfe668f9cc78e5a" protocol=ttrpc version=3
Jul 15 23:15:53.018147 systemd[1]: Started cri-containerd-49f50e2564c6cc4ecd5f7790cf3cf57484987ecf75ad4e841c033d8f5b172242.scope - libcontainer container 49f50e2564c6cc4ecd5f7790cf3cf57484987ecf75ad4e841c033d8f5b172242.
Jul 15 23:15:53.074741 containerd[1512]: time="2025-07-15T23:15:53.074623661Z" level=info msg="StartContainer for \"49f50e2564c6cc4ecd5f7790cf3cf57484987ecf75ad4e841c033d8f5b172242\" returns successfully"