Aug 19 00:15:58.809876 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Aug 19 00:15:58.809900 kernel: Linux version 6.12.41-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.0 p8) 14.3.0, GNU ld (Gentoo 2.44 p4) 2.44.0) #1 SMP PREEMPT Mon Aug 18 22:15:14 -00 2025
Aug 19 00:15:58.809910 kernel: KASLR enabled
Aug 19 00:15:58.809916 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II
Aug 19 00:15:58.809922 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218
Aug 19 00:15:58.809927 kernel: random: crng init done
Aug 19 00:15:58.809934 kernel: secureboot: Secure boot disabled
Aug 19 00:15:58.809939 kernel: ACPI: Early table checksum verification disabled
Aug 19 00:15:58.809945 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS )
Aug 19 00:15:58.809951 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013)
Aug 19 00:15:58.809959 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:15:58.809964 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:15:58.809970 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:15:58.809976 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:15:58.809983 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:15:58.809990 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:15:58.809996 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:15:58.810002 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:15:58.810008 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Aug 19 00:15:58.810014 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013)
Aug 19 00:15:58.810020 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Aug 19 00:15:58.810026 kernel: ACPI: Use ACPI SPCR as default console: Yes
Aug 19 00:15:58.810032 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff]
Aug 19 00:15:58.810038 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff]
Aug 19 00:15:58.810044 kernel: Zone ranges:
Aug 19 00:15:58.810050 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Aug 19 00:15:58.810057 kernel: DMA32 empty
Aug 19 00:15:58.810063 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff]
Aug 19 00:15:58.810069 kernel: Device empty
Aug 19 00:15:58.810075 kernel: Movable zone start for each node
Aug 19 00:15:58.810081 kernel: Early memory node ranges
Aug 19 00:15:58.810087 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff]
Aug 19 00:15:58.810094 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff]
Aug 19 00:15:58.810099 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff]
Aug 19 00:15:58.810105 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff]
Aug 19 00:15:58.810111 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff]
Aug 19 00:15:58.810117 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff]
Aug 19 00:15:58.810124 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff]
Aug 19 00:15:58.813267 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff]
Aug 19 00:15:58.813285 kernel: node 0: [mem 0x0000000139fe0000-0x0000000139ffffff]
Aug 19 00:15:58.813301 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff]
Aug 19 00:15:58.813308 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges
Aug 19 00:15:58.813352 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Aug 19 00:15:58.813367 kernel: psci: probing for conduit method from ACPI.
Aug 19 00:15:58.813374 kernel: psci: PSCIv1.1 detected in firmware.
Aug 19 00:15:58.813381 kernel: psci: Using standard PSCI v0.2 function IDs
Aug 19 00:15:58.813387 kernel: psci: Trusted OS migration not required
Aug 19 00:15:58.813394 kernel: psci: SMC Calling Convention v1.1
Aug 19 00:15:58.813400 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Aug 19 00:15:58.813407 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Aug 19 00:15:58.813414 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Aug 19 00:15:58.813420 kernel: pcpu-alloc: [0] 0 [0] 1
Aug 19 00:15:58.813427 kernel: Detected PIPT I-cache on CPU0
Aug 19 00:15:58.813433 kernel: CPU features: detected: GIC system register CPU interface
Aug 19 00:15:58.813442 kernel: CPU features: detected: Spectre-v4
Aug 19 00:15:58.813449 kernel: CPU features: detected: Spectre-BHB
Aug 19 00:15:58.813455 kernel: CPU features: kernel page table isolation forced ON by KASLR
Aug 19 00:15:58.813462 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Aug 19 00:15:58.813468 kernel: CPU features: detected: ARM erratum 1418040
Aug 19 00:15:58.813475 kernel: CPU features: detected: SSBS not fully self-synchronizing
Aug 19 00:15:58.813481 kernel: alternatives: applying boot alternatives
Aug 19 00:15:58.813490 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=a868ccde263e96e0a18737fdbf04ca04bbf30dfe23963f1ae3994966e8fc9468
Aug 19 00:15:58.813497 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
Aug 19 00:15:58.813503 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Aug 19 00:15:58.813512 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear)
Aug 19 00:15:58.813518 kernel: Fallback order for Node 0: 0
Aug 19 00:15:58.813525 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000
Aug 19 00:15:58.813531 kernel: Policy zone: Normal
Aug 19 00:15:58.813538 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Aug 19 00:15:58.813544 kernel: software IO TLB: area num 2.
Aug 19 00:15:58.813550 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Aug 19 00:15:58.813557 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
Aug 19 00:15:58.813564 kernel: rcu: Preemptible hierarchical RCU implementation.
Aug 19 00:15:58.813571 kernel: rcu: RCU event tracing is enabled.
Aug 19 00:15:58.813578 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
Aug 19 00:15:58.813584 kernel: Trampoline variant of Tasks RCU enabled.
Aug 19 00:15:58.813593 kernel: Tracing variant of Tasks RCU enabled.
Aug 19 00:15:58.813599 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Aug 19 00:15:58.813606 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
Aug 19 00:15:58.813613 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 00:15:58.813662 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
Aug 19 00:15:58.813669 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Aug 19 00:15:58.813676 kernel: GICv3: 256 SPIs implemented
Aug 19 00:15:58.813682 kernel: GICv3: 0 Extended SPIs implemented
Aug 19 00:15:58.813688 kernel: Root IRQ handler: gic_handle_irq
Aug 19 00:15:58.813695 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Aug 19 00:15:58.813702 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Aug 19 00:15:58.813708 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Aug 19 00:15:58.813718 kernel: ITS [mem 0x08080000-0x0809ffff]
Aug 19 00:15:58.813724 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1)
Aug 19 00:15:58.813731 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1)
Aug 19 00:15:58.813738 kernel: GICv3: using LPI property table @0x0000000100120000
Aug 19 00:15:58.813744 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000
Aug 19 00:15:58.813751 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Aug 19 00:15:58.813757 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Aug 19 00:15:58.813764 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Aug 19 00:15:58.813771 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Aug 19 00:15:58.813777 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Aug 19 00:15:58.813784 kernel: Console: colour dummy device 80x25
Aug 19 00:15:58.813792 kernel: ACPI: Core revision 20240827
Aug 19 00:15:58.813799 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Aug 19 00:15:58.813806 kernel: pid_max: default: 32768 minimum: 301
Aug 19 00:15:58.813813 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Aug 19 00:15:58.813819 kernel: landlock: Up and running.
Aug 19 00:15:58.813826 kernel: SELinux: Initializing.
Aug 19 00:15:58.813833 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Aug 19 00:15:58.813840 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear)
Aug 19 00:15:58.813846 kernel: rcu: Hierarchical SRCU implementation.
Aug 19 00:15:58.813855 kernel: rcu: Max phase no-delay instances is 400.
Aug 19 00:15:58.813862 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Aug 19 00:15:58.813868 kernel: Remapping and enabling EFI services.
Aug 19 00:15:58.813875 kernel: smp: Bringing up secondary CPUs ...
Aug 19 00:15:58.813882 kernel: Detected PIPT I-cache on CPU1
Aug 19 00:15:58.813889 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Aug 19 00:15:58.813896 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000
Aug 19 00:15:58.813903 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Aug 19 00:15:58.813910 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Aug 19 00:15:58.813918 kernel: smp: Brought up 1 node, 2 CPUs
Aug 19 00:15:58.813930 kernel: SMP: Total of 2 processors activated.
Aug 19 00:15:58.813937 kernel: CPU: All CPU(s) started at EL1
Aug 19 00:15:58.813946 kernel: CPU features: detected: 32-bit EL0 Support
Aug 19 00:15:58.813953 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Aug 19 00:15:58.813960 kernel: CPU features: detected: Common not Private translations
Aug 19 00:15:58.813967 kernel: CPU features: detected: CRC32 instructions
Aug 19 00:15:58.813996 kernel: CPU features: detected: Enhanced Virtualization Traps
Aug 19 00:15:58.814007 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Aug 19 00:15:58.814015 kernel: CPU features: detected: LSE atomic instructions
Aug 19 00:15:58.814022 kernel: CPU features: detected: Privileged Access Never
Aug 19 00:15:58.814029 kernel: CPU features: detected: RAS Extension Support
Aug 19 00:15:58.814036 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Aug 19 00:15:58.814043 kernel: alternatives: applying system-wide alternatives
Aug 19 00:15:58.814051 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1
Aug 19 00:15:58.814058 kernel: Memory: 3859620K/4096000K available (11136K kernel code, 2436K rwdata, 9060K rodata, 38912K init, 1038K bss, 214900K reserved, 16384K cma-reserved)
Aug 19 00:15:58.814066 kernel: devtmpfs: initialized
Aug 19 00:15:58.814075 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Aug 19 00:15:58.814082 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
Aug 19 00:15:58.814090 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Aug 19 00:15:58.814097 kernel: 0 pages in range for non-PLT usage
Aug 19 00:15:58.814104 kernel: 508576 pages in range for PLT usage
Aug 19 00:15:58.814111 kernel: pinctrl core: initialized pinctrl subsystem
Aug 19 00:15:58.814118 kernel: SMBIOS 3.0.0 present.
Aug 19 00:15:58.814125 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017
Aug 19 00:15:58.814204 kernel: DMI: Memory slots populated: 1/1
Aug 19 00:15:58.814215 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Aug 19 00:15:58.814223 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations
Aug 19 00:15:58.814230 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Aug 19 00:15:58.814237 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Aug 19 00:15:58.814244 kernel: audit: initializing netlink subsys (disabled)
Aug 19 00:15:58.814251 kernel: audit: type=2000 audit(0.017:1): state=initialized audit_enabled=0 res=1
Aug 19 00:15:58.814258 kernel: thermal_sys: Registered thermal governor 'step_wise'
Aug 19 00:15:58.814265 kernel: cpuidle: using governor menu
Aug 19 00:15:58.814273 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Aug 19 00:15:58.814281 kernel: ASID allocator initialised with 32768 entries Aug 19 00:15:58.814288 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Aug 19 00:15:58.814295 kernel: Serial: AMBA PL011 UART driver Aug 19 00:15:58.814303 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Aug 19 00:15:58.814310 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Aug 19 00:15:58.814329 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Aug 19 00:15:58.814336 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Aug 19 00:15:58.814343 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Aug 19 00:15:58.814351 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Aug 19 00:15:58.814360 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Aug 19 00:15:58.814367 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Aug 19 00:15:58.814374 kernel: ACPI: Added _OSI(Module Device) Aug 19 00:15:58.814381 kernel: ACPI: Added _OSI(Processor Device) Aug 19 00:15:58.814388 kernel: ACPI: Added _OSI(Processor Aggregator Device) Aug 19 00:15:58.814395 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Aug 19 00:15:58.814402 kernel: ACPI: Interpreter enabled Aug 19 00:15:58.814409 kernel: ACPI: Using GIC for interrupt routing Aug 19 00:15:58.814416 kernel: ACPI: MCFG table detected, 1 entries Aug 19 00:15:58.814426 kernel: ACPI: CPU0 has been hot-added Aug 19 00:15:58.814433 kernel: ACPI: CPU1 has been hot-added Aug 19 00:15:58.814440 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Aug 19 00:15:58.814450 kernel: printk: legacy console [ttyAMA0] enabled Aug 19 00:15:58.814457 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Aug 19 00:15:58.814641 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Aug 19 00:15:58.814709 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Aug 19 00:15:58.814816 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Aug 19 00:15:58.814896 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Aug 19 00:15:58.814955 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Aug 19 00:15:58.814965 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Aug 19 00:15:58.814972 kernel: PCI host bridge to bus 0000:00 Aug 19 00:15:58.815045 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Aug 19 00:15:58.815099 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Aug 19 00:15:58.817263 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Aug 19 00:15:58.817396 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Aug 19 00:15:58.817488 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Aug 19 00:15:58.817561 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint Aug 19 00:15:58.817622 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff] Aug 19 00:15:58.817685 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref] Aug 19 00:15:58.817758 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Aug 19 00:15:58.817821 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff] Aug 19 00:15:58.817880 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Aug 19 
00:15:58.817941 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Aug 19 00:15:58.817999 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Aug 19 00:15:58.818064 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Aug 19 00:15:58.818122 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff] Aug 19 00:15:58.818204 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Aug 19 00:15:58.818267 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff] Aug 19 00:15:58.818348 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Aug 19 00:15:58.818410 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff] Aug 19 00:15:58.818468 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Aug 19 00:15:58.818526 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff] Aug 19 00:15:58.818583 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Aug 19 00:15:58.818650 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Aug 19 00:15:58.818712 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff] Aug 19 00:15:58.818771 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Aug 19 00:15:58.818830 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff] Aug 19 00:15:58.818960 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Aug 19 00:15:58.819036 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Aug 19 00:15:58.819096 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff] Aug 19 00:15:58.820009 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Aug 19 00:15:58.820091 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Aug 19 00:15:58.820192 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Aug 19 00:15:58.820265 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Aug 19 00:15:58.820368 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff] Aug 19 00:15:58.820437 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Aug 19 00:15:58.820502 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff] Aug 19 00:15:58.820570 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Aug 19 00:15:58.820654 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Aug 19 00:15:58.820728 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff] Aug 19 00:15:58.820796 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Aug 19 00:15:58.820855 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff] Aug 19 00:15:58.822383 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref] Aug 19 00:15:58.822482 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Aug 19 00:15:58.822550 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff] Aug 19 00:15:58.822622 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Aug 19 00:15:58.822687 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff] Aug 19 00:15:58.822760 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Aug 19 00:15:58.822826 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff] Aug 19 00:15:58.822889 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Aug 19 00:15:58.822953 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff] Aug 19 00:15:58.823028 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 
0x070002 conventional PCI endpoint Aug 19 00:15:58.823092 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007] Aug 19 00:15:58.823212 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Aug 19 00:15:58.823285 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff] Aug 19 00:15:58.823366 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Aug 19 00:15:58.823434 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Aug 19 00:15:58.823511 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Aug 19 00:15:58.823581 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit] Aug 19 00:15:58.823663 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Aug 19 00:15:58.823730 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff] Aug 19 00:15:58.823795 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Aug 19 00:15:58.823868 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Aug 19 00:15:58.823933 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Aug 19 00:15:58.824008 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Aug 19 00:15:58.824073 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Aug 19 00:15:58.827357 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Aug 19 00:15:58.827506 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff] Aug 19 00:15:58.827575 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Aug 19 00:15:58.827657 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Aug 19 00:15:58.827721 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff] Aug 19 00:15:58.827794 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref] Aug 19 00:15:58.827859 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Aug 19 00:15:58.827928 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Aug 19 00:15:58.827991 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Aug 19 00:15:58.828052 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Aug 19 00:15:58.828118 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Aug 19 00:15:58.828210 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Aug 19 00:15:58.828277 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Aug 19 00:15:58.828365 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Aug 19 00:15:58.828431 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Aug 19 00:15:58.828490 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Aug 19 00:15:58.828555 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Aug 19 00:15:58.828616 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Aug 19 00:15:58.828679 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Aug 19 
00:15:58.828743 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Aug 19 00:15:58.828803 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Aug 19 00:15:58.828863 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 05] add_size 200000 add_align 100000 Aug 19 00:15:58.828929 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Aug 19 00:15:58.828991 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Aug 19 00:15:58.829052 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Aug 19 00:15:58.829118 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Aug 19 00:15:58.830360 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 Aug 19 00:15:58.830458 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 Aug 19 00:15:58.830554 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Aug 19 00:15:58.830622 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Aug 19 00:15:58.830688 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Aug 19 00:15:58.830800 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Aug 19 00:15:58.830889 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Aug 19 00:15:58.830962 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Aug 19 00:15:58.831044 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned Aug 19 00:15:58.831121 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Aug 19 00:15:58.831256 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned Aug 19 00:15:58.831350 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Aug 19 00:15:58.831433 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned Aug 19 00:15:58.831513 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Aug 19 00:15:58.831591 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned Aug 19 00:15:58.831666 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Aug 19 00:15:58.831742 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned Aug 19 00:15:58.831814 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Aug 19 00:15:58.831889 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Aug 19 00:15:58.831961 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Aug 19 00:15:58.832034 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Aug 19 00:15:58.832109 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Aug 19 00:15:58.832287 kernel: pci 0000:00:02.7: bridge window [mem 
0x10e00000-0x10ffffff]: assigned Aug 19 00:15:58.832384 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Aug 19 00:15:58.832461 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned Aug 19 00:15:58.832535 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Aug 19 00:15:58.832616 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned Aug 19 00:15:58.832690 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned Aug 19 00:15:58.832763 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned Aug 19 00:15:58.832840 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Aug 19 00:15:58.832921 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned Aug 19 00:15:58.832996 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Aug 19 00:15:58.833069 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned Aug 19 00:15:58.833160 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned Aug 19 00:15:58.833236 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned Aug 19 00:15:58.833309 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Aug 19 00:15:58.833436 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned Aug 19 00:15:58.833514 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Aug 19 00:15:58.833586 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned Aug 19 00:15:58.833650 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Aug 19 00:15:58.833780 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned Aug 19 00:15:58.833895 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Aug 19 00:15:58.833960 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned Aug 19 00:15:58.834020 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Aug 19 00:15:58.834080 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned Aug 19 00:15:58.834431 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned Aug 19 00:15:58.834533 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned Aug 19 00:15:58.834603 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Aug 19 00:15:58.834663 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Aug 19 00:15:58.835114 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Aug 19 00:15:58.835220 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Aug 19 00:15:58.835283 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Aug 19 00:15:58.835386 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] Aug 19 00:15:58.835451 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Aug 19 00:15:58.835530 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Aug 19 00:15:58.835627 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Aug 19 00:15:58.835695 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Aug 19 00:15:58.835754 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] Aug 19 00:15:58.835814 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Aug 19 00:15:58.835881 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Aug 19 00:15:58.835942 kernel: 
pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Aug 19 00:15:58.836004 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Aug 19 00:15:58.836063 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Aug 19 00:15:58.836123 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] Aug 19 00:15:58.836226 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Aug 19 00:15:58.836297 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Aug 19 00:15:58.836374 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Aug 19 00:15:58.836436 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Aug 19 00:15:58.836494 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] Aug 19 00:15:58.836554 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Aug 19 00:15:58.836623 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Aug 19 00:15:58.836684 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Aug 19 00:15:58.836743 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Aug 19 00:15:58.836800 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Aug 19 00:15:58.836860 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Aug 19 00:15:58.836928 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Aug 19 00:15:58.836989 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Aug 19 00:15:58.837053 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Aug 19 00:15:58.837114 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Aug 19 00:15:58.837218 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] Aug 19 00:15:58.837287 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Aug 19 00:15:58.837375 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned Aug 19 00:15:58.837442 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned Aug 19 00:15:58.837505 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned Aug 19 00:15:58.837574 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Aug 19 00:15:58.837633 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Aug 19 00:15:58.837694 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] Aug 19 00:15:58.837824 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Aug 19 00:15:58.837896 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Aug 19 00:15:58.837957 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Aug 19 00:15:58.838015 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] Aug 19 00:15:58.838073 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Aug 19 00:15:58.838195 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Aug 19 00:15:58.838268 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] Aug 19 00:15:58.838373 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Aug 19 00:15:58.838505 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Aug 19 00:15:58.838578 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Aug 19 00:15:58.838633 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Aug 19 00:15:58.838689 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Aug 19 00:15:58.838817 kernel: pci_bus 
0000:01: resource 0 [io 0x1000-0x1fff] Aug 19 00:15:58.838886 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Aug 19 00:15:58.838950 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Aug 19 00:15:58.839014 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Aug 19 00:15:58.839068 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Aug 19 00:15:58.839122 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Aug 19 00:15:58.841375 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Aug 19 00:15:58.841475 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Aug 19 00:15:58.841545 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Aug 19 00:15:58.841632 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Aug 19 00:15:58.841695 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Aug 19 00:15:58.841749 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Aug 19 00:15:58.841812 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Aug 19 00:15:58.841870 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Aug 19 00:15:58.841924 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Aug 19 00:15:58.841990 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Aug 19 00:15:58.842046 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Aug 19 00:15:58.842102 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Aug 19 00:15:58.842209 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Aug 19 00:15:58.842277 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Aug 19 00:15:58.842357 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Aug 19 00:15:58.842422 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Aug 19 00:15:58.842527 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Aug 19 00:15:58.842597 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Aug 19 00:15:58.842661 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Aug 19 00:15:58.842716 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Aug 19 00:15:58.842770 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Aug 19 00:15:58.842780 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Aug 19 00:15:58.842788 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Aug 19 00:15:58.842795 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Aug 19 00:15:58.842807 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Aug 19 00:15:58.842814 kernel: iommu: Default domain type: Translated Aug 19 00:15:58.842822 kernel: iommu: DMA domain TLB invalidation policy: strict mode Aug 19 00:15:58.842829 kernel: efivars: Registered efivars operations Aug 19 00:15:58.842837 kernel: vgaarb: loaded Aug 19 00:15:58.842844 kernel: clocksource: Switched to clocksource arch_sys_counter Aug 19 00:15:58.842852 kernel: VFS: Disk quotas dquot_6.6.0 Aug 19 00:15:58.842859 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Aug 19 00:15:58.842867 kernel: pnp: PnP ACPI init Aug 19 00:15:58.842979 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Aug 19 00:15:58.842993 kernel: pnp: PnP ACPI: found 1 devices Aug 19 00:15:58.843001 kernel: NET: Registered PF_INET protocol family Aug 19 00:15:58.843009 kernel: IP idents hash table 
entries: 65536 (order: 7, 524288 bytes, linear) Aug 19 00:15:58.843017 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Aug 19 00:15:58.843025 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Aug 19 00:15:58.843033 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Aug 19 00:15:58.843041 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Aug 19 00:15:58.843051 kernel: TCP: Hash tables configured (established 32768 bind 32768) Aug 19 00:15:58.843059 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 19 00:15:58.843066 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Aug 19 00:15:58.843074 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Aug 19 00:15:58.843183 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Aug 19 00:15:58.843197 kernel: PCI: CLS 0 bytes, default 64 Aug 19 00:15:58.843205 kernel: kvm [1]: HYP mode not available Aug 19 00:15:58.843213 kernel: Initialise system trusted keyrings Aug 19 00:15:58.843220 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Aug 19 00:15:58.843232 kernel: Key type asymmetric registered Aug 19 00:15:58.843239 kernel: Asymmetric key parser 'x509' registered Aug 19 00:15:58.843247 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Aug 19 00:15:58.843254 kernel: io scheduler mq-deadline registered Aug 19 00:15:58.843262 kernel: io scheduler kyber registered Aug 19 00:15:58.843269 kernel: io scheduler bfq registered Aug 19 00:15:58.843277 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Aug 19 00:15:58.843404 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Aug 19 00:15:58.843473 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Aug 19 00:15:58.843539 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 19 00:15:58.843603 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Aug 19 00:15:58.843664 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Aug 19 00:15:58.843724 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 19 00:15:58.843792 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Aug 19 00:15:58.843853 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Aug 19 00:15:58.843915 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 19 00:15:58.843977 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Aug 19 00:15:58.844040 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Aug 19 00:15:58.844099 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 19 00:15:58.844177 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Aug 19 00:15:58.845241 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Aug 19 00:15:58.845370 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 19 00:15:58.845447 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 55 Aug 19 00:15:58.845509 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Aug 19 00:15:58.845578 kernel: pcieport 0000:00:02.5: pciehp: 
Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 19 00:15:58.845640 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Aug 19 00:15:58.845702 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Aug 19 00:15:58.845761 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 19 00:15:58.845825 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Aug 19 00:15:58.845887 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Aug 19 00:15:58.845946 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 19 00:15:58.845957 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Aug 19 00:15:58.846020 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Aug 19 00:15:58.846083 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Aug 19 00:15:58.847389 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Aug 19 00:15:58.847408 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Aug 19 00:15:58.847416 kernel: ACPI: button: Power Button [PWRB] Aug 19 00:15:58.847425 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Aug 19 00:15:58.847498 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Aug 19 00:15:58.847566 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Aug 19 00:15:58.847584 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Aug 19 00:15:58.847592 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Aug 19 00:15:58.847655 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Aug 19 00:15:58.847666 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Aug 19 00:15:58.847674 kernel: thunder_xcv, ver 1.0 Aug 19 00:15:58.847681 kernel: thunder_bgx, ver 1.0 Aug 19 00:15:58.847689 kernel: nicpf, ver 1.0 Aug 19 00:15:58.847696 kernel: nicvf, ver 1.0 Aug 19 00:15:58.847770 kernel: rtc-efi rtc-efi.0: registered as rtc0 Aug 19 00:15:58.847831 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-08-19T00:15:58 UTC (1755562558) Aug 19 00:15:58.847841 kernel: hid: raw HID events driver (C) Jiri Kosina Aug 19 00:15:58.847849 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Aug 19 00:15:58.847857 kernel: NET: Registered PF_INET6 protocol family Aug 19 00:15:58.847905 kernel: watchdog: NMI not fully supported Aug 19 00:15:58.847915 kernel: watchdog: Hard watchdog permanently disabled Aug 19 00:15:58.847923 kernel: Segment Routing with IPv6 Aug 19 00:15:58.847930 kernel: In-situ OAM (IOAM) with IPv6 Aug 19 00:15:58.847938 kernel: NET: Registered PF_PACKET protocol family Aug 19 00:15:58.847948 kernel: Key type dns_resolver registered Aug 19 00:15:58.847956 kernel: registered taskstats version 1 Aug 19 00:15:58.847963 kernel: Loading compiled-in X.509 certificates Aug 19 00:15:58.847971 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.41-flatcar: becc5a61d1c5dcbcd174f4649c64b863031dbaa8' Aug 19 00:15:58.847979 kernel: Demotion targets for Node 0: null Aug 19 00:15:58.847986 kernel: Key type .fscrypt registered Aug 19 00:15:58.847994 kernel: Key type fscrypt-provisioning registered Aug 19 00:15:58.848002 kernel: ima: No TPM chip found, activating TPM-bypass! 
Aug 19 00:15:58.848011 kernel: ima: Allocated hash algorithm: sha1
Aug 19 00:15:58.848019 kernel: ima: No architecture policies found
Aug 19 00:15:58.848027 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Aug 19 00:15:58.848034 kernel: clk: Disabling unused clocks
Aug 19 00:15:58.848042 kernel: PM: genpd: Disabling unused power domains
Aug 19 00:15:58.848050 kernel: Warning: unable to open an initial console.
Aug 19 00:15:58.848058 kernel: Freeing unused kernel memory: 38912K
Aug 19 00:15:58.848065 kernel: Run /init as init process
Aug 19 00:15:58.848073 kernel: with arguments:
Aug 19 00:15:58.848082 kernel: /init
Aug 19 00:15:58.848090 kernel: with environment:
Aug 19 00:15:58.848097 kernel: HOME=/
Aug 19 00:15:58.848105 kernel: TERM=linux
Aug 19 00:15:58.848113 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
Aug 19 00:15:58.848121 systemd[1]: Successfully made /usr/ read-only.
Aug 19 00:15:58.848146 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Aug 19 00:15:58.848155 systemd[1]: Detected virtualization kvm.
Aug 19 00:15:58.848166 systemd[1]: Detected architecture arm64.
Aug 19 00:15:58.848173 systemd[1]: Running in initrd.
Aug 19 00:15:58.848181 systemd[1]: No hostname configured, using default hostname.
Aug 19 00:15:58.848190 systemd[1]: Hostname set to .
Aug 19 00:15:58.848197 systemd[1]: Initializing machine ID from VM UUID.
Aug 19 00:15:58.848205 systemd[1]: Queued start job for default target initrd.target.
Aug 19 00:15:58.848214 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Aug 19 00:15:58.848222 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Aug 19 00:15:58.848232 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Aug 19 00:15:58.848241 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Aug 19 00:15:58.848250 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Aug 19 00:15:58.848259 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Aug 19 00:15:58.848269 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
Aug 19 00:15:58.848277 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
Aug 19 00:15:58.848285 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Aug 19 00:15:58.848295 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Aug 19 00:15:58.848303 systemd[1]: Reached target paths.target - Path Units.
Aug 19 00:15:58.848311 systemd[1]: Reached target slices.target - Slice Units.
Aug 19 00:15:58.848333 systemd[1]: Reached target swap.target - Swaps.
Aug 19 00:15:58.848341 systemd[1]: Reached target timers.target - Timer Units.
Aug 19 00:15:58.848349 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Aug 19 00:15:58.848357 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Aug 19 00:15:58.848365 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Aug 19 00:15:58.848375 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Aug 19 00:15:58.848383 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Aug 19 00:15:58.848392 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Aug 19 00:15:58.848404 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Aug 19 00:15:58.848413 systemd[1]: Reached target sockets.target - Socket Units.
Aug 19 00:15:58.848423 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Aug 19 00:15:58.848432 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Aug 19 00:15:58.848440 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Aug 19 00:15:58.848449 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Aug 19 00:15:58.848458 systemd[1]: Starting systemd-fsck-usr.service...
Aug 19 00:15:58.848466 systemd[1]: Starting systemd-journald.service - Journal Service...
Aug 19 00:15:58.848475 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Aug 19 00:15:58.848483 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:15:58.848491 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Aug 19 00:15:58.848501 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Aug 19 00:15:58.848509 systemd[1]: Finished systemd-fsck-usr.service.
Aug 19 00:15:58.848553 systemd-journald[244]: Collecting audit messages is disabled.
Aug 19 00:15:58.848576 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Aug 19 00:15:58.848585 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Aug 19 00:15:58.848593 kernel: Bridge firewalling registered
Aug 19 00:15:58.848601 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Aug 19 00:15:58.848609 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:15:58.848618 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Aug 19 00:15:58.848626 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Aug 19 00:15:58.848635 systemd-journald[244]: Journal started
Aug 19 00:15:58.848655 systemd-journald[244]: Runtime Journal (/run/log/journal/2e1cc688a50b419c906d5e2914990733) is 8M, max 76.5M, 68.5M free.
Aug 19 00:15:58.809478 systemd-modules-load[246]: Inserted module 'overlay'
Aug 19 00:15:58.829719 systemd-modules-load[246]: Inserted module 'br_netfilter'
Aug 19 00:15:58.851413 systemd[1]: Started systemd-journald.service - Journal Service.
Aug 19 00:15:58.854302 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Aug 19 00:15:58.865395 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Aug 19 00:15:58.868742 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Aug 19 00:15:58.874585 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Aug 19 00:15:58.877552 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Aug 19 00:15:58.885286 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Aug 19 00:15:58.888267 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Aug 19 00:15:58.891269 systemd-tmpfiles[274]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Aug 19 00:15:58.900414 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Aug 19 00:15:58.904416 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Aug 19 00:15:58.926336 dracut-cmdline[282]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=a868ccde263e96e0a18737fdbf04ca04bbf30dfe23963f1ae3994966e8fc9468
Aug 19 00:15:58.960818 systemd-resolved[285]: Positive Trust Anchors:
Aug 19 00:15:58.960838 systemd-resolved[285]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Aug 19 00:15:58.960870 systemd-resolved[285]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Aug 19 00:15:58.967413 systemd-resolved[285]: Defaulting to hostname 'linux'.
Aug 19 00:15:58.968656 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Aug 19 00:15:58.970162 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Aug 19 00:15:59.046202 kernel: SCSI subsystem initialized
Aug 19 00:15:59.051169 kernel: Loading iSCSI transport class v2.0-870.
Aug 19 00:15:59.059186 kernel: iscsi: registered transport (tcp)
Aug 19 00:15:59.074447 kernel: iscsi: registered transport (qla4xxx)
Aug 19 00:15:59.074521 kernel: QLogic iSCSI HBA Driver
Aug 19 00:15:59.101466 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Aug 19 00:15:59.129822 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Aug 19 00:15:59.139127 systemd[1]: Reached target network-pre.target - Preparation for Network.
Aug 19 00:15:59.200003 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Aug 19 00:15:59.202781 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Aug 19 00:15:59.277182 kernel: raid6: neonx8 gen() 14843 MB/s
Aug 19 00:15:59.293217 kernel: raid6: neonx4 gen() 13993 MB/s
Aug 19 00:15:59.310236 kernel: raid6: neonx2 gen() 13022 MB/s
Aug 19 00:15:59.327214 kernel: raid6: neonx1 gen() 10187 MB/s
Aug 19 00:15:59.344210 kernel: raid6: int64x8 gen() 6399 MB/s
Aug 19 00:15:59.361216 kernel: raid6: int64x4 gen() 7248 MB/s
Aug 19 00:15:59.378223 kernel: raid6: int64x2 gen() 5920 MB/s
Aug 19 00:15:59.395200 kernel: raid6: int64x1 gen() 4639 MB/s
Aug 19 00:15:59.395274 kernel: raid6: using algorithm neonx8 gen() 14843 MB/s
Aug 19 00:15:59.412281 kernel: raid6: .... xor() 11678 MB/s, rmw enabled
Aug 19 00:15:59.412385 kernel: raid6: using neon recovery algorithm
Aug 19 00:15:59.417174 kernel: xor: measuring software checksum speed
Aug 19 00:15:59.417249 kernel: 8regs : 18903 MB/sec
Aug 19 00:15:59.418407 kernel: 32regs : 17776 MB/sec
Aug 19 00:15:59.418431 kernel: arm64_neon : 27965 MB/sec
Aug 19 00:15:59.418445 kernel: xor: using function: arm64_neon (27965 MB/sec)
Aug 19 00:15:59.475225 kernel: Btrfs loaded, zoned=no, fsverity=no
Aug 19 00:15:59.484890 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Aug 19 00:15:59.488445 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Aug 19 00:15:59.521841 systemd-udevd[493]: Using default interface naming scheme 'v255'.
Aug 19 00:15:59.526490 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Aug 19 00:15:59.531539 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Aug 19 00:15:59.559445 dracut-pre-trigger[502]: rd.md=0: removing MD RAID activation
Aug 19 00:15:59.591702 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Aug 19 00:15:59.594926 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Aug 19 00:15:59.663295 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Aug 19 00:15:59.666980 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Aug 19 00:15:59.764161 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues
Aug 19 00:15:59.769173 kernel: scsi host0: Virtio SCSI HBA
Aug 19 00:15:59.785186 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5
Aug 19 00:15:59.786171 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5
Aug 19 00:15:59.794180 kernel: ACPI: bus type USB registered
Aug 19 00:15:59.794238 kernel: usbcore: registered new interface driver usbfs
Aug 19 00:15:59.794250 kernel: usbcore: registered new interface driver hub
Aug 19 00:15:59.795494 kernel: usbcore: registered new device driver usb
Aug 19 00:15:59.838586 kernel: sd 0:0:0:1: Power-on or device reset occurred
Aug 19 00:15:59.838847 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Aug 19 00:15:59.841918 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB)
Aug 19 00:15:59.842409 kernel: sd 0:0:0:1: [sda] Write Protect is off
Aug 19 00:15:59.842657 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08
Aug 19 00:15:59.842957 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA
Aug 19 00:15:59.840304 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Aug 19 00:15:59.843961 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:15:59.847634 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Aug 19 00:15:59.852788 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Aug 19 00:15:59.852837 kernel: GPT:17805311 != 80003071 Aug 19 00:15:59.852856 kernel: GPT:Alternate GPT header not at the end of the disk. Aug 19 00:15:59.852866 kernel: GPT:17805311 != 80003071 Aug 19 00:15:59.852874 kernel: GPT: Use GNU Parted to correct GPT errors. Aug 19 00:15:59.851082 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully. Aug 19 00:15:59.856339 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 19 00:15:59.856382 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Aug 19 00:15:59.864242 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Aug 19 00:15:59.864459 kernel: sr 0:0:0:0: Power-on or device reset occurred Aug 19 00:15:59.866219 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Aug 19 00:15:59.868290 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Aug 19 00:15:59.868517 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Aug 19 00:15:59.870193 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Aug 19 00:15:59.871378 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Aug 19 00:15:59.874908 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Aug 19 00:15:59.875973 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Aug 19 00:15:59.877521 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Aug 19 00:15:59.877766 kernel: hub 1-0:1.0: USB hub found Aug 19 00:15:59.879166 kernel: hub 1-0:1.0: 4 ports detected Aug 19 00:15:59.882447 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Aug 19 00:15:59.882701 kernel: hub 2-0:1.0: USB hub found Aug 19 00:15:59.884119 kernel: hub 2-0:1.0: 4 ports detected Aug 19 00:15:59.889846 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 00:15:59.930503 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Aug 19 00:15:59.953970 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Aug 19 00:15:59.963971 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Aug 19 00:15:59.973855 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - QEMU_HARDDISK USR-A. Aug 19 00:15:59.976018 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Aug 19 00:15:59.979948 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Aug 19 00:15:59.991468 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Aug 19 00:15:59.993873 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Aug 19 00:15:59.995657 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 00:15:59.997113 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 00:15:59.999260 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Aug 19 00:16:00.001818 disk-uuid[598]: Primary Header is updated. Aug 19 00:16:00.001818 disk-uuid[598]: Secondary Entries is updated. Aug 19 00:16:00.001818 disk-uuid[598]: Secondary Header is updated. Aug 19 00:16:00.014168 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 19 00:16:00.025617 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. 
Aug 19 00:16:00.031211 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 19 00:16:00.122402 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Aug 19 00:16:00.256362 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Aug 19 00:16:00.256426 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Aug 19 00:16:00.257346 kernel: usbcore: registered new interface driver usbhid Aug 19 00:16:00.258149 kernel: usbhid: USB HID core driver Aug 19 00:16:00.362202 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Aug 19 00:16:00.489183 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Aug 19 00:16:00.542218 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Aug 19 00:16:01.035175 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Aug 19 00:16:01.036568 disk-uuid[600]: The operation has completed successfully. Aug 19 00:16:01.106850 systemd[1]: disk-uuid.service: Deactivated successfully. Aug 19 00:16:01.108022 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Aug 19 00:16:01.136679 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... Aug 19 00:16:01.156598 sh[624]: Success Aug 19 00:16:01.172434 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Aug 19 00:16:01.172501 kernel: device-mapper: uevent: version 1.0.3 Aug 19 00:16:01.173784 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Aug 19 00:16:01.184603 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Aug 19 00:16:01.248748 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. Aug 19 00:16:01.252790 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... Aug 19 00:16:01.264111 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. Aug 19 00:16:01.277768 kernel: BTRFS: device fsid 1e492084-d287-4a43-8dc6-ad086a072625 devid 1 transid 45 /dev/mapper/usr (254:0) scanned by mount (637) Aug 19 00:16:01.277836 kernel: BTRFS info (device dm-0): first mount of filesystem 1e492084-d287-4a43-8dc6-ad086a072625 Aug 19 00:16:01.277853 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Aug 19 00:16:01.279164 kernel: BTRFS info (device dm-0): using free-space-tree Aug 19 00:16:01.290234 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. Aug 19 00:16:01.291671 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Aug 19 00:16:01.292851 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Aug 19 00:16:01.294288 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Aug 19 00:16:01.295758 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... 
Aug 19 00:16:01.337600 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (670) Aug 19 00:16:01.337660 kernel: BTRFS info (device sda6): first mount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33 Aug 19 00:16:01.338206 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Aug 19 00:16:01.339194 kernel: BTRFS info (device sda6): using free-space-tree Aug 19 00:16:01.351157 kernel: BTRFS info (device sda6): last unmount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33 Aug 19 00:16:01.351775 systemd[1]: Finished ignition-setup.service - Ignition (setup). Aug 19 00:16:01.353758 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Aug 19 00:16:01.446643 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 19 00:16:01.450368 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 19 00:16:01.496518 systemd-networkd[807]: lo: Link UP Aug 19 00:16:01.497146 systemd-networkd[807]: lo: Gained carrier Aug 19 00:16:01.498826 systemd-networkd[807]: Enumeration completed Aug 19 00:16:01.499991 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 19 00:16:01.500740 systemd[1]: Reached target network.target - Network. Aug 19 00:16:01.502018 systemd-networkd[807]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 00:16:01.502022 systemd-networkd[807]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 00:16:01.504104 systemd-networkd[807]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 00:16:01.504108 systemd-networkd[807]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 00:16:01.504492 systemd-networkd[807]: eth0: Link UP Aug 19 00:16:01.504637 systemd-networkd[807]: eth1: Link UP Aug 19 00:16:01.504759 systemd-networkd[807]: eth0: Gained carrier Aug 19 00:16:01.504770 systemd-networkd[807]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 00:16:01.512393 systemd-networkd[807]: eth1: Gained carrier Aug 19 00:16:01.512419 systemd-networkd[807]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 00:16:01.522216 ignition[721]: Ignition 2.21.0 Aug 19 00:16:01.522230 ignition[721]: Stage: fetch-offline Aug 19 00:16:01.525525 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Aug 19 00:16:01.522446 ignition[721]: no configs at "/usr/lib/ignition/base.d" Aug 19 00:16:01.522458 ignition[721]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 19 00:16:01.528489 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
Aug 19 00:16:01.522708 ignition[721]: parsed url from cmdline: "" Aug 19 00:16:01.522712 ignition[721]: no config URL provided Aug 19 00:16:01.522717 ignition[721]: reading system config file "/usr/lib/ignition/user.ign" Aug 19 00:16:01.522792 ignition[721]: no config at "/usr/lib/ignition/user.ign" Aug 19 00:16:01.522798 ignition[721]: failed to fetch config: resource requires networking Aug 19 00:16:01.523190 ignition[721]: Ignition finished successfully Aug 19 00:16:01.542267 systemd-networkd[807]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Aug 19 00:16:01.561270 systemd-networkd[807]: eth0: DHCPv4 address 91.99.87.156/32, gateway 172.31.1.1 acquired from 172.31.1.1 Aug 19 00:16:01.561847 ignition[816]: Ignition 2.21.0 Aug 19 00:16:01.561862 ignition[816]: Stage: fetch Aug 19 00:16:01.562112 ignition[816]: no configs at "/usr/lib/ignition/base.d" Aug 19 00:16:01.562126 ignition[816]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 19 00:16:01.563594 ignition[816]: parsed url from cmdline: "" Aug 19 00:16:01.563603 ignition[816]: no config URL provided Aug 19 00:16:01.563618 ignition[816]: reading system config file "/usr/lib/ignition/user.ign" Aug 19 00:16:01.563636 ignition[816]: no config at "/usr/lib/ignition/user.ign" Aug 19 00:16:01.563768 ignition[816]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Aug 19 00:16:01.573274 ignition[816]: GET result: OK Aug 19 00:16:01.573459 ignition[816]: parsing config with SHA512: 5faf918f788ac3f69f70e72236cbe9c308b08dbf3bfb55a1d428cc0ca5096b5e50247c3033b0500203349d1ddf5b29e8bcf3dee372a37df7cf253f8c40332b0c Aug 19 00:16:01.584316 unknown[816]: fetched base config from "system" Aug 19 00:16:01.584328 unknown[816]: fetched base config from "system" Aug 19 00:16:01.584783 ignition[816]: fetch: fetch complete Aug 19 00:16:01.584333 unknown[816]: fetched user config from "hetzner" Aug 19 00:16:01.584788 ignition[816]: fetch: fetch passed Aug 19 00:16:01.587609 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Aug 19 00:16:01.584843 ignition[816]: Ignition finished successfully Aug 19 00:16:01.591734 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Aug 19 00:16:01.623629 ignition[823]: Ignition 2.21.0 Aug 19 00:16:01.623648 ignition[823]: Stage: kargs Aug 19 00:16:01.623812 ignition[823]: no configs at "/usr/lib/ignition/base.d" Aug 19 00:16:01.627212 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Aug 19 00:16:01.623822 ignition[823]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 19 00:16:01.624845 ignition[823]: kargs: kargs passed Aug 19 00:16:01.624904 ignition[823]: Ignition finished successfully Aug 19 00:16:01.631366 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Aug 19 00:16:01.667059 ignition[829]: Ignition 2.21.0 Aug 19 00:16:01.667834 ignition[829]: Stage: disks Aug 19 00:16:01.668004 ignition[829]: no configs at "/usr/lib/ignition/base.d" Aug 19 00:16:01.668014 ignition[829]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 19 00:16:01.669648 ignition[829]: disks: disks passed Aug 19 00:16:01.669728 ignition[829]: Ignition finished successfully Aug 19 00:16:01.671425 systemd[1]: Finished ignition-disks.service - Ignition (disks). Aug 19 00:16:01.672740 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Aug 19 00:16:01.673429 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. 
Aug 19 00:16:01.674577 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 00:16:01.675663 systemd[1]: Reached target sysinit.target - System Initialization. Aug 19 00:16:01.676611 systemd[1]: Reached target basic.target - Basic System. Aug 19 00:16:01.678623 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Aug 19 00:16:01.712768 systemd-fsck[838]: ROOT: clean, 15/1628000 files, 120826/1617920 blocks Aug 19 00:16:01.718815 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Aug 19 00:16:01.721222 systemd[1]: Mounting sysroot.mount - /sysroot... Aug 19 00:16:01.804545 kernel: EXT4-fs (sda9): mounted filesystem 593a9299-85f8-44ab-a00f-cf95b7233713 r/w with ordered data mode. Quota mode: none. Aug 19 00:16:01.805216 systemd[1]: Mounted sysroot.mount - /sysroot. Aug 19 00:16:01.806479 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Aug 19 00:16:01.809050 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 19 00:16:01.814172 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Aug 19 00:16:01.820547 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Aug 19 00:16:01.821975 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Aug 19 00:16:01.822025 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 00:16:01.836225 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (846) Aug 19 00:16:01.840682 kernel: BTRFS info (device sda6): first mount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33 Aug 19 00:16:01.840753 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Aug 19 00:16:01.842125 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Aug 19 00:16:01.845777 kernel: BTRFS info (device sda6): using free-space-tree Aug 19 00:16:01.851223 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Aug 19 00:16:01.860693 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Aug 19 00:16:01.905041 coreos-metadata[848]: Aug 19 00:16:01.904 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Aug 19 00:16:01.907716 coreos-metadata[848]: Aug 19 00:16:01.907 INFO Fetch successful Aug 19 00:16:01.909627 coreos-metadata[848]: Aug 19 00:16:01.909 INFO wrote hostname ci-4426-0-0-8-661ee896d9 to /sysroot/etc/hostname Aug 19 00:16:01.915157 initrd-setup-root[875]: cut: /sysroot/etc/passwd: No such file or directory Aug 19 00:16:01.916198 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Aug 19 00:16:01.923715 initrd-setup-root[882]: cut: /sysroot/etc/group: No such file or directory Aug 19 00:16:01.931198 initrd-setup-root[889]: cut: /sysroot/etc/shadow: No such file or directory Aug 19 00:16:01.936125 initrd-setup-root[896]: cut: /sysroot/etc/gshadow: No such file or directory Aug 19 00:16:02.047634 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Aug 19 00:16:02.050553 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Aug 19 00:16:02.052101 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... 
Aug 19 00:16:02.071157 kernel: BTRFS info (device sda6): last unmount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33 Aug 19 00:16:02.093244 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Aug 19 00:16:02.103686 ignition[964]: INFO : Ignition 2.21.0 Aug 19 00:16:02.105893 ignition[964]: INFO : Stage: mount Aug 19 00:16:02.105893 ignition[964]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 00:16:02.105893 ignition[964]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 19 00:16:02.111445 ignition[964]: INFO : mount: mount passed Aug 19 00:16:02.111445 ignition[964]: INFO : Ignition finished successfully Aug 19 00:16:02.111176 systemd[1]: Finished ignition-mount.service - Ignition (mount). Aug 19 00:16:02.113073 systemd[1]: Starting ignition-files.service - Ignition (files)... Aug 19 00:16:02.277662 systemd[1]: sysroot-oem.mount: Deactivated successfully. Aug 19 00:16:02.282151 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Aug 19 00:16:02.312171 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (975) Aug 19 00:16:02.314496 kernel: BTRFS info (device sda6): first mount of filesystem de95eca0-5455-4710-9904-3d3a2312ef33 Aug 19 00:16:02.314568 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Aug 19 00:16:02.314586 kernel: BTRFS info (device sda6): using free-space-tree Aug 19 00:16:02.320712 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Aug 19 00:16:02.354995 ignition[992]: INFO : Ignition 2.21.0 Aug 19 00:16:02.354995 ignition[992]: INFO : Stage: files Aug 19 00:16:02.356445 ignition[992]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 00:16:02.356445 ignition[992]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 19 00:16:02.359666 ignition[992]: DEBUG : files: compiled without relabeling support, skipping Aug 19 00:16:02.359666 ignition[992]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Aug 19 00:16:02.359666 ignition[992]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Aug 19 00:16:02.363440 ignition[992]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Aug 19 00:16:02.363440 ignition[992]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Aug 19 00:16:02.365826 ignition[992]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Aug 19 00:16:02.364563 unknown[992]: wrote ssh authorized keys file for user: core Aug 19 00:16:02.368756 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Aug 19 00:16:02.368756 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Aug 19 00:16:02.467322 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Aug 19 00:16:02.750234 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Aug 19 00:16:02.750234 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Aug 19 00:16:02.753885 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Aug 19 00:16:02.753885 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Aug 19 00:16:02.753885 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Aug 19 00:16:02.753885 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 19 00:16:02.753885 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Aug 19 00:16:02.753885 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 19 00:16:02.753885 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Aug 19 00:16:02.762080 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Aug 19 00:16:02.762080 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Aug 19 00:16:02.762080 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Aug 19 00:16:02.765966 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Aug 19 00:16:02.765966 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Aug 19 00:16:02.765966 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1 Aug 19 00:16:03.046246 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Aug 19 00:16:03.260617 ignition[992]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw" Aug 19 00:16:03.262021 ignition[992]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Aug 19 00:16:03.264494 ignition[992]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 19 00:16:03.267968 ignition[992]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Aug 19 00:16:03.267968 ignition[992]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Aug 19 00:16:03.267968 ignition[992]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Aug 19 00:16:03.267968 ignition[992]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Aug 19 00:16:03.267968 ignition[992]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Aug 19 00:16:03.279727 ignition[992]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service"
Aug 19 00:16:03.279727 ignition[992]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Aug 19 00:16:03.279727 ignition[992]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Aug 19 00:16:03.279727 ignition[992]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Aug 19 00:16:03.279727 ignition[992]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Aug 19 00:16:03.279727 ignition[992]: INFO : files: files passed Aug 19 00:16:03.279727 ignition[992]: INFO : Ignition finished successfully Aug 19 00:16:03.274462 systemd[1]: Finished ignition-files.service - Ignition (files). Aug 19 00:16:03.277215 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Aug 19 00:16:03.280650 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Aug 19 00:16:03.284296 systemd-networkd[807]: eth1: Gained IPv6LL Aug 19 00:16:03.305726 systemd[1]: ignition-quench.service: Deactivated successfully. Aug 19 00:16:03.307958 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Aug 19 00:16:03.313679 initrd-setup-root-after-ignition[1022]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 19 00:16:03.313679 initrd-setup-root-after-ignition[1022]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Aug 19 00:16:03.316537 initrd-setup-root-after-ignition[1026]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Aug 19 00:16:03.319108 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 00:16:03.320057 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Aug 19 00:16:03.322167 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Aug 19 00:16:03.348394 systemd-networkd[807]: eth0: Gained IPv6LL Aug 19 00:16:03.390475 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Aug 19 00:16:03.391336 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Aug 19 00:16:03.393168 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Aug 19 00:16:03.393864 systemd[1]: Reached target initrd.target - Initrd Default Target. Aug 19 00:16:03.394650 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Aug 19 00:16:03.395766 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Aug 19 00:16:03.427617 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 00:16:03.432669 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Aug 19 00:16:03.457864 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Aug 19 00:16:03.459521 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 00:16:03.466754 systemd[1]: Stopped target timers.target - Timer Units. Aug 19 00:16:03.469554 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Aug 19 00:16:03.469809 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Aug 19 00:16:03.471108 systemd[1]: Stopped target initrd.target - Initrd Default Target. Aug 19 00:16:03.472493 systemd[1]: Stopped target basic.target - Basic System. Aug 19 00:16:03.473857 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Aug 19 00:16:03.475472 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Aug 19 00:16:03.476553 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Aug 19 00:16:03.478812 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Aug 19 00:16:03.479926 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Aug 19 00:16:03.480940 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Aug 19 00:16:03.482469 systemd[1]: Stopped target sysinit.target - System Initialization. Aug 19 00:16:03.483464 systemd[1]: Stopped target local-fs.target - Local File Systems. Aug 19 00:16:03.485158 systemd[1]: Stopped target swap.target - Swaps. Aug 19 00:16:03.487003 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Aug 19 00:16:03.487216 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Aug 19 00:16:03.488511 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Aug 19 00:16:03.489707 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 00:16:03.490787 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Aug 19 00:16:03.490907 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 00:16:03.492194 systemd[1]: dracut-initqueue.service: Deactivated successfully. Aug 19 00:16:03.492402 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Aug 19 00:16:03.493919 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Aug 19 00:16:03.494087 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Aug 19 00:16:03.495660 systemd[1]: ignition-files.service: Deactivated successfully. Aug 19 00:16:03.495820 systemd[1]: Stopped ignition-files.service - Ignition (files). Aug 19 00:16:03.496656 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Aug 19 00:16:03.496868 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Aug 19 00:16:03.500494 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Aug 19 00:16:03.504688 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Aug 19 00:16:03.505356 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Aug 19 00:16:03.505640 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 00:16:03.508344 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Aug 19 00:16:03.508528 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Aug 19 00:16:03.515451 systemd[1]: initrd-cleanup.service: Deactivated successfully. Aug 19 00:16:03.515567 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Aug 19 00:16:03.528736 systemd[1]: sysroot-boot.mount: Deactivated successfully. Aug 19 00:16:03.532870 systemd[1]: sysroot-boot.service: Deactivated successfully. Aug 19 00:16:03.533068 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. 
Aug 19 00:16:03.541442 ignition[1046]: INFO : Ignition 2.21.0 Aug 19 00:16:03.541442 ignition[1046]: INFO : Stage: umount Aug 19 00:16:03.544793 ignition[1046]: INFO : no configs at "/usr/lib/ignition/base.d" Aug 19 00:16:03.544793 ignition[1046]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Aug 19 00:16:03.550273 ignition[1046]: INFO : umount: umount passed Aug 19 00:16:03.550273 ignition[1046]: INFO : Ignition finished successfully Aug 19 00:16:03.549554 systemd[1]: ignition-mount.service: Deactivated successfully. Aug 19 00:16:03.549691 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Aug 19 00:16:03.551394 systemd[1]: ignition-disks.service: Deactivated successfully. Aug 19 00:16:03.551483 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Aug 19 00:16:03.552791 systemd[1]: ignition-kargs.service: Deactivated successfully. Aug 19 00:16:03.552868 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Aug 19 00:16:03.554950 systemd[1]: ignition-fetch.service: Deactivated successfully. Aug 19 00:16:03.555028 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Aug 19 00:16:03.556219 systemd[1]: Stopped target network.target - Network. Aug 19 00:16:03.557212 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Aug 19 00:16:03.557292 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Aug 19 00:16:03.558482 systemd[1]: Stopped target paths.target - Path Units. Aug 19 00:16:03.559415 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Aug 19 00:16:03.563236 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 00:16:03.565986 systemd[1]: Stopped target slices.target - Slice Units. Aug 19 00:16:03.568432 systemd[1]: Stopped target sockets.target - Socket Units. Aug 19 00:16:03.570564 systemd[1]: iscsid.socket: Deactivated successfully. Aug 19 00:16:03.570671 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Aug 19 00:16:03.571745 systemd[1]: iscsiuio.socket: Deactivated successfully. Aug 19 00:16:03.571780 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Aug 19 00:16:03.572678 systemd[1]: ignition-setup.service: Deactivated successfully. Aug 19 00:16:03.572742 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Aug 19 00:16:03.573707 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Aug 19 00:16:03.573752 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Aug 19 00:16:03.574681 systemd[1]: initrd-setup-root.service: Deactivated successfully. Aug 19 00:16:03.574736 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Aug 19 00:16:03.576080 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Aug 19 00:16:03.576995 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Aug 19 00:16:03.584551 systemd[1]: systemd-resolved.service: Deactivated successfully. Aug 19 00:16:03.584701 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Aug 19 00:16:03.589931 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully. Aug 19 00:16:03.590946 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Aug 19 00:16:03.591425 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. 
Aug 19 00:16:03.594824 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully. Aug 19 00:16:03.595265 systemd[1]: systemd-networkd.service: Deactivated successfully. Aug 19 00:16:03.595435 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Aug 19 00:16:03.598376 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully. Aug 19 00:16:03.599091 systemd[1]: Stopped target network-pre.target - Preparation for Network. Aug 19 00:16:03.600117 systemd[1]: systemd-networkd.socket: Deactivated successfully. Aug 19 00:16:03.600310 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Aug 19 00:16:03.603436 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Aug 19 00:16:03.603972 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Aug 19 00:16:03.604041 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Aug 19 00:16:03.608055 systemd[1]: systemd-sysctl.service: Deactivated successfully. Aug 19 00:16:03.608124 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Aug 19 00:16:03.612172 systemd[1]: systemd-modules-load.service: Deactivated successfully. Aug 19 00:16:03.612245 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Aug 19 00:16:03.613576 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 00:16:03.619641 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully. Aug 19 00:16:03.632692 systemd[1]: systemd-udevd.service: Deactivated successfully. Aug 19 00:16:03.634846 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 00:16:03.636560 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Aug 19 00:16:03.636617 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Aug 19 00:16:03.637993 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Aug 19 00:16:03.638039 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 00:16:03.639098 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Aug 19 00:16:03.639187 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Aug 19 00:16:03.640674 systemd[1]: dracut-cmdline.service: Deactivated successfully. Aug 19 00:16:03.640751 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Aug 19 00:16:03.642194 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Aug 19 00:16:03.642255 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Aug 19 00:16:03.644259 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Aug 19 00:16:03.647312 systemd[1]: systemd-network-generator.service: Deactivated successfully. Aug 19 00:16:03.647403 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 00:16:03.648437 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Aug 19 00:16:03.648492 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 00:16:03.649887 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully. Aug 19 00:16:03.649937 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
Aug 19 00:16:03.652815 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Aug 19 00:16:03.652870 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 00:16:03.654373 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 19 00:16:03.654446 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 00:16:03.656928 systemd[1]: network-cleanup.service: Deactivated successfully. Aug 19 00:16:03.659921 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Aug 19 00:16:03.668075 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Aug 19 00:16:03.668424 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Aug 19 00:16:03.671331 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Aug 19 00:16:03.673688 systemd[1]: Starting initrd-switch-root.service - Switch Root... Aug 19 00:16:03.695892 systemd[1]: Switching root. Aug 19 00:16:03.733759 systemd-journald[244]: Journal stopped Aug 19 00:16:04.810330 systemd-journald[244]: Received SIGTERM from PID 1 (systemd). Aug 19 00:16:04.810412 kernel: SELinux: policy capability network_peer_controls=1 Aug 19 00:16:04.810425 kernel: SELinux: policy capability open_perms=1 Aug 19 00:16:04.810438 kernel: SELinux: policy capability extended_socket_class=1 Aug 19 00:16:04.810447 kernel: SELinux: policy capability always_check_network=0 Aug 19 00:16:04.810456 kernel: SELinux: policy capability cgroup_seclabel=1 Aug 19 00:16:04.810466 kernel: SELinux: policy capability nnp_nosuid_transition=1 Aug 19 00:16:04.810475 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Aug 19 00:16:04.810485 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Aug 19 00:16:04.810494 kernel: SELinux: policy capability userspace_initial_context=0 Aug 19 00:16:04.810507 kernel: audit: type=1403 audit(1755562563.929:2): auid=4294967295 ses=4294967295 lsm=selinux res=1 Aug 19 00:16:04.810517 systemd[1]: Successfully loaded SELinux policy in 74.212ms. Aug 19 00:16:04.810535 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.497ms. Aug 19 00:16:04.810547 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Aug 19 00:16:04.810558 systemd[1]: Detected virtualization kvm. Aug 19 00:16:04.810569 systemd[1]: Detected architecture arm64. Aug 19 00:16:04.810579 systemd[1]: Detected first boot. Aug 19 00:16:04.810589 systemd[1]: Hostname set to . Aug 19 00:16:04.810599 systemd[1]: Initializing machine ID from VM UUID. Aug 19 00:16:04.810609 zram_generator::config[1090]: No configuration found. Aug 19 00:16:04.810619 kernel: NET: Registered PF_VSOCK protocol family Aug 19 00:16:04.810628 systemd[1]: Populated /etc with preset unit settings. Aug 19 00:16:04.810639 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully. Aug 19 00:16:04.810649 systemd[1]: initrd-switch-root.service: Deactivated successfully. Aug 19 00:16:04.810660 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Aug 19 00:16:04.810669 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. 
Aug 19 00:16:04.810680 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Aug 19 00:16:04.810693 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Aug 19 00:16:04.810703 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Aug 19 00:16:04.810712 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Aug 19 00:16:04.810723 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Aug 19 00:16:04.810736 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Aug 19 00:16:04.810753 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Aug 19 00:16:04.810765 systemd[1]: Created slice user.slice - User and Session Slice. Aug 19 00:16:04.810775 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Aug 19 00:16:04.810785 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Aug 19 00:16:04.810798 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Aug 19 00:16:04.810809 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Aug 19 00:16:04.810822 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Aug 19 00:16:04.810834 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Aug 19 00:16:04.810844 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Aug 19 00:16:04.810854 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Aug 19 00:16:04.810867 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Aug 19 00:16:04.810878 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Aug 19 00:16:04.810888 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Aug 19 00:16:04.810899 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Aug 19 00:16:04.810909 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Aug 19 00:16:04.811232 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Aug 19 00:16:04.811254 systemd[1]: Reached target remote-fs.target - Remote File Systems. Aug 19 00:16:04.811265 systemd[1]: Reached target slices.target - Slice Units. Aug 19 00:16:04.811275 systemd[1]: Reached target swap.target - Swaps. Aug 19 00:16:04.811285 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Aug 19 00:16:04.811308 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Aug 19 00:16:04.811323 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Aug 19 00:16:04.811338 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Aug 19 00:16:04.812192 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Aug 19 00:16:04.812213 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Aug 19 00:16:04.812224 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Aug 19 00:16:04.812234 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Aug 19 00:16:04.812247 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... 
Aug 19 00:16:04.812258 systemd[1]: Mounting media.mount - External Media Directory... Aug 19 00:16:04.812268 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Aug 19 00:16:04.812277 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Aug 19 00:16:04.812292 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Aug 19 00:16:04.812319 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Aug 19 00:16:04.812329 systemd[1]: Reached target machines.target - Containers. Aug 19 00:16:04.812339 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Aug 19 00:16:04.812349 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 00:16:04.812359 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Aug 19 00:16:04.812369 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Aug 19 00:16:04.812401 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 00:16:04.812417 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 00:16:04.812429 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 00:16:04.812438 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Aug 19 00:16:04.812448 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 00:16:04.812458 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Aug 19 00:16:04.812469 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Aug 19 00:16:04.812479 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Aug 19 00:16:04.812492 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Aug 19 00:16:04.812503 systemd[1]: Stopped systemd-fsck-usr.service. Aug 19 00:16:04.812515 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 00:16:04.812525 systemd[1]: Starting systemd-journald.service - Journal Service... Aug 19 00:16:04.812535 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Aug 19 00:16:04.812546 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Aug 19 00:16:04.812558 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Aug 19 00:16:04.812569 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Aug 19 00:16:04.812580 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Aug 19 00:16:04.812593 systemd[1]: verity-setup.service: Deactivated successfully. Aug 19 00:16:04.812603 systemd[1]: Stopped verity-setup.service. Aug 19 00:16:04.812615 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Aug 19 00:16:04.812625 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Aug 19 00:16:04.812636 systemd[1]: Mounted media.mount - External Media Directory. Aug 19 00:16:04.812645 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. 
Aug 19 00:16:04.812655 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Aug 19 00:16:04.812665 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Aug 19 00:16:04.812675 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Aug 19 00:16:04.812685 systemd[1]: modprobe@configfs.service: Deactivated successfully. Aug 19 00:16:04.812695 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Aug 19 00:16:04.812706 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 00:16:04.812716 kernel: loop: module loaded Aug 19 00:16:04.812726 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 00:16:04.812737 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 00:16:04.812746 kernel: fuse: init (API version 7.41) Aug 19 00:16:04.812756 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 00:16:04.812767 systemd[1]: modprobe@fuse.service: Deactivated successfully. Aug 19 00:16:04.812777 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Aug 19 00:16:04.812788 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 00:16:04.812801 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Aug 19 00:16:04.812811 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Aug 19 00:16:04.812821 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Aug 19 00:16:04.812832 systemd[1]: Reached target network-pre.target - Preparation for Network. Aug 19 00:16:04.812842 kernel: ACPI: bus type drm_connector registered Aug 19 00:16:04.814585 systemd-journald[1154]: Collecting audit messages is disabled. Aug 19 00:16:04.814644 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Aug 19 00:16:04.814660 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Aug 19 00:16:04.814680 systemd-journald[1154]: Journal started Aug 19 00:16:04.814704 systemd-journald[1154]: Runtime Journal (/run/log/journal/2e1cc688a50b419c906d5e2914990733) is 8M, max 76.5M, 68.5M free. Aug 19 00:16:04.490264 systemd[1]: Queued start job for default target multi-user.target. Aug 19 00:16:04.507032 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Aug 19 00:16:04.818234 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 00:16:04.507682 systemd[1]: systemd-journald.service: Deactivated successfully. Aug 19 00:16:04.823543 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Aug 19 00:16:04.827770 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Aug 19 00:16:04.834101 systemd[1]: Started systemd-journald.service - Journal Service. Aug 19 00:16:04.837044 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Aug 19 00:16:04.840569 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 00:16:04.841439 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 00:16:04.842550 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Aug 19 00:16:04.846683 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Aug 19 00:16:04.847564 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. 
Aug 19 00:16:04.848597 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Aug 19 00:16:04.876587 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Aug 19 00:16:04.881230 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Aug 19 00:16:04.881291 systemd[1]: Reached target local-fs.target - Local File Systems. Aug 19 00:16:04.883781 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Aug 19 00:16:04.887406 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Aug 19 00:16:04.888338 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 00:16:04.892799 systemd-tmpfiles[1187]: ACLs are not supported, ignoring. Aug 19 00:16:04.892820 systemd-tmpfiles[1187]: ACLs are not supported, ignoring. Aug 19 00:16:04.893366 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Aug 19 00:16:04.897059 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Aug 19 00:16:04.898287 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 19 00:16:04.906934 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Aug 19 00:16:04.911959 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Aug 19 00:16:04.917267 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Aug 19 00:16:04.918633 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Aug 19 00:16:04.932546 systemd[1]: Starting systemd-sysusers.service - Create System Users... Aug 19 00:16:04.943047 systemd-journald[1154]: Time spent on flushing to /var/log/journal/2e1cc688a50b419c906d5e2914990733 is 75.398ms for 1169 entries. Aug 19 00:16:04.943047 systemd-journald[1154]: System Journal (/var/log/journal/2e1cc688a50b419c906d5e2914990733) is 8M, max 584.8M, 576.8M free. Aug 19 00:16:05.028874 systemd-journald[1154]: Received client request to flush runtime journal. Aug 19 00:16:05.031255 kernel: loop0: detected capacity change from 0 to 211168 Aug 19 00:16:05.031283 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Aug 19 00:16:04.941154 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Aug 19 00:16:04.942318 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Aug 19 00:16:04.945625 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Aug 19 00:16:05.021655 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Aug 19 00:16:05.036996 kernel: loop1: detected capacity change from 0 to 100608 Aug 19 00:16:05.034997 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Aug 19 00:16:05.042324 systemd[1]: Finished systemd-sysusers.service - Create System Users. Aug 19 00:16:05.047012 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Aug 19 00:16:05.077169 kernel: loop2: detected capacity change from 0 to 8 Aug 19 00:16:05.082711 systemd-tmpfiles[1229]: ACLs are not supported, ignoring. Aug 19 00:16:05.082728 systemd-tmpfiles[1229]: ACLs are not supported, ignoring. 
Aug 19 00:16:05.091200 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Aug 19 00:16:05.097170 kernel: loop3: detected capacity change from 0 to 119320 Aug 19 00:16:05.140158 kernel: loop4: detected capacity change from 0 to 211168 Aug 19 00:16:05.163176 kernel: loop5: detected capacity change from 0 to 100608 Aug 19 00:16:05.175176 kernel: loop6: detected capacity change from 0 to 8 Aug 19 00:16:05.177233 kernel: loop7: detected capacity change from 0 to 119320 Aug 19 00:16:05.191622 (sd-merge)[1235]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-hetzner'. Aug 19 00:16:05.193018 (sd-merge)[1235]: Merged extensions into '/usr'. Aug 19 00:16:05.201274 systemd[1]: Reload requested from client PID 1216 ('systemd-sysext') (unit systemd-sysext.service)... Aug 19 00:16:05.201308 systemd[1]: Reloading... Aug 19 00:16:05.329113 zram_generator::config[1264]: No configuration found. Aug 19 00:16:05.527348 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Aug 19 00:16:05.527548 systemd[1]: Reloading finished in 325 ms. Aug 19 00:16:05.531160 ldconfig[1212]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Aug 19 00:16:05.543846 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Aug 19 00:16:05.545030 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Aug 19 00:16:05.561466 systemd[1]: Starting ensure-sysext.service... Aug 19 00:16:05.566499 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Aug 19 00:16:05.595405 systemd[1]: Reload requested from client PID 1298 ('systemctl') (unit ensure-sysext.service)... Aug 19 00:16:05.595421 systemd[1]: Reloading... Aug 19 00:16:05.617481 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Aug 19 00:16:05.619716 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Aug 19 00:16:05.620826 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Aug 19 00:16:05.621468 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring. Aug 19 00:16:05.622441 systemd-tmpfiles[1299]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring. Aug 19 00:16:05.622669 systemd-tmpfiles[1299]: ACLs are not supported, ignoring. Aug 19 00:16:05.622717 systemd-tmpfiles[1299]: ACLs are not supported, ignoring. Aug 19 00:16:05.632081 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 00:16:05.632288 systemd-tmpfiles[1299]: Skipping /boot Aug 19 00:16:05.645737 systemd-tmpfiles[1299]: Detected autofs mount point /boot during canonicalization of boot. Aug 19 00:16:05.645895 systemd-tmpfiles[1299]: Skipping /boot Aug 19 00:16:05.707162 zram_generator::config[1329]: No configuration found. Aug 19 00:16:05.876208 systemd[1]: Reloading finished in 280 ms. Aug 19 00:16:05.890874 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Aug 19 00:16:05.899005 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Aug 19 00:16:05.909349 systemd[1]: Starting audit-rules.service - Load Audit Rules... 
Aug 19 00:16:05.913315 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Aug 19 00:16:05.916415 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Aug 19 00:16:05.919362 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Aug 19 00:16:05.925175 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Aug 19 00:16:05.927639 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Aug 19 00:16:05.933651 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 00:16:05.940742 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Aug 19 00:16:05.945244 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Aug 19 00:16:05.950930 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Aug 19 00:16:05.951792 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 00:16:05.951929 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 00:16:05.956264 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 00:16:05.956438 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 00:16:05.956512 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 00:16:05.960720 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Aug 19 00:16:05.966266 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Aug 19 00:16:05.969497 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Aug 19 00:16:05.971400 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Aug 19 00:16:05.971548 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Aug 19 00:16:05.977617 systemd[1]: Finished ensure-sysext.service. Aug 19 00:16:05.986306 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Aug 19 00:16:05.993703 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Aug 19 00:16:05.994979 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Aug 19 00:16:06.001432 systemd[1]: Starting systemd-update-done.service - Update is Completed... Aug 19 00:16:06.029322 systemd-udevd[1370]: Using default interface naming scheme 'v255'. Aug 19 00:16:06.038803 systemd[1]: modprobe@loop.service: Deactivated successfully. Aug 19 00:16:06.039018 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Aug 19 00:16:06.046179 systemd[1]: Finished systemd-update-done.service - Update is Completed. Aug 19 00:16:06.047274 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Aug 19 00:16:06.049615 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Aug 19 00:16:06.053325 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Aug 19 00:16:06.054785 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Aug 19 00:16:06.054988 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Aug 19 00:16:06.056441 systemd[1]: modprobe@drm.service: Deactivated successfully. Aug 19 00:16:06.058669 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Aug 19 00:16:06.066282 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Aug 19 00:16:06.066419 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Aug 19 00:16:06.066449 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Aug 19 00:16:06.080174 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Aug 19 00:16:06.086549 augenrules[1410]: No rules Aug 19 00:16:06.085579 systemd[1]: Starting systemd-networkd.service - Network Configuration... Aug 19 00:16:06.091281 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 00:16:06.091574 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 00:16:06.098863 systemd[1]: Started systemd-userdbd.service - User Database Manager. Aug 19 00:16:06.205704 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Aug 19 00:16:06.369174 kernel: mousedev: PS/2 mouse device common for all mice Aug 19 00:16:06.390911 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Aug 19 00:16:06.395465 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Aug 19 00:16:06.448197 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Aug 19 00:16:06.489783 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Aug 19 00:16:06.491445 systemd[1]: Reached target time-set.target - System Time Set. Aug 19 00:16:06.493499 systemd-networkd[1412]: lo: Link UP Aug 19 00:16:06.493507 systemd-networkd[1412]: lo: Gained carrier Aug 19 00:16:06.495721 systemd-networkd[1412]: Enumeration completed Aug 19 00:16:06.495915 systemd-timesyncd[1384]: No network connectivity, watching for changes. Aug 19 00:16:06.496345 systemd[1]: Started systemd-networkd.service - Network Configuration. Aug 19 00:16:06.496878 systemd-networkd[1412]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 00:16:06.496955 systemd-networkd[1412]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 00:16:06.498024 systemd-networkd[1412]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. 
Aug 19 00:16:06.498109 systemd-networkd[1412]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Aug 19 00:16:06.499029 systemd-networkd[1412]: eth0: Link UP Aug 19 00:16:06.499196 systemd-networkd[1412]: eth0: Gained carrier Aug 19 00:16:06.499211 systemd-networkd[1412]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 00:16:06.500851 systemd-resolved[1369]: Positive Trust Anchors: Aug 19 00:16:06.500874 systemd-resolved[1369]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Aug 19 00:16:06.500905 systemd-resolved[1369]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Aug 19 00:16:06.501571 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Aug 19 00:16:06.505414 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Aug 19 00:16:06.505644 systemd-networkd[1412]: eth1: Link UP Aug 19 00:16:06.506759 systemd-networkd[1412]: eth1: Gained carrier Aug 19 00:16:06.506790 systemd-networkd[1412]: eth1: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. Aug 19 00:16:06.517565 systemd-resolved[1369]: Using system hostname 'ci-4426-0-0-8-661ee896d9'. Aug 19 00:16:06.523925 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Aug 19 00:16:06.526217 systemd-networkd[1412]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Aug 19 00:16:06.527058 systemd-timesyncd[1384]: Network configuration changed, trying to establish connection. Aug 19 00:16:06.527270 systemd[1]: Reached target network.target - Network. Aug 19 00:16:06.528377 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Aug 19 00:16:06.529696 systemd[1]: Reached target sysinit.target - System Initialization. Aug 19 00:16:06.531322 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Aug 19 00:16:06.532365 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Aug 19 00:16:06.534453 systemd[1]: Started logrotate.timer - Daily rotation of log files. Aug 19 00:16:06.535169 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Aug 19 00:16:06.536402 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Aug 19 00:16:06.537731 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Aug 19 00:16:06.537770 systemd[1]: Reached target paths.target - Path Units. Aug 19 00:16:06.538737 systemd[1]: Reached target timers.target - Timer Units. Aug 19 00:16:06.541496 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Aug 19 00:16:06.545114 systemd[1]: Starting docker.socket - Docker Socket for the API... 
Aug 19 00:16:06.551032 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Aug 19 00:16:06.553547 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Aug 19 00:16:06.554919 systemd[1]: Reached target ssh-access.target - SSH Access Available. Aug 19 00:16:06.558710 systemd-networkd[1412]: eth0: DHCPv4 address 91.99.87.156/32, gateway 172.31.1.1 acquired from 172.31.1.1 Aug 19 00:16:06.564357 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Aug 19 00:16:06.566332 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Aug 19 00:16:06.571180 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Aug 19 00:16:06.572642 systemd[1]: Listening on docker.socket - Docker Socket for the API. Aug 19 00:16:06.573964 systemd[1]: Reached target sockets.target - Socket Units. Aug 19 00:16:06.576362 systemd[1]: Reached target basic.target - Basic System. Aug 19 00:16:06.576973 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Aug 19 00:16:06.577011 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Aug 19 00:16:06.578927 systemd[1]: Starting containerd.service - containerd container runtime... Aug 19 00:16:06.581158 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Aug 19 00:16:06.581251 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Aug 19 00:16:06.581286 kernel: [drm] features: -context_init Aug 19 00:16:06.582776 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Aug 19 00:16:06.590406 kernel: [drm] number of scanouts: 1 Aug 19 00:16:06.590500 kernel: [drm] number of cap sets: 0 Aug 19 00:16:06.588368 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Aug 19 00:16:06.592196 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Aug 19 00:16:06.599218 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Aug 19 00:16:06.600557 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Aug 19 00:16:06.603455 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Aug 19 00:16:06.604048 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Aug 19 00:16:06.608873 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Aug 19 00:16:06.612398 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Aug 19 00:16:06.615914 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Aug 19 00:16:06.620092 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Aug 19 00:16:06.625952 systemd[1]: Starting systemd-logind.service - User Login Management... Aug 19 00:16:06.629652 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Aug 19 00:16:06.635483 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Aug 19 00:16:06.638801 systemd[1]: Starting update-engine.service - Update Engine... 
Aug 19 00:16:06.644617 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Aug 19 00:16:06.649206 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Aug 19 00:16:06.658844 kernel: Console: switching to colour frame buffer device 160x50 Aug 19 00:16:06.654522 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Aug 19 00:16:06.662229 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Aug 19 00:16:06.668577 jq[1483]: false Aug 19 00:16:06.674156 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Aug 19 00:16:06.679650 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Aug 19 00:16:06.680592 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Aug 19 00:16:06.682679 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Aug 19 00:16:06.682924 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Aug 19 00:16:06.704794 jq[1493]: true Aug 19 00:16:06.739392 coreos-metadata[1480]: Aug 19 00:16:06.739 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Aug 19 00:16:06.744331 systemd-timesyncd[1384]: Contacted time server 194.164.164.175:123 (1.flatcar.pool.ntp.org). Aug 19 00:16:06.744447 systemd-timesyncd[1384]: Initial clock synchronization to Tue 2025-08-19 00:16:06.485042 UTC. Aug 19 00:16:06.744912 tar[1502]: linux-arm64/LICENSE Aug 19 00:16:06.748720 extend-filesystems[1484]: Found /dev/sda6 Aug 19 00:16:06.754117 tar[1502]: linux-arm64/helm Aug 19 00:16:06.747206 (ntainerd)[1513]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR Aug 19 00:16:06.754554 coreos-metadata[1480]: Aug 19 00:16:06.750 INFO Fetch successful Aug 19 00:16:06.754554 coreos-metadata[1480]: Aug 19 00:16:06.751 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Aug 19 00:16:06.754443 systemd[1]: motdgen.service: Deactivated successfully. Aug 19 00:16:06.755619 dbus-daemon[1481]: [system] SELinux support is enabled Aug 19 00:16:06.755978 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Aug 19 00:16:06.756897 systemd[1]: Started dbus.service - D-Bus System Message Bus. Aug 19 00:16:06.765560 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Aug 19 00:16:06.765600 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Aug 19 00:16:06.767074 coreos-metadata[1480]: Aug 19 00:16:06.766 INFO Fetch successful Aug 19 00:16:06.768392 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Aug 19 00:16:06.768420 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Aug 19 00:16:06.774768 extend-filesystems[1484]: Found /dev/sda9 Aug 19 00:16:06.787834 update_engine[1492]: I20250819 00:16:06.780817 1492 main.cc:92] Flatcar Update Engine starting Aug 19 00:16:06.789346 extend-filesystems[1484]: Checking size of /dev/sda9 Aug 19 00:16:06.803768 systemd[1]: Started update-engine.service - Update Engine. Aug 19 00:16:06.804740 update_engine[1492]: I20250819 00:16:06.804376 1492 update_check_scheduler.cc:74] Next update check in 2m28s Aug 19 00:16:06.804777 jq[1516]: true Aug 19 00:16:06.816535 systemd[1]: Started locksmithd.service - Cluster reboot manager. Aug 19 00:16:06.833169 extend-filesystems[1484]: Resized partition /dev/sda9 Aug 19 00:16:06.843160 extend-filesystems[1535]: resize2fs 1.47.2 (1-Jan-2025) Aug 19 00:16:06.863412 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 9393147 blocks Aug 19 00:16:06.939266 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Aug 19 00:16:06.977641 systemd-logind[1491]: New seat seat0. Aug 19 00:16:06.980521 systemd[1]: Started systemd-logind.service - User Login Management. Aug 19 00:16:07.004803 bash[1568]: Updated "/home/core/.ssh/authorized_keys" Aug 19 00:16:07.010193 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Aug 19 00:16:07.015633 systemd[1]: Starting sshkeys.service... Aug 19 00:16:07.057150 kernel: EXT4-fs (sda9): resized filesystem to 9393147 Aug 19 00:16:07.075198 extend-filesystems[1535]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Aug 19 00:16:07.075198 extend-filesystems[1535]: old_desc_blocks = 1, new_desc_blocks = 5 Aug 19 00:16:07.075198 extend-filesystems[1535]: The filesystem on /dev/sda9 is now 9393147 (4k) blocks long. Aug 19 00:16:07.079579 extend-filesystems[1484]: Resized filesystem in /dev/sda9 Aug 19 00:16:07.076533 systemd[1]: extend-filesystems.service: Deactivated successfully. Aug 19 00:16:07.078217 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Aug 19 00:16:07.081234 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Aug 19 00:16:07.090684 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Aug 19 00:16:07.091514 systemd-logind[1491]: Watching system buttons on /dev/input/event0 (Power Button) Aug 19 00:16:07.096834 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Aug 19 00:16:07.097729 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Aug 19 00:16:07.132800 systemd-logind[1491]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Aug 19 00:16:07.192569 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Aug 19 00:16:07.192870 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Aug 19 00:16:07.199468 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
Aug 19 00:16:07.259196 containerd[1513]: time="2025-08-19T00:16:07Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Aug 19 00:16:07.261360 containerd[1513]: time="2025-08-19T00:16:07.261066527Z" level=info msg="starting containerd" revision=fb4c30d4ede3531652d86197bf3fc9515e5276d9 version=v2.0.5 Aug 19 00:16:07.297375 coreos-metadata[1580]: Aug 19 00:16:07.295 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Aug 19 00:16:07.308610 coreos-metadata[1580]: Aug 19 00:16:07.308 INFO Fetch successful Aug 19 00:16:07.311331 unknown[1580]: wrote ssh authorized keys file for user: core Aug 19 00:16:07.328151 containerd[1513]: time="2025-08-19T00:16:07.327860332Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="11.998µs" Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335307966Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335371673Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335577464Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335598944Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335626928Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335687732Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335702440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335962146Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335982117Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.335993573Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.336001663Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337034 containerd[1513]: time="2025-08-19T00:16:07.336151913Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337528 containerd[1513]: time="2025-08-19T00:16:07.336354336Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337528 containerd[1513]: time="2025-08-19T00:16:07.336382242Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Aug 19 00:16:07.337528 containerd[1513]: time="2025-08-19T00:16:07.336393234Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Aug 19 00:16:07.337528 containerd[1513]: time="2025-08-19T00:16:07.336559469Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Aug 19 00:16:07.338585 containerd[1513]: time="2025-08-19T00:16:07.337998959Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Aug 19 00:16:07.339004 containerd[1513]: time="2025-08-19T00:16:07.338935680Z" level=info msg="metadata content store policy set" policy=shared Aug 19 00:16:07.346441 containerd[1513]: time="2025-08-19T00:16:07.346369109Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Aug 19 00:16:07.346744 containerd[1513]: time="2025-08-19T00:16:07.346598393Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Aug 19 00:16:07.346744 containerd[1513]: time="2025-08-19T00:16:07.346625834Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Aug 19 00:16:07.346916 containerd[1513]: time="2025-08-19T00:16:07.346890842Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Aug 19 00:16:07.347408 containerd[1513]: time="2025-08-19T00:16:07.346972392Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Aug 19 00:16:07.347408 containerd[1513]: time="2025-08-19T00:16:07.346991087Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Aug 19 00:16:07.347408 containerd[1513]: time="2025-08-19T00:16:07.347004633Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Aug 19 00:16:07.347408 containerd[1513]: time="2025-08-19T00:16:07.347030681Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Aug 19 00:16:07.347408 containerd[1513]: time="2025-08-19T00:16:07.347045002Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Aug 19 00:16:07.347408 containerd[1513]: time="2025-08-19T00:16:07.347072830Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Aug 19 00:16:07.347408 containerd[1513]: time="2025-08-19T00:16:07.347084751Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Aug 19 00:16:07.351858 containerd[1513]: time="2025-08-19T00:16:07.351815576Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352177809Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352220810Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers 
type=io.containerd.grpc.v1 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352241749Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352260752Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352276389Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352294464Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352308049Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352324073Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352341025Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Aug 19 00:16:07.352377 containerd[1513]: time="2025-08-19T00:16:07.352357320Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Aug 19 00:16:07.353003 containerd[1513]: time="2025-08-19T00:16:07.352719166Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Aug 19 00:16:07.363934 containerd[1513]: time="2025-08-19T00:16:07.355187338Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Aug 19 00:16:07.364746 containerd[1513]: time="2025-08-19T00:16:07.364173970Z" level=info msg="Start snapshots syncer" Aug 19 00:16:07.364746 containerd[1513]: time="2025-08-19T00:16:07.364224905Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Aug 19 00:16:07.364746 containerd[1513]: time="2025-08-19T00:16:07.364481360Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Aug 19 00:16:07.364965 containerd[1513]: time="2025-08-19T00:16:07.364525134Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.366879593Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367383833Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367421569Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367435039Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367447850Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367461938Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367474207Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367486709Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367518833Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Aug 19 00:16:07.368243 containerd[1513]: 
time="2025-08-19T00:16:07.367529980Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367542095Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367579947Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367596435Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Aug 19 00:16:07.368243 containerd[1513]: time="2025-08-19T00:16:07.367605376Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 00:16:07.368612 containerd[1513]: time="2025-08-19T00:16:07.367615594Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Aug 19 00:16:07.368612 containerd[1513]: time="2025-08-19T00:16:07.367623645Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Aug 19 00:16:07.368612 containerd[1513]: time="2025-08-19T00:16:07.367634095Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Aug 19 00:16:07.368612 containerd[1513]: time="2025-08-19T00:16:07.367649692Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Aug 19 00:16:07.368612 containerd[1513]: time="2025-08-19T00:16:07.367759071Z" level=info msg="runtime interface created" Aug 19 00:16:07.368612 containerd[1513]: time="2025-08-19T00:16:07.367765380Z" level=info msg="created NRI interface" Aug 19 00:16:07.368612 containerd[1513]: time="2025-08-19T00:16:07.367773701Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Aug 19 00:16:07.368612 containerd[1513]: time="2025-08-19T00:16:07.367787673Z" level=info msg="Connect containerd service" Aug 19 00:16:07.368612 containerd[1513]: time="2025-08-19T00:16:07.367817824Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Aug 19 00:16:07.371922 containerd[1513]: time="2025-08-19T00:16:07.371830607Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 00:16:07.373388 update-ssh-keys[1595]: Updated "/home/core/.ssh/authorized_keys" Aug 19 00:16:07.375735 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Aug 19 00:16:07.396750 systemd[1]: Finished sshkeys.service. Aug 19 00:16:07.446217 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Aug 19 00:16:07.472664 locksmithd[1528]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Aug 19 00:16:07.574476 containerd[1513]: time="2025-08-19T00:16:07.574233004Z" level=info msg="Start subscribing containerd event" Aug 19 00:16:07.574476 containerd[1513]: time="2025-08-19T00:16:07.574310258Z" level=info msg="Start recovering state" Aug 19 00:16:07.574691 containerd[1513]: time="2025-08-19T00:16:07.574619467Z" level=info msg="Start event monitor" Aug 19 00:16:07.574691 containerd[1513]: time="2025-08-19T00:16:07.574641644Z" level=info msg="Start cni network conf syncer for default" Aug 19 00:16:07.574691 containerd[1513]: time="2025-08-19T00:16:07.574651862Z" level=info msg="Start streaming server" Aug 19 00:16:07.574691 containerd[1513]: time="2025-08-19T00:16:07.574662390Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Aug 19 00:16:07.574691 containerd[1513]: time="2025-08-19T00:16:07.574669589Z" level=info msg="runtime interface starting up..." Aug 19 00:16:07.574861 containerd[1513]: time="2025-08-19T00:16:07.574784811Z" level=info msg="starting plugins..." Aug 19 00:16:07.574861 containerd[1513]: time="2025-08-19T00:16:07.574808111Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Aug 19 00:16:07.577163 containerd[1513]: time="2025-08-19T00:16:07.576318004Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Aug 19 00:16:07.577163 containerd[1513]: time="2025-08-19T00:16:07.576377299Z" level=info msg=serving... address=/run/containerd/containerd.sock Aug 19 00:16:07.578478 containerd[1513]: time="2025-08-19T00:16:07.578222487Z" level=info msg="containerd successfully booted in 0.320943s" Aug 19 00:16:07.578336 systemd[1]: Started containerd.service - containerd container runtime. Aug 19 00:16:07.690533 tar[1502]: linux-arm64/README.md Aug 19 00:16:07.708989 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Aug 19 00:16:07.989070 sshd_keygen[1517]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Aug 19 00:16:08.012947 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Aug 19 00:16:08.018697 systemd[1]: Starting issuegen.service - Generate /run/issue... Aug 19 00:16:08.020630 systemd-networkd[1412]: eth0: Gained IPv6LL Aug 19 00:16:08.028652 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Aug 19 00:16:08.032027 systemd[1]: Reached target network-online.target - Network is Online. Aug 19 00:16:08.037000 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:16:08.039438 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Aug 19 00:16:08.040868 systemd[1]: issuegen.service: Deactivated successfully. Aug 19 00:16:08.042236 systemd[1]: Finished issuegen.service - Generate /run/issue. Aug 19 00:16:08.052251 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Aug 19 00:16:08.084781 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Aug 19 00:16:08.089560 systemd[1]: Started getty@tty1.service - Getty on tty1. Aug 19 00:16:08.093178 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Aug 19 00:16:08.094416 systemd[1]: Reached target getty.target - Login Prompts. Aug 19 00:16:08.096178 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. 
Aug 19 00:16:08.340580 systemd-networkd[1412]: eth1: Gained IPv6LL Aug 19 00:16:08.936349 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:16:08.937984 systemd[1]: Reached target multi-user.target - Multi-User System. Aug 19 00:16:08.942269 systemd[1]: Startup finished in 2.285s (kernel) + 5.292s (initrd) + 5.086s (userspace) = 12.665s. Aug 19 00:16:08.950237 (kubelet)[1655]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:16:09.521581 kubelet[1655]: E0819 00:16:09.521496 1655 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:16:09.523889 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:16:09.524021 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:16:09.524684 systemd[1]: kubelet.service: Consumed 1.002s CPU time, 259.4M memory peak. Aug 19 00:16:19.525183 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Aug 19 00:16:19.527291 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:16:19.686380 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:16:19.697616 (kubelet)[1674]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:16:19.744808 kubelet[1674]: E0819 00:16:19.744739 1674 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:16:19.748718 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:16:19.748907 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:16:19.749806 systemd[1]: kubelet.service: Consumed 167ms CPU time, 104.8M memory peak. Aug 19 00:16:29.775048 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Aug 19 00:16:29.777339 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:16:29.942414 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:16:29.952863 (kubelet)[1689]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:16:30.005326 kubelet[1689]: E0819 00:16:30.005248 1689 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:16:30.008404 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:16:30.008756 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:16:30.009415 systemd[1]: kubelet.service: Consumed 176ms CPU time, 106.5M memory peak. 
Aug 19 00:16:40.025087 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Aug 19 00:16:40.027734 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:16:40.194231 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:16:40.207117 (kubelet)[1704]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:16:40.256898 kubelet[1704]: E0819 00:16:40.256838 1704 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:16:40.260234 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:16:40.260444 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:16:40.261213 systemd[1]: kubelet.service: Consumed 175ms CPU time, 104.2M memory peak. Aug 19 00:16:41.833694 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Aug 19 00:16:41.835516 systemd[1]: Started sshd@0-91.99.87.156:22-176.65.148.235:46344.service - OpenSSH per-connection server daemon (176.65.148.235:46344). Aug 19 00:16:45.703358 sshd[1712]: kex_exchange_identification: read: Connection reset by peer Aug 19 00:16:45.703358 sshd[1712]: Connection reset by 176.65.148.235 port 46344 Aug 19 00:16:45.704950 systemd[1]: sshd@0-91.99.87.156:22-176.65.148.235:46344.service: Deactivated successfully. Aug 19 00:16:50.177692 systemd[1]: Started sshd@1-91.99.87.156:22-139.178.89.65:33084.service - OpenSSH per-connection server daemon (139.178.89.65:33084). Aug 19 00:16:50.275254 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Aug 19 00:16:50.279491 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:16:50.459449 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:16:50.474050 (kubelet)[1728]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:16:50.526834 kubelet[1728]: E0819 00:16:50.526784 1728 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:16:50.533850 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:16:50.533992 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:16:50.535099 systemd[1]: kubelet.service: Consumed 172ms CPU time, 107.2M memory peak. Aug 19 00:16:51.179064 sshd[1717]: Accepted publickey for core from 139.178.89.65 port 33084 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4 Aug 19 00:16:51.182940 sshd-session[1717]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:16:51.197720 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Aug 19 00:16:51.199281 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Aug 19 00:16:51.203927 systemd-logind[1491]: New session 1 of user core. 
Aug 19 00:16:51.228819 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Aug 19 00:16:51.232005 systemd[1]: Starting user@500.service - User Manager for UID 500... Aug 19 00:16:51.246868 (systemd)[1737]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Aug 19 00:16:51.250689 systemd-logind[1491]: New session c1 of user core. Aug 19 00:16:51.405220 systemd[1737]: Queued start job for default target default.target. Aug 19 00:16:51.417001 systemd[1737]: Created slice app.slice - User Application Slice. Aug 19 00:16:51.417059 systemd[1737]: Reached target paths.target - Paths. Aug 19 00:16:51.417125 systemd[1737]: Reached target timers.target - Timers. Aug 19 00:16:51.419965 systemd[1737]: Starting dbus.socket - D-Bus User Message Bus Socket... Aug 19 00:16:51.453254 systemd[1737]: Listening on dbus.socket - D-Bus User Message Bus Socket. Aug 19 00:16:51.453394 systemd[1737]: Reached target sockets.target - Sockets. Aug 19 00:16:51.453603 systemd[1737]: Reached target basic.target - Basic System. Aug 19 00:16:51.453687 systemd[1]: Started user@500.service - User Manager for UID 500. Aug 19 00:16:51.454358 systemd[1737]: Reached target default.target - Main User Target. Aug 19 00:16:51.454414 systemd[1737]: Startup finished in 195ms. Aug 19 00:16:51.465506 systemd[1]: Started session-1.scope - Session 1 of User core. Aug 19 00:16:52.155893 systemd[1]: Started sshd@2-91.99.87.156:22-139.178.89.65:33096.service - OpenSSH per-connection server daemon (139.178.89.65:33096). Aug 19 00:16:52.278350 update_engine[1492]: I20250819 00:16:52.278030 1492 update_attempter.cc:509] Updating boot flags... Aug 19 00:16:53.164570 sshd[1748]: Accepted publickey for core from 139.178.89.65 port 33096 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4 Aug 19 00:16:53.166751 sshd-session[1748]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:16:53.173735 systemd-logind[1491]: New session 2 of user core. Aug 19 00:16:53.185577 systemd[1]: Started session-2.scope - Session 2 of User core. Aug 19 00:16:53.850172 sshd[1767]: Connection closed by 139.178.89.65 port 33096 Aug 19 00:16:53.851243 sshd-session[1748]: pam_unix(sshd:session): session closed for user core Aug 19 00:16:53.857490 systemd[1]: sshd@2-91.99.87.156:22-139.178.89.65:33096.service: Deactivated successfully. Aug 19 00:16:53.859584 systemd[1]: session-2.scope: Deactivated successfully. Aug 19 00:16:53.860845 systemd-logind[1491]: Session 2 logged out. Waiting for processes to exit. Aug 19 00:16:53.862832 systemd-logind[1491]: Removed session 2. Aug 19 00:16:54.031656 systemd[1]: Started sshd@3-91.99.87.156:22-139.178.89.65:33106.service - OpenSSH per-connection server daemon (139.178.89.65:33106). Aug 19 00:16:55.058076 sshd[1773]: Accepted publickey for core from 139.178.89.65 port 33106 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4 Aug 19 00:16:55.060508 sshd-session[1773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:16:55.067291 systemd-logind[1491]: New session 3 of user core. Aug 19 00:16:55.075406 systemd[1]: Started session-3.scope - Session 3 of User core. Aug 19 00:16:55.744269 sshd[1776]: Connection closed by 139.178.89.65 port 33106 Aug 19 00:16:55.744948 sshd-session[1773]: pam_unix(sshd:session): session closed for user core Aug 19 00:16:55.750046 systemd-logind[1491]: Session 3 logged out. Waiting for processes to exit. 
Aug 19 00:16:55.750516 systemd[1]: sshd@3-91.99.87.156:22-139.178.89.65:33106.service: Deactivated successfully. Aug 19 00:16:55.753202 systemd[1]: session-3.scope: Deactivated successfully. Aug 19 00:16:55.756763 systemd-logind[1491]: Removed session 3. Aug 19 00:16:55.916369 systemd[1]: Started sshd@4-91.99.87.156:22-139.178.89.65:33118.service - OpenSSH per-connection server daemon (139.178.89.65:33118). Aug 19 00:16:56.923091 sshd[1782]: Accepted publickey for core from 139.178.89.65 port 33118 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4 Aug 19 00:16:56.925754 sshd-session[1782]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:16:56.930824 systemd-logind[1491]: New session 4 of user core. Aug 19 00:16:56.939463 systemd[1]: Started session-4.scope - Session 4 of User core. Aug 19 00:16:57.607019 sshd[1785]: Connection closed by 139.178.89.65 port 33118 Aug 19 00:16:57.608229 sshd-session[1782]: pam_unix(sshd:session): session closed for user core Aug 19 00:16:57.615466 systemd[1]: sshd@4-91.99.87.156:22-139.178.89.65:33118.service: Deactivated successfully. Aug 19 00:16:57.617754 systemd[1]: session-4.scope: Deactivated successfully. Aug 19 00:16:57.618891 systemd-logind[1491]: Session 4 logged out. Waiting for processes to exit. Aug 19 00:16:57.621489 systemd-logind[1491]: Removed session 4. Aug 19 00:16:57.783512 systemd[1]: Started sshd@5-91.99.87.156:22-139.178.89.65:33130.service - OpenSSH per-connection server daemon (139.178.89.65:33130). Aug 19 00:16:58.791780 sshd[1791]: Accepted publickey for core from 139.178.89.65 port 33130 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4 Aug 19 00:16:58.793955 sshd-session[1791]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:16:58.801738 systemd-logind[1491]: New session 5 of user core. Aug 19 00:16:58.809478 systemd[1]: Started session-5.scope - Session 5 of User core. Aug 19 00:16:59.328346 sudo[1795]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Aug 19 00:16:59.328847 sudo[1795]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:16:59.348191 sudo[1795]: pam_unix(sudo:session): session closed for user root Aug 19 00:16:59.508834 sshd[1794]: Connection closed by 139.178.89.65 port 33130 Aug 19 00:16:59.510241 sshd-session[1791]: pam_unix(sshd:session): session closed for user core Aug 19 00:16:59.514618 systemd[1]: session-5.scope: Deactivated successfully. Aug 19 00:16:59.516661 systemd-logind[1491]: Session 5 logged out. Waiting for processes to exit. Aug 19 00:16:59.517326 systemd[1]: sshd@5-91.99.87.156:22-139.178.89.65:33130.service: Deactivated successfully. Aug 19 00:16:59.523059 systemd-logind[1491]: Removed session 5. Aug 19 00:16:59.704077 systemd[1]: Started sshd@6-91.99.87.156:22-139.178.89.65:51192.service - OpenSSH per-connection server daemon (139.178.89.65:51192). Aug 19 00:17:00.585693 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Aug 19 00:17:00.591061 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:17:00.766362 sshd[1801]: Accepted publickey for core from 139.178.89.65 port 51192 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4 Aug 19 00:17:00.768679 sshd-session[1801]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:00.775233 systemd-logind[1491]: New session 6 of user core. 
Aug 19 00:17:00.777780 systemd[1]: Started session-6.scope - Session 6 of User core. Aug 19 00:17:00.781362 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:17:00.793593 (kubelet)[1811]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:17:00.843415 kubelet[1811]: E0819 00:17:00.843008 1811 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:17:00.846915 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:17:00.847125 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:17:00.848376 systemd[1]: kubelet.service: Consumed 184ms CPU time, 107.2M memory peak. Aug 19 00:17:01.321421 sudo[1821]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Aug 19 00:17:01.321715 sudo[1821]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:17:01.328898 sudo[1821]: pam_unix(sudo:session): session closed for user root Aug 19 00:17:01.335878 sudo[1820]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Aug 19 00:17:01.336602 sudo[1820]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:17:01.347839 systemd[1]: Starting audit-rules.service - Load Audit Rules... Aug 19 00:17:01.394702 augenrules[1843]: No rules Aug 19 00:17:01.396371 systemd[1]: audit-rules.service: Deactivated successfully. Aug 19 00:17:01.396761 systemd[1]: Finished audit-rules.service - Load Audit Rules. Aug 19 00:17:01.398884 sudo[1820]: pam_unix(sudo:session): session closed for user root Aug 19 00:17:01.569256 sshd[1813]: Connection closed by 139.178.89.65 port 51192 Aug 19 00:17:01.569939 sshd-session[1801]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:01.575602 systemd[1]: sshd@6-91.99.87.156:22-139.178.89.65:51192.service: Deactivated successfully. Aug 19 00:17:01.578331 systemd[1]: session-6.scope: Deactivated successfully. Aug 19 00:17:01.579908 systemd-logind[1491]: Session 6 logged out. Waiting for processes to exit. Aug 19 00:17:01.582055 systemd-logind[1491]: Removed session 6. Aug 19 00:17:01.734320 systemd[1]: Started sshd@7-91.99.87.156:22-139.178.89.65:51198.service - OpenSSH per-connection server daemon (139.178.89.65:51198). Aug 19 00:17:02.755020 sshd[1852]: Accepted publickey for core from 139.178.89.65 port 51198 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4 Aug 19 00:17:02.756978 sshd-session[1852]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Aug 19 00:17:02.762831 systemd-logind[1491]: New session 7 of user core. Aug 19 00:17:02.768495 systemd[1]: Started session-7.scope - Session 7 of User core. Aug 19 00:17:03.279045 sudo[1856]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Aug 19 00:17:03.279736 sudo[1856]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Aug 19 00:17:03.627396 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Aug 19 00:17:03.649819 (dockerd)[1874]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Aug 19 00:17:03.890301 dockerd[1874]: time="2025-08-19T00:17:03.889936979Z" level=info msg="Starting up" Aug 19 00:17:03.892937 dockerd[1874]: time="2025-08-19T00:17:03.892868977Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Aug 19 00:17:03.910048 dockerd[1874]: time="2025-08-19T00:17:03.909959034Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Aug 19 00:17:03.932438 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport3252996390-merged.mount: Deactivated successfully. Aug 19 00:17:03.952106 dockerd[1874]: time="2025-08-19T00:17:03.952026638Z" level=info msg="Loading containers: start." Aug 19 00:17:03.965182 kernel: Initializing XFRM netlink socket Aug 19 00:17:04.249076 systemd-networkd[1412]: docker0: Link UP Aug 19 00:17:04.256230 dockerd[1874]: time="2025-08-19T00:17:04.256111857Z" level=info msg="Loading containers: done." Aug 19 00:17:04.277588 dockerd[1874]: time="2025-08-19T00:17:04.277465606Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Aug 19 00:17:04.277588 dockerd[1874]: time="2025-08-19T00:17:04.277595929Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Aug 19 00:17:04.277913 dockerd[1874]: time="2025-08-19T00:17:04.277806575Z" level=info msg="Initializing buildkit" Aug 19 00:17:04.308994 dockerd[1874]: time="2025-08-19T00:17:04.308922373Z" level=info msg="Completed buildkit initialization" Aug 19 00:17:04.320539 dockerd[1874]: time="2025-08-19T00:17:04.320353787Z" level=info msg="Daemon has completed initialization" Aug 19 00:17:04.320672 dockerd[1874]: time="2025-08-19T00:17:04.320422229Z" level=info msg="API listen on /run/docker.sock" Aug 19 00:17:04.320803 systemd[1]: Started docker.service - Docker Application Container Engine. Aug 19 00:17:05.168026 containerd[1513]: time="2025-08-19T00:17:05.167956188Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\"" Aug 19 00:17:05.783302 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1889004545.mount: Deactivated successfully. 
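Once dockerd logs "API listen on /run/docker.sock" above, the Engine API is reachable over that unix socket; a quick liveness check is GET /_ping, which returns 200 with the body "OK". The sketch below assumes only the socket path from the log plus the standard /_ping endpoint; the "docker" hostname in the URL is a placeholder, since the custom dialer ignores it:

package main

import (
	"context"
	"fmt"
	"io"
	"net"
	"net/http"
	"time"
)

func main() {
	// Route every request to the unix socket dockerd says it listens on.
	transport := &http.Transport{
		DialContext: func(ctx context.Context, _, _ string) (net.Conn, error) {
			d := net.Dialer{Timeout: 2 * time.Second}
			return d.DialContext(ctx, "unix", "/run/docker.sock")
		},
	}
	client := &http.Client{Transport: transport, Timeout: 5 * time.Second}

	resp, err := client.Get("http://docker/_ping")
	if err != nil {
		fmt.Println("daemon not reachable:", err)
		return
	}
	defer resp.Body.Close()
	body, _ := io.ReadAll(resp.Body)
	fmt.Printf("ping: status=%d body=%q\n", resp.StatusCode, body)
}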
Aug 19 00:17:06.746348 containerd[1513]: time="2025-08-19T00:17:06.746257307Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:06.748586 containerd[1513]: time="2025-08-19T00:17:06.748535561Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.4: active requests=0, bytes read=27352705" Aug 19 00:17:06.750778 containerd[1513]: time="2025-08-19T00:17:06.750694292Z" level=info msg="ImageCreate event name:\"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:06.755940 containerd[1513]: time="2025-08-19T00:17:06.755864775Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:06.757648 containerd[1513]: time="2025-08-19T00:17:06.757558055Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.4\" with image id \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.4\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0d441d0d347145b3f02f20cb313239cdae86067643d7f70803fab8bac2d28876\", size \"27349413\" in 1.589538666s" Aug 19 00:17:06.757648 containerd[1513]: time="2025-08-19T00:17:06.757612856Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.4\" returns image reference \"sha256:8dd08b7ae4433dd43482755f08ee0afd6de00c6ece25a8dc5814ebb4b7978e98\"" Aug 19 00:17:06.759669 containerd[1513]: time="2025-08-19T00:17:06.759574503Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\"" Aug 19 00:17:07.908159 containerd[1513]: time="2025-08-19T00:17:07.908051531Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:07.910194 containerd[1513]: time="2025-08-19T00:17:07.910117579Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.4: active requests=0, bytes read=23536997" Aug 19 00:17:07.912086 containerd[1513]: time="2025-08-19T00:17:07.911989862Z" level=info msg="ImageCreate event name:\"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:07.915989 containerd[1513]: time="2025-08-19T00:17:07.915902031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:07.917309 containerd[1513]: time="2025-08-19T00:17:07.917058218Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.4\" with image id \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.4\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:bd22c2af2f30a8f818568b4d5fe131098fdd38267e9e07872cfc33e8f5876bc3\", size \"25093155\" in 1.157407152s" Aug 19 00:17:07.917309 containerd[1513]: time="2025-08-19T00:17:07.917100379Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.4\" returns image reference \"sha256:4e90c11ce4b770c38b26b3401b39c25e9871474a71ecb5eaea72082e21ba587d\"" Aug 19 00:17:07.919013 
containerd[1513]: time="2025-08-19T00:17:07.918836578Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\"" Aug 19 00:17:08.990162 containerd[1513]: time="2025-08-19T00:17:08.989151645Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:08.992223 containerd[1513]: time="2025-08-19T00:17:08.992176431Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.4: active requests=0, bytes read=18292034" Aug 19 00:17:08.993499 containerd[1513]: time="2025-08-19T00:17:08.993457140Z" level=info msg="ImageCreate event name:\"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:08.998975 containerd[1513]: time="2025-08-19T00:17:08.998883860Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:09.000950 containerd[1513]: time="2025-08-19T00:17:09.000835623Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.4\" with image id \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.4\", repo digest \"registry.k8s.io/kube-scheduler@sha256:71533e5a960e2955a54164905e92dac516ec874a23e0bf31304db82650101a4a\", size \"19848210\" in 1.081953443s" Aug 19 00:17:09.000950 containerd[1513]: time="2025-08-19T00:17:09.000899064Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.4\" returns image reference \"sha256:10c245abf58045f1a856bebca4ed8e0abfabe4c0256d5a3f0c475fed70c8ce59\"" Aug 19 00:17:09.001674 containerd[1513]: time="2025-08-19T00:17:09.001623960Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\"" Aug 19 00:17:10.063079 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2272651153.mount: Deactivated successfully. 
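The three "Pulled image" messages above each carry an image size in bytes and the wall-clock time the pull took, so the effective throughput falls out directly. The numbers below are copied verbatim from the log; the calculation treats the logged size as the bytes moved, which ignores compression and unpacking, so it only gives a rough 16-21 MiB/s figure rather than true network bandwidth.

package main

import "fmt"

func main() {
	// (image, size in bytes, pull duration in seconds) taken from the
	// containerd "Pulled image ..." entries above.
	pulls := []struct {
		image   string
		bytes   float64
		seconds float64
	}{
		{"kube-apiserver:v1.33.4", 27349413, 1.589538666},
		{"kube-controller-manager:v1.33.4", 25093155, 1.157407152},
		{"kube-scheduler:v1.33.4", 19848210, 1.081953443},
	}
	const mib = 1024 * 1024
	for _, p := range pulls {
		fmt.Printf("%-35s %6.1f MiB/s\n", p.image, p.bytes/p.seconds/mib)
	}
}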
Aug 19 00:17:10.494064 containerd[1513]: time="2025-08-19T00:17:10.493867389Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:10.496650 containerd[1513]: time="2025-08-19T00:17:10.496539644Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.4: active requests=0, bytes read=28199985" Aug 19 00:17:10.500752 containerd[1513]: time="2025-08-19T00:17:10.499703390Z" level=info msg="ImageCreate event name:\"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:10.504457 containerd[1513]: time="2025-08-19T00:17:10.504305325Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:10.504801 containerd[1513]: time="2025-08-19T00:17:10.504699853Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.4\" with image id \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\", repo tag \"registry.k8s.io/kube-proxy:v1.33.4\", repo digest \"registry.k8s.io/kube-proxy@sha256:bb04e9247da3aaeb96406b4d530a79fc865695b6807353dd1a28871df0d7f837\", size \"28198978\" in 1.503020171s" Aug 19 00:17:10.504878 containerd[1513]: time="2025-08-19T00:17:10.504801215Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.4\" returns image reference \"sha256:e19c0cda155dad39120317830ddb8b2bc22070f2c6a97973e96fb09ef504ee64\"" Aug 19 00:17:10.506007 containerd[1513]: time="2025-08-19T00:17:10.505960999Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Aug 19 00:17:11.025418 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 6. Aug 19 00:17:11.029278 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:17:11.080466 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount896655098.mount: Deactivated successfully. Aug 19 00:17:11.232368 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:17:11.245765 (kubelet)[2177]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Aug 19 00:17:11.322471 kubelet[2177]: E0819 00:17:11.321383 2177 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Aug 19 00:17:11.325733 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Aug 19 00:17:11.325868 systemd[1]: kubelet.service: Failed with result 'exit-code'. Aug 19 00:17:11.327466 systemd[1]: kubelet.service: Consumed 184ms CPU time, 105.1M memory peak. 
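The "Scheduled restart job, restart counter is at 6" entry above means systemd is still cycling kubelet under its Restart= policy while the config file is absent; the counter and last exit status can be read back from the unit at any time. A small sketch shelling out to systemctl, assuming only that systemctl is on PATH as it is on this host:

package main

import (
	"fmt"
	"os/exec"
)

func main() {
	// NRestarts, Result and ExecMainStatus are standard unit properties;
	// `systemctl show` prints them as KEY=VALUE lines.
	out, err := exec.Command("systemctl", "show", "kubelet.service",
		"--property=NRestarts,Result,ExecMainStatus").CombinedOutput()
	if err != nil {
		fmt.Println("systemctl failed:", err)
		return
	}
	fmt.Print(string(out))
}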
Aug 19 00:17:11.777896 containerd[1513]: time="2025-08-19T00:17:11.777846133Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:11.780852 containerd[1513]: time="2025-08-19T00:17:11.780812352Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=19152209" Aug 19 00:17:11.782221 containerd[1513]: time="2025-08-19T00:17:11.782180579Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:11.789104 containerd[1513]: time="2025-08-19T00:17:11.789040716Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:11.791561 containerd[1513]: time="2025-08-19T00:17:11.791491045Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.285247961s" Aug 19 00:17:11.791561 containerd[1513]: time="2025-08-19T00:17:11.791557446Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Aug 19 00:17:11.792498 containerd[1513]: time="2025-08-19T00:17:11.792203779Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Aug 19 00:17:12.323261 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2370676012.mount: Deactivated successfully. 
Aug 19 00:17:12.331217 containerd[1513]: time="2025-08-19T00:17:12.331097764Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 00:17:12.333450 containerd[1513]: time="2025-08-19T00:17:12.333349087Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=268723" Aug 19 00:17:12.334586 containerd[1513]: time="2025-08-19T00:17:12.334519270Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 00:17:12.338113 containerd[1513]: time="2025-08-19T00:17:12.338021257Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Aug 19 00:17:12.338787 containerd[1513]: time="2025-08-19T00:17:12.338642469Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 546.360328ms" Aug 19 00:17:12.338787 containerd[1513]: time="2025-08-19T00:17:12.338677390Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Aug 19 00:17:12.339383 containerd[1513]: time="2025-08-19T00:17:12.339363203Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Aug 19 00:17:12.825450 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2501111475.mount: Deactivated successfully. 
Aug 19 00:17:14.258234 containerd[1513]: time="2025-08-19T00:17:14.258065659Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:14.260556 containerd[1513]: time="2025-08-19T00:17:14.260476023Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=69465339" Aug 19 00:17:14.261221 containerd[1513]: time="2025-08-19T00:17:14.261150835Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:14.264842 containerd[1513]: time="2025-08-19T00:17:14.264763381Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:14.266833 containerd[1513]: time="2025-08-19T00:17:14.266569494Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 1.927170649s" Aug 19 00:17:14.266833 containerd[1513]: time="2025-08-19T00:17:14.266618174Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Aug 19 00:17:20.398746 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:17:20.398956 systemd[1]: kubelet.service: Consumed 184ms CPU time, 105.1M memory peak. Aug 19 00:17:20.402370 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:17:20.440334 systemd[1]: Reload requested from client PID 2311 ('systemctl') (unit session-7.scope)... Aug 19 00:17:20.440357 systemd[1]: Reloading... Aug 19 00:17:20.576170 zram_generator::config[2355]: No configuration found. Aug 19 00:17:20.791357 systemd[1]: Reloading finished in 350 ms. Aug 19 00:17:20.851618 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Aug 19 00:17:20.851900 systemd[1]: kubelet.service: Failed with result 'signal'. Aug 19 00:17:20.852585 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:17:20.852637 systemd[1]: kubelet.service: Consumed 118ms CPU time, 95M memory peak. Aug 19 00:17:20.855680 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:17:21.013815 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:17:21.034695 (kubelet)[2403]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 00:17:21.088880 kubelet[2403]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:17:21.088880 kubelet[2403]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 19 00:17:21.088880 kubelet[2403]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:17:21.088880 kubelet[2403]: I0819 00:17:21.088532 2403 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 00:17:21.554990 kubelet[2403]: I0819 00:17:21.554875 2403 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 19 00:17:21.554990 kubelet[2403]: I0819 00:17:21.554971 2403 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 00:17:21.555626 kubelet[2403]: I0819 00:17:21.555574 2403 server.go:956] "Client rotation is on, will bootstrap in background" Aug 19 00:17:21.597491 kubelet[2403]: E0819 00:17:21.597433 2403 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://91.99.87.156:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 91.99.87.156:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Aug 19 00:17:21.598816 kubelet[2403]: I0819 00:17:21.598624 2403 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 00:17:21.612653 kubelet[2403]: I0819 00:17:21.612617 2403 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 00:17:21.616106 kubelet[2403]: I0819 00:17:21.616028 2403 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Aug 19 00:17:21.618847 kubelet[2403]: I0819 00:17:21.618735 2403 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 00:17:21.619180 kubelet[2403]: I0819 00:17:21.618829 2403 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426-0-0-8-661ee896d9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 00:17:21.619432 kubelet[2403]: I0819 00:17:21.619238 2403 
topology_manager.go:138] "Creating topology manager with none policy" Aug 19 00:17:21.619432 kubelet[2403]: I0819 00:17:21.619263 2403 container_manager_linux.go:303] "Creating device plugin manager" Aug 19 00:17:21.619633 kubelet[2403]: I0819 00:17:21.619561 2403 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:17:21.623754 kubelet[2403]: I0819 00:17:21.623709 2403 kubelet.go:480] "Attempting to sync node with API server" Aug 19 00:17:21.623754 kubelet[2403]: I0819 00:17:21.623746 2403 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 00:17:21.623856 kubelet[2403]: I0819 00:17:21.623775 2403 kubelet.go:386] "Adding apiserver pod source" Aug 19 00:17:21.625695 kubelet[2403]: I0819 00:17:21.625224 2403 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 00:17:21.629017 kubelet[2403]: E0819 00:17:21.628984 2403 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://91.99.87.156:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4426-0-0-8-661ee896d9&limit=500&resourceVersion=0\": dial tcp 91.99.87.156:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Aug 19 00:17:21.629565 kubelet[2403]: I0819 00:17:21.629543 2403 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 00:17:21.630614 kubelet[2403]: I0819 00:17:21.630592 2403 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 19 00:17:21.630827 kubelet[2403]: W0819 00:17:21.630813 2403 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
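The nodeConfig={...} blob in the container_manager_linux.go entry a few lines above is valid JSON, so the eviction thresholds it carries can be pulled out for inspection. The sketch below embeds a trimmed copy (two of the five thresholds); the field names are copied from the dump itself, not taken from kubelet's own source types:

package main

import (
	"encoding/json"
	"fmt"
)

// Field names mirror the keys in the logged dump.
type threshold struct {
	Signal   string
	Operator string
	Value    struct {
		Quantity   *string
		Percentage float64
	}
}

type nodeConfig struct {
	NodeName               string
	HardEvictionThresholds []threshold
}

func main() {
	// Trimmed copy of the nodeConfig dump logged above.
	const dump = `{"NodeName":"ci-4426-0-0-8-661ee896d9","HardEvictionThresholds":[` +
		`{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0}},` +
		`{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1}}]}`

	var cfg nodeConfig
	if err := json.Unmarshal([]byte(dump), &cfg); err != nil {
		panic(err)
	}
	for _, t := range cfg.HardEvictionThresholds {
		if t.Value.Quantity != nil {
			fmt.Printf("%s %s %s\n", t.Signal, t.Operator, *t.Value.Quantity)
		} else {
			fmt.Printf("%s %s %.0f%%\n", t.Signal, t.Operator, t.Value.Percentage*100)
		}
	}
}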
Aug 19 00:17:21.638178 kubelet[2403]: E0819 00:17:21.638086 2403 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://91.99.87.156:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 91.99.87.156:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Aug 19 00:17:21.640195 kubelet[2403]: I0819 00:17:21.638672 2403 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 19 00:17:21.640195 kubelet[2403]: I0819 00:17:21.638730 2403 server.go:1289] "Started kubelet" Aug 19 00:17:21.643939 kubelet[2403]: I0819 00:17:21.643854 2403 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 00:17:21.647996 kubelet[2403]: E0819 00:17:21.646261 2403 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://91.99.87.156:6443/api/v1/namespaces/default/events\": dial tcp 91.99.87.156:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4426-0-0-8-661ee896d9.185d02ebd3d21244 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4426-0-0-8-661ee896d9,UID:ci-4426-0-0-8-661ee896d9,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4426-0-0-8-661ee896d9,},FirstTimestamp:2025-08-19 00:17:21.638691396 +0000 UTC m=+0.597912509,LastTimestamp:2025-08-19 00:17:21.638691396 +0000 UTC m=+0.597912509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4426-0-0-8-661ee896d9,}" Aug 19 00:17:21.650199 kubelet[2403]: I0819 00:17:21.650072 2403 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 00:17:21.651114 kubelet[2403]: I0819 00:17:21.651078 2403 server.go:317] "Adding debug handlers to kubelet server" Aug 19 00:17:21.652863 kubelet[2403]: I0819 00:17:21.652829 2403 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Aug 19 00:17:21.656310 kubelet[2403]: I0819 00:17:21.656054 2403 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 00:17:21.656466 kubelet[2403]: I0819 00:17:21.656437 2403 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 00:17:21.656744 kubelet[2403]: I0819 00:17:21.656712 2403 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 00:17:21.657646 kubelet[2403]: E0819 00:17:21.657622 2403 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426-0-0-8-661ee896d9\" not found" Aug 19 00:17:21.657724 kubelet[2403]: I0819 00:17:21.657665 2403 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 19 00:17:21.658182 kubelet[2403]: I0819 00:17:21.657886 2403 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 19 00:17:21.658182 kubelet[2403]: I0819 00:17:21.657975 2403 reconciler.go:26] "Reconciler: start to sync state" Aug 19 00:17:21.658692 kubelet[2403]: E0819 00:17:21.658651 2403 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://91.99.87.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.99.87.156:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 19 00:17:21.659600 kubelet[2403]: I0819 00:17:21.659379 2403 factory.go:223] Registration of the systemd container factory successfully Aug 19 00:17:21.659600 kubelet[2403]: I0819 00:17:21.659548 2403 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 00:17:21.660338 kubelet[2403]: E0819 00:17:21.660206 2403 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.87.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-0-0-8-661ee896d9?timeout=10s\": dial tcp 91.99.87.156:6443: connect: connection refused" interval="200ms" Aug 19 00:17:21.660502 kubelet[2403]: E0819 00:17:21.660450 2403 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Aug 19 00:17:21.661351 kubelet[2403]: I0819 00:17:21.661323 2403 factory.go:223] Registration of the containerd container factory successfully Aug 19 00:17:21.680549 kubelet[2403]: I0819 00:17:21.680512 2403 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Aug 19 00:17:21.680728 kubelet[2403]: I0819 00:17:21.680716 2403 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 19 00:17:21.680809 kubelet[2403]: I0819 00:17:21.680796 2403 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
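Every "connection refused" above points at the same endpoint, 91.99.87.156:6443: the kubelet is up before the kube-apiserver static pod it is about to create, so these errors are expected to clear once that pod starts serving. A trivial reachability probe for that address (raw TCP only, no TLS or auth):

package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// The apiserver endpoint the kubelet keeps retrying in the entries above.
	conn, err := net.DialTimeout("tcp", "91.99.87.156:6443", 2*time.Second)
	if err != nil {
		// "connection refused" until kube-apiserver is listening.
		fmt.Println("apiserver not reachable yet:", err)
		return
	}
	conn.Close()
	fmt.Println("apiserver port is accepting connections")
}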
Aug 19 00:17:21.680855 kubelet[2403]: I0819 00:17:21.680848 2403 kubelet.go:2436] "Starting kubelet main sync loop" Aug 19 00:17:21.681009 kubelet[2403]: E0819 00:17:21.680987 2403 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 00:17:21.686596 kubelet[2403]: E0819 00:17:21.686481 2403 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://91.99.87.156:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 91.99.87.156:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Aug 19 00:17:21.693343 kubelet[2403]: I0819 00:17:21.693304 2403 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 19 00:17:21.693343 kubelet[2403]: I0819 00:17:21.693326 2403 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 19 00:17:21.693570 kubelet[2403]: I0819 00:17:21.693358 2403 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:17:21.695927 kubelet[2403]: I0819 00:17:21.695889 2403 policy_none.go:49] "None policy: Start" Aug 19 00:17:21.695927 kubelet[2403]: I0819 00:17:21.695933 2403 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 19 00:17:21.696072 kubelet[2403]: I0819 00:17:21.695950 2403 state_mem.go:35] "Initializing new in-memory state store" Aug 19 00:17:21.705742 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Aug 19 00:17:21.726058 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Aug 19 00:17:21.730289 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Aug 19 00:17:21.740985 kubelet[2403]: E0819 00:17:21.740913 2403 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 19 00:17:21.741330 kubelet[2403]: I0819 00:17:21.741189 2403 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 00:17:21.741330 kubelet[2403]: I0819 00:17:21.741209 2403 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 00:17:21.741723 kubelet[2403]: I0819 00:17:21.741687 2403 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 00:17:21.742630 kubelet[2403]: E0819 00:17:21.742596 2403 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Aug 19 00:17:21.742936 kubelet[2403]: E0819 00:17:21.742647 2403 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4426-0-0-8-661ee896d9\" not found" Aug 19 00:17:21.797361 systemd[1]: Created slice kubepods-burstable-podcb812f2df244cf96c0541d3c90293f04.slice - libcontainer container kubepods-burstable-podcb812f2df244cf96c0541d3c90293f04.slice. Aug 19 00:17:21.816967 kubelet[2403]: E0819 00:17:21.816805 2403 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-8-661ee896d9\" not found" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.826775 systemd[1]: Created slice kubepods-burstable-pod326e11fc8851a61617f778ad49dd8810.slice - libcontainer container kubepods-burstable-pod326e11fc8851a61617f778ad49dd8810.slice. 
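The slices created above (kubepods.slice, kubepods-burstable.slice, kubepods-besteffort.slice, then one slice per static pod) are where pod cgroups land, consistent with the earlier dump's CgroupDriver "systemd", CgroupRoot "/" and CgroupVersion 2. A sketch that lists them on the unified hierarchy, assuming the usual /sys/fs/cgroup mount point, which the log itself does not state:

package main

import (
	"fmt"
	"path/filepath"
)

func main() {
	// QoS-class slices directly under kubepods.slice, then per-pod slices
	// below the burstable class; the pod*.slice names match the systemd
	// "Created slice" entries above.
	patterns := []string{
		"/sys/fs/cgroup/kubepods.slice/kubepods-*.slice",
		"/sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod*.slice",
	}
	for _, pattern := range patterns {
		matches, err := filepath.Glob(pattern)
		if err != nil {
			fmt.Println("bad pattern:", err)
			continue
		}
		for _, m := range matches {
			fmt.Println(m)
		}
	}
}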
Aug 19 00:17:21.841650 kubelet[2403]: E0819 00:17:21.841588 2403 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-8-661ee896d9\" not found" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.844593 kubelet[2403]: I0819 00:17:21.844469 2403 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.845914 kubelet[2403]: E0819 00:17:21.845861 2403 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.87.156:6443/api/v1/nodes\": dial tcp 91.99.87.156:6443: connect: connection refused" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.846914 systemd[1]: Created slice kubepods-burstable-pod72f08a9da9924fbf8727c28ab1922188.slice - libcontainer container kubepods-burstable-pod72f08a9da9924fbf8727c28ab1922188.slice. Aug 19 00:17:21.849987 kubelet[2403]: E0819 00:17:21.849925 2403 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-8-661ee896d9\" not found" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.858468 kubelet[2403]: I0819 00:17:21.858345 2403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-flexvolume-dir\") pod \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.858468 kubelet[2403]: I0819 00:17:21.858505 2403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-k8s-certs\") pod \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.858964 kubelet[2403]: I0819 00:17:21.858559 2403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/326e11fc8851a61617f778ad49dd8810-kubeconfig\") pod \"kube-scheduler-ci-4426-0-0-8-661ee896d9\" (UID: \"326e11fc8851a61617f778ad49dd8810\") " pod="kube-system/kube-scheduler-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.858964 kubelet[2403]: I0819 00:17:21.858671 2403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/72f08a9da9924fbf8727c28ab1922188-k8s-certs\") pod \"kube-apiserver-ci-4426-0-0-8-661ee896d9\" (UID: \"72f08a9da9924fbf8727c28ab1922188\") " pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.858964 kubelet[2403]: I0819 00:17:21.858708 2403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-ca-certs\") pod \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.858964 kubelet[2403]: I0819 00:17:21.858741 2403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-kubeconfig\") pod 
\"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.858964 kubelet[2403]: I0819 00:17:21.858770 2403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.859272 kubelet[2403]: I0819 00:17:21.858801 2403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/72f08a9da9924fbf8727c28ab1922188-ca-certs\") pod \"kube-apiserver-ci-4426-0-0-8-661ee896d9\" (UID: \"72f08a9da9924fbf8727c28ab1922188\") " pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.859272 kubelet[2403]: I0819 00:17:21.858830 2403 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/72f08a9da9924fbf8727c28ab1922188-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426-0-0-8-661ee896d9\" (UID: \"72f08a9da9924fbf8727c28ab1922188\") " pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:21.861057 kubelet[2403]: E0819 00:17:21.860978 2403 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.87.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-0-0-8-661ee896d9?timeout=10s\": dial tcp 91.99.87.156:6443: connect: connection refused" interval="400ms" Aug 19 00:17:22.050540 kubelet[2403]: I0819 00:17:22.050105 2403 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:22.050684 kubelet[2403]: E0819 00:17:22.050586 2403 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.87.156:6443/api/v1/nodes\": dial tcp 91.99.87.156:6443: connect: connection refused" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:22.121804 containerd[1513]: time="2025-08-19T00:17:22.121595441Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426-0-0-8-661ee896d9,Uid:cb812f2df244cf96c0541d3c90293f04,Namespace:kube-system,Attempt:0,}" Aug 19 00:17:22.143314 containerd[1513]: time="2025-08-19T00:17:22.143095440Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426-0-0-8-661ee896d9,Uid:326e11fc8851a61617f778ad49dd8810,Namespace:kube-system,Attempt:0,}" Aug 19 00:17:22.160323 containerd[1513]: time="2025-08-19T00:17:22.160217694Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426-0-0-8-661ee896d9,Uid:72f08a9da9924fbf8727c28ab1922188,Namespace:kube-system,Attempt:0,}" Aug 19 00:17:22.165297 containerd[1513]: time="2025-08-19T00:17:22.164853883Z" level=info msg="connecting to shim 08610aac7490edbdcb2cd34a21b303b2be8801bf84c8767f6d652874fe8be92c" address="unix:///run/containerd/s/848f3b9b1bf391bb6279f8a327e45a528f691c3bb379b5ff521a819e9459302e" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:17:22.204718 containerd[1513]: time="2025-08-19T00:17:22.204673194Z" level=info msg="connecting to shim cc3eccd11b77559b95d902ee0d5e18f01ac3a5f6ebae409349527365b125cae7" 
address="unix:///run/containerd/s/410071481cd29778c450f7473d5cf68fce7b3515ccb028452043e30a6bc95b65" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:17:22.207442 systemd[1]: Started cri-containerd-08610aac7490edbdcb2cd34a21b303b2be8801bf84c8767f6d652874fe8be92c.scope - libcontainer container 08610aac7490edbdcb2cd34a21b303b2be8801bf84c8767f6d652874fe8be92c. Aug 19 00:17:22.225428 containerd[1513]: time="2025-08-19T00:17:22.225382581Z" level=info msg="connecting to shim 335c4fa9102d8ddeb96a8c2983990e7c2616e85348c3e2494785df944e322141" address="unix:///run/containerd/s/6eb399a64592f6958f2c66e6c3acae64c2d7700174a86fd1c408a20ebbafca38" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:17:22.255459 systemd[1]: Started cri-containerd-335c4fa9102d8ddeb96a8c2983990e7c2616e85348c3e2494785df944e322141.scope - libcontainer container 335c4fa9102d8ddeb96a8c2983990e7c2616e85348c3e2494785df944e322141. Aug 19 00:17:22.261948 kubelet[2403]: E0819 00:17:22.261897 2403 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://91.99.87.156:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4426-0-0-8-661ee896d9?timeout=10s\": dial tcp 91.99.87.156:6443: connect: connection refused" interval="800ms" Aug 19 00:17:22.266857 systemd[1]: Started cri-containerd-cc3eccd11b77559b95d902ee0d5e18f01ac3a5f6ebae409349527365b125cae7.scope - libcontainer container cc3eccd11b77559b95d902ee0d5e18f01ac3a5f6ebae409349527365b125cae7. Aug 19 00:17:22.276662 containerd[1513]: time="2025-08-19T00:17:22.276248416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4426-0-0-8-661ee896d9,Uid:cb812f2df244cf96c0541d3c90293f04,Namespace:kube-system,Attempt:0,} returns sandbox id \"08610aac7490edbdcb2cd34a21b303b2be8801bf84c8767f6d652874fe8be92c\"" Aug 19 00:17:22.286103 containerd[1513]: time="2025-08-19T00:17:22.286053921Z" level=info msg="CreateContainer within sandbox \"08610aac7490edbdcb2cd34a21b303b2be8801bf84c8767f6d652874fe8be92c\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Aug 19 00:17:22.298731 containerd[1513]: time="2025-08-19T00:17:22.298528986Z" level=info msg="Container cf26c430af29715563799428f9cf4ba4e2a1b5128265ecf315f64d67e5d4a1d2: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:17:22.311286 containerd[1513]: time="2025-08-19T00:17:22.311227615Z" level=info msg="CreateContainer within sandbox \"08610aac7490edbdcb2cd34a21b303b2be8801bf84c8767f6d652874fe8be92c\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"cf26c430af29715563799428f9cf4ba4e2a1b5128265ecf315f64d67e5d4a1d2\"" Aug 19 00:17:22.312535 containerd[1513]: time="2025-08-19T00:17:22.312481473Z" level=info msg="StartContainer for \"cf26c430af29715563799428f9cf4ba4e2a1b5128265ecf315f64d67e5d4a1d2\"" Aug 19 00:17:22.322217 containerd[1513]: time="2025-08-19T00:17:22.322166857Z" level=info msg="connecting to shim cf26c430af29715563799428f9cf4ba4e2a1b5128265ecf315f64d67e5d4a1d2" address="unix:///run/containerd/s/848f3b9b1bf391bb6279f8a327e45a528f691c3bb379b5ff521a819e9459302e" protocol=ttrpc version=3 Aug 19 00:17:22.331162 containerd[1513]: time="2025-08-19T00:17:22.330989188Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4426-0-0-8-661ee896d9,Uid:326e11fc8851a61617f778ad49dd8810,Namespace:kube-system,Attempt:0,} returns sandbox id \"cc3eccd11b77559b95d902ee0d5e18f01ac3a5f6ebae409349527365b125cae7\"" Aug 19 00:17:22.339866 containerd[1513]: time="2025-08-19T00:17:22.339822359Z" 
level=info msg="CreateContainer within sandbox \"cc3eccd11b77559b95d902ee0d5e18f01ac3a5f6ebae409349527365b125cae7\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Aug 19 00:17:22.340965 containerd[1513]: time="2025-08-19T00:17:22.340887175Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4426-0-0-8-661ee896d9,Uid:72f08a9da9924fbf8727c28ab1922188,Namespace:kube-system,Attempt:0,} returns sandbox id \"335c4fa9102d8ddeb96a8c2983990e7c2616e85348c3e2494785df944e322141\"" Aug 19 00:17:22.354185 containerd[1513]: time="2025-08-19T00:17:22.353536082Z" level=info msg="Container 6eeefe68dc1b73d22810ebeff6991fe4cd73df77148e8f5c7cfe05ffe072c777: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:17:22.358388 systemd[1]: Started cri-containerd-cf26c430af29715563799428f9cf4ba4e2a1b5128265ecf315f64d67e5d4a1d2.scope - libcontainer container cf26c430af29715563799428f9cf4ba4e2a1b5128265ecf315f64d67e5d4a1d2. Aug 19 00:17:22.366010 containerd[1513]: time="2025-08-19T00:17:22.365968747Z" level=info msg="CreateContainer within sandbox \"335c4fa9102d8ddeb96a8c2983990e7c2616e85348c3e2494785df944e322141\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Aug 19 00:17:22.377224 containerd[1513]: time="2025-08-19T00:17:22.376856988Z" level=info msg="CreateContainer within sandbox \"cc3eccd11b77559b95d902ee0d5e18f01ac3a5f6ebae409349527365b125cae7\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"6eeefe68dc1b73d22810ebeff6991fe4cd73df77148e8f5c7cfe05ffe072c777\"" Aug 19 00:17:22.381807 containerd[1513]: time="2025-08-19T00:17:22.381352095Z" level=info msg="StartContainer for \"6eeefe68dc1b73d22810ebeff6991fe4cd73df77148e8f5c7cfe05ffe072c777\"" Aug 19 00:17:22.384189 containerd[1513]: time="2025-08-19T00:17:22.384053455Z" level=info msg="connecting to shim 6eeefe68dc1b73d22810ebeff6991fe4cd73df77148e8f5c7cfe05ffe072c777" address="unix:///run/containerd/s/410071481cd29778c450f7473d5cf68fce7b3515ccb028452043e30a6bc95b65" protocol=ttrpc version=3 Aug 19 00:17:22.393256 containerd[1513]: time="2025-08-19T00:17:22.392738144Z" level=info msg="Container 8f32befb35468c62f7a75f45f2d29f2053e6a336ddd7272ea22a091d66fc4e5a: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:17:22.407305 containerd[1513]: time="2025-08-19T00:17:22.407258719Z" level=info msg="CreateContainer within sandbox \"335c4fa9102d8ddeb96a8c2983990e7c2616e85348c3e2494785df944e322141\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"8f32befb35468c62f7a75f45f2d29f2053e6a336ddd7272ea22a091d66fc4e5a\"" Aug 19 00:17:22.410160 containerd[1513]: time="2025-08-19T00:17:22.410049161Z" level=info msg="StartContainer for \"8f32befb35468c62f7a75f45f2d29f2053e6a336ddd7272ea22a091d66fc4e5a\"" Aug 19 00:17:22.411663 containerd[1513]: time="2025-08-19T00:17:22.411617864Z" level=info msg="connecting to shim 8f32befb35468c62f7a75f45f2d29f2053e6a336ddd7272ea22a091d66fc4e5a" address="unix:///run/containerd/s/6eb399a64592f6958f2c66e6c3acae64c2d7700174a86fd1c408a20ebbafca38" protocol=ttrpc version=3 Aug 19 00:17:22.413736 systemd[1]: Started cri-containerd-6eeefe68dc1b73d22810ebeff6991fe4cd73df77148e8f5c7cfe05ffe072c777.scope - libcontainer container 6eeefe68dc1b73d22810ebeff6991fe4cd73df77148e8f5c7cfe05ffe072c777. 
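The three sandboxes being created and started above (kube-controller-manager, kube-scheduler, kube-apiserver) are static pods: the kubelet logged "Adding static pod path" path="/etc/kubernetes/manifests" earlier, and it launches one sandbox per manifest it finds there. A quick listing of that directory, using only the path from the log:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	const dir = "/etc/kubernetes/manifests" // static pod path from the kubelet log
	entries, err := os.ReadDir(dir)
	if err != nil {
		fmt.Println("cannot read static pod dir:", err)
		return
	}
	for _, e := range entries {
		info, err := e.Info()
		if err != nil {
			continue
		}
		fmt.Printf("%-50s %6d bytes\n", filepath.Join(dir, e.Name()), info.Size())
	}
}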
Aug 19 00:17:22.436815 containerd[1513]: time="2025-08-19T00:17:22.436772957Z" level=info msg="StartContainer for \"cf26c430af29715563799428f9cf4ba4e2a1b5128265ecf315f64d67e5d4a1d2\" returns successfully" Aug 19 00:17:22.454412 kubelet[2403]: I0819 00:17:22.454292 2403 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:22.455075 kubelet[2403]: E0819 00:17:22.455014 2403 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://91.99.87.156:6443/api/v1/nodes\": dial tcp 91.99.87.156:6443: connect: connection refused" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:22.461555 systemd[1]: Started cri-containerd-8f32befb35468c62f7a75f45f2d29f2053e6a336ddd7272ea22a091d66fc4e5a.scope - libcontainer container 8f32befb35468c62f7a75f45f2d29f2053e6a336ddd7272ea22a091d66fc4e5a. Aug 19 00:17:22.495910 containerd[1513]: time="2025-08-19T00:17:22.495840674Z" level=info msg="StartContainer for \"6eeefe68dc1b73d22810ebeff6991fe4cd73df77148e8f5c7cfe05ffe072c777\" returns successfully" Aug 19 00:17:22.546558 containerd[1513]: time="2025-08-19T00:17:22.546501185Z" level=info msg="StartContainer for \"8f32befb35468c62f7a75f45f2d29f2053e6a336ddd7272ea22a091d66fc4e5a\" returns successfully" Aug 19 00:17:22.672496 kubelet[2403]: E0819 00:17:22.670897 2403 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://91.99.87.156:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 91.99.87.156:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Aug 19 00:17:22.700688 kubelet[2403]: E0819 00:17:22.700581 2403 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-8-661ee896d9\" not found" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:22.707044 kubelet[2403]: E0819 00:17:22.706911 2403 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-8-661ee896d9\" not found" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:22.709637 kubelet[2403]: E0819 00:17:22.709613 2403 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-8-661ee896d9\" not found" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:23.257736 kubelet[2403]: I0819 00:17:23.257690 2403 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:23.709650 kubelet[2403]: E0819 00:17:23.709492 2403 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-8-661ee896d9\" not found" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:23.711159 kubelet[2403]: E0819 00:17:23.710065 2403 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4426-0-0-8-661ee896d9\" not found" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:25.368870 kubelet[2403]: E0819 00:17:25.368198 2403 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4426-0-0-8-661ee896d9\" not found" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:25.505635 kubelet[2403]: I0819 00:17:25.505435 2403 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:25.505635 kubelet[2403]: E0819 00:17:25.505483 2403 kubelet_node_status.go:548] "Error updating node status, will 
retry" err="error getting node \"ci-4426-0-0-8-661ee896d9\": node \"ci-4426-0-0-8-661ee896d9\" not found" Aug 19 00:17:25.560596 kubelet[2403]: I0819 00:17:25.560007 2403 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:25.572450 kubelet[2403]: E0819 00:17:25.572414 2403 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:25.572643 kubelet[2403]: I0819 00:17:25.572629 2403 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:25.575266 kubelet[2403]: E0819 00:17:25.575205 2403 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4426-0-0-8-661ee896d9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:25.575595 kubelet[2403]: I0819 00:17:25.575452 2403 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:25.580701 kubelet[2403]: E0819 00:17:25.580661 2403 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426-0-0-8-661ee896d9\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:25.639408 kubelet[2403]: I0819 00:17:25.639267 2403 apiserver.go:52] "Watching apiserver" Aug 19 00:17:25.658747 kubelet[2403]: I0819 00:17:25.658681 2403 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 19 00:17:27.487680 systemd[1]: Reload requested from client PID 2680 ('systemctl') (unit session-7.scope)... Aug 19 00:17:27.487707 systemd[1]: Reloading... Aug 19 00:17:27.599257 zram_generator::config[2727]: No configuration found. Aug 19 00:17:27.815044 systemd[1]: Reloading finished in 326 ms. Aug 19 00:17:27.843832 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:17:27.858930 systemd[1]: kubelet.service: Deactivated successfully. Aug 19 00:17:27.859516 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:17:27.859721 systemd[1]: kubelet.service: Consumed 1.054s CPU time, 125.9M memory peak. Aug 19 00:17:27.863321 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Aug 19 00:17:28.024077 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Aug 19 00:17:28.036957 (kubelet)[2769]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Aug 19 00:17:28.093716 kubelet[2769]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:17:28.093716 kubelet[2769]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Aug 19 00:17:28.093716 kubelet[2769]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Aug 19 00:17:28.093716 kubelet[2769]: I0819 00:17:28.093419 2769 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Aug 19 00:17:28.105298 kubelet[2769]: I0819 00:17:28.104354 2769 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Aug 19 00:17:28.105298 kubelet[2769]: I0819 00:17:28.104421 2769 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Aug 19 00:17:28.105298 kubelet[2769]: I0819 00:17:28.104822 2769 server.go:956] "Client rotation is on, will bootstrap in background" Aug 19 00:17:28.106803 kubelet[2769]: I0819 00:17:28.106744 2769 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Aug 19 00:17:28.115669 kubelet[2769]: I0819 00:17:28.115624 2769 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Aug 19 00:17:28.126096 kubelet[2769]: I0819 00:17:28.126067 2769 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Aug 19 00:17:28.129631 kubelet[2769]: I0819 00:17:28.129566 2769 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Aug 19 00:17:28.129945 kubelet[2769]: I0819 00:17:28.129800 2769 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Aug 19 00:17:28.129993 kubelet[2769]: I0819 00:17:28.129829 2769 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4426-0-0-8-661ee896d9","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Aug 19 00:17:28.130322 kubelet[2769]: I0819 00:17:28.129997 2769 topology_manager.go:138] "Creating topology manager with none policy" Aug 19 00:17:28.130322 kubelet[2769]: I0819 00:17:28.130010 2769 container_manager_linux.go:303] "Creating device plugin manager" Aug 19 00:17:28.130322 kubelet[2769]: I0819 00:17:28.130055 
2769 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:17:28.130322 kubelet[2769]: I0819 00:17:28.130269 2769 kubelet.go:480] "Attempting to sync node with API server" Aug 19 00:17:28.130322 kubelet[2769]: I0819 00:17:28.130282 2769 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Aug 19 00:17:28.131229 kubelet[2769]: I0819 00:17:28.131005 2769 kubelet.go:386] "Adding apiserver pod source" Aug 19 00:17:28.131229 kubelet[2769]: I0819 00:17:28.131056 2769 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Aug 19 00:17:28.135508 kubelet[2769]: I0819 00:17:28.135385 2769 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.5" apiVersion="v1" Aug 19 00:17:28.137234 kubelet[2769]: I0819 00:17:28.136775 2769 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Aug 19 00:17:28.144161 kubelet[2769]: I0819 00:17:28.142677 2769 watchdog_linux.go:99] "Systemd watchdog is not enabled" Aug 19 00:17:28.144161 kubelet[2769]: I0819 00:17:28.142741 2769 server.go:1289] "Started kubelet" Aug 19 00:17:28.144161 kubelet[2769]: I0819 00:17:28.143554 2769 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Aug 19 00:17:28.144161 kubelet[2769]: I0819 00:17:28.143825 2769 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Aug 19 00:17:28.144161 kubelet[2769]: I0819 00:17:28.143872 2769 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Aug 19 00:17:28.145493 kubelet[2769]: I0819 00:17:28.145452 2769 server.go:317] "Adding debug handlers to kubelet server" Aug 19 00:17:28.147215 kubelet[2769]: I0819 00:17:28.147178 2769 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Aug 19 00:17:28.151185 kubelet[2769]: I0819 00:17:28.150248 2769 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Aug 19 00:17:28.158594 kubelet[2769]: I0819 00:17:28.158546 2769 volume_manager.go:297] "Starting Kubelet Volume Manager" Aug 19 00:17:28.159360 kubelet[2769]: E0819 00:17:28.158821 2769 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4426-0-0-8-661ee896d9\" not found" Aug 19 00:17:28.161476 kubelet[2769]: I0819 00:17:28.160981 2769 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Aug 19 00:17:28.161476 kubelet[2769]: I0819 00:17:28.161173 2769 reconciler.go:26] "Reconciler: start to sync state" Aug 19 00:17:28.166524 kubelet[2769]: I0819 00:17:28.166243 2769 factory.go:223] Registration of the systemd container factory successfully Aug 19 00:17:28.166524 kubelet[2769]: I0819 00:17:28.166373 2769 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Aug 19 00:17:28.179170 kubelet[2769]: I0819 00:17:28.178313 2769 factory.go:223] Registration of the containerd container factory successfully Aug 19 00:17:28.209703 kubelet[2769]: I0819 00:17:28.209635 2769 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Aug 19 00:17:28.212329 kubelet[2769]: I0819 00:17:28.212291 2769 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Aug 19 00:17:28.212329 kubelet[2769]: I0819 00:17:28.212321 2769 status_manager.go:230] "Starting to sync pod status with apiserver" Aug 19 00:17:28.212502 kubelet[2769]: I0819 00:17:28.212343 2769 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Aug 19 00:17:28.212502 kubelet[2769]: I0819 00:17:28.212350 2769 kubelet.go:2436] "Starting kubelet main sync loop" Aug 19 00:17:28.212502 kubelet[2769]: E0819 00:17:28.212391 2769 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Aug 19 00:17:28.261888 kubelet[2769]: I0819 00:17:28.261858 2769 cpu_manager.go:221] "Starting CPU manager" policy="none" Aug 19 00:17:28.262054 kubelet[2769]: I0819 00:17:28.262039 2769 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Aug 19 00:17:28.262114 kubelet[2769]: I0819 00:17:28.262106 2769 state_mem.go:36] "Initialized new in-memory state store" Aug 19 00:17:28.262371 kubelet[2769]: I0819 00:17:28.262352 2769 state_mem.go:88] "Updated default CPUSet" cpuSet="" Aug 19 00:17:28.262476 kubelet[2769]: I0819 00:17:28.262435 2769 state_mem.go:96] "Updated CPUSet assignments" assignments={} Aug 19 00:17:28.262538 kubelet[2769]: I0819 00:17:28.262529 2769 policy_none.go:49] "None policy: Start" Aug 19 00:17:28.262591 kubelet[2769]: I0819 00:17:28.262582 2769 memory_manager.go:186] "Starting memorymanager" policy="None" Aug 19 00:17:28.262645 kubelet[2769]: I0819 00:17:28.262638 2769 state_mem.go:35] "Initializing new in-memory state store" Aug 19 00:17:28.262793 kubelet[2769]: I0819 00:17:28.262781 2769 state_mem.go:75] "Updated machine memory state" Aug 19 00:17:28.268995 kubelet[2769]: E0819 00:17:28.268934 2769 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Aug 19 00:17:28.269190 kubelet[2769]: I0819 00:17:28.269159 2769 eviction_manager.go:189] "Eviction manager: starting control loop" Aug 19 00:17:28.269251 kubelet[2769]: I0819 00:17:28.269180 2769 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Aug 19 00:17:28.270178 kubelet[2769]: I0819 00:17:28.269884 2769 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Aug 19 00:17:28.271792 kubelet[2769]: E0819 00:17:28.271556 2769 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Aug 19 00:17:28.313863 kubelet[2769]: I0819 00:17:28.313823 2769 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.315114 kubelet[2769]: I0819 00:17:28.314665 2769 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.315636 kubelet[2769]: I0819 00:17:28.315375 2769 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.386397 kubelet[2769]: I0819 00:17:28.386077 2769 kubelet_node_status.go:75] "Attempting to register node" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.399468 kubelet[2769]: I0819 00:17:28.399420 2769 kubelet_node_status.go:124] "Node was previously registered" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.399641 kubelet[2769]: I0819 00:17:28.399536 2769 kubelet_node_status.go:78] "Successfully registered node" node="ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.463144 kubelet[2769]: I0819 00:17:28.462788 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/326e11fc8851a61617f778ad49dd8810-kubeconfig\") pod \"kube-scheduler-ci-4426-0-0-8-661ee896d9\" (UID: \"326e11fc8851a61617f778ad49dd8810\") " pod="kube-system/kube-scheduler-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.463144 kubelet[2769]: I0819 00:17:28.462942 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-ca-certs\") pod \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.463144 kubelet[2769]: I0819 00:17:28.462985 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-k8s-certs\") pod \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.464008 kubelet[2769]: I0819 00:17:28.463698 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-kubeconfig\") pod \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.464008 kubelet[2769]: I0819 00:17:28.463858 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.464008 kubelet[2769]: I0819 00:17:28.463887 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/72f08a9da9924fbf8727c28ab1922188-ca-certs\") pod \"kube-apiserver-ci-4426-0-0-8-661ee896d9\" (UID: 
\"72f08a9da9924fbf8727c28ab1922188\") " pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.464008 kubelet[2769]: I0819 00:17:28.463920 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/72f08a9da9924fbf8727c28ab1922188-k8s-certs\") pod \"kube-apiserver-ci-4426-0-0-8-661ee896d9\" (UID: \"72f08a9da9924fbf8727c28ab1922188\") " pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.464008 kubelet[2769]: I0819 00:17:28.463936 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/72f08a9da9924fbf8727c28ab1922188-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4426-0-0-8-661ee896d9\" (UID: \"72f08a9da9924fbf8727c28ab1922188\") " pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:28.464293 kubelet[2769]: I0819 00:17:28.463955 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/cb812f2df244cf96c0541d3c90293f04-flexvolume-dir\") pod \"kube-controller-manager-ci-4426-0-0-8-661ee896d9\" (UID: \"cb812f2df244cf96c0541d3c90293f04\") " pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:29.133233 kubelet[2769]: I0819 00:17:29.133118 2769 apiserver.go:52] "Watching apiserver" Aug 19 00:17:29.162151 kubelet[2769]: I0819 00:17:29.162096 2769 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Aug 19 00:17:29.174405 kubelet[2769]: I0819 00:17:29.174324 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4426-0-0-8-661ee896d9" podStartSLOduration=1.1742894449999999 podStartE2EDuration="1.174289445s" podCreationTimestamp="2025-08-19 00:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:17:29.171677811 +0000 UTC m=+1.128059915" watchObservedRunningTime="2025-08-19 00:17:29.174289445 +0000 UTC m=+1.130671549" Aug 19 00:17:29.191726 kubelet[2769]: I0819 00:17:29.191574 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4426-0-0-8-661ee896d9" podStartSLOduration=1.191379948 podStartE2EDuration="1.191379948s" podCreationTimestamp="2025-08-19 00:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:17:29.191292386 +0000 UTC m=+1.147674490" watchObservedRunningTime="2025-08-19 00:17:29.191379948 +0000 UTC m=+1.147762092" Aug 19 00:17:29.221154 kubelet[2769]: I0819 00:17:29.220463 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" podStartSLOduration=1.220438207 podStartE2EDuration="1.220438207s" podCreationTimestamp="2025-08-19 00:17:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:17:29.205695374 +0000 UTC m=+1.162077478" watchObservedRunningTime="2025-08-19 00:17:29.220438207 +0000 UTC m=+1.176820311" Aug 19 00:17:29.238736 kubelet[2769]: I0819 00:17:29.238681 2769 kubelet.go:3309] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:29.254118 kubelet[2769]: E0819 00:17:29.253568 2769 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4426-0-0-8-661ee896d9\" already exists" pod="kube-system/kube-apiserver-ci-4426-0-0-8-661ee896d9" Aug 19 00:17:33.233813 kubelet[2769]: I0819 00:17:33.233651 2769 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Aug 19 00:17:33.234312 containerd[1513]: time="2025-08-19T00:17:33.234198099Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Aug 19 00:17:33.234593 kubelet[2769]: I0819 00:17:33.234502 2769 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Aug 19 00:17:34.423284 systemd[1]: Created slice kubepods-besteffort-podc11aaae3_0f23_46e7_a1a6_e478b114e746.slice - libcontainer container kubepods-besteffort-podc11aaae3_0f23_46e7_a1a6_e478b114e746.slice. Aug 19 00:17:34.485844 systemd[1]: Created slice kubepods-besteffort-pod66174b64_b5d9_40d5_bd40_174fa668aefe.slice - libcontainer container kubepods-besteffort-pod66174b64_b5d9_40d5_bd40_174fa668aefe.slice. Aug 19 00:17:34.505685 kubelet[2769]: I0819 00:17:34.505421 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/66174b64-b5d9-40d5-bd40-174fa668aefe-kube-proxy\") pod \"kube-proxy-rcxdh\" (UID: \"66174b64-b5d9-40d5-bd40-174fa668aefe\") " pod="kube-system/kube-proxy-rcxdh" Aug 19 00:17:34.505685 kubelet[2769]: I0819 00:17:34.505468 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66174b64-b5d9-40d5-bd40-174fa668aefe-lib-modules\") pod \"kube-proxy-rcxdh\" (UID: \"66174b64-b5d9-40d5-bd40-174fa668aefe\") " pod="kube-system/kube-proxy-rcxdh" Aug 19 00:17:34.505685 kubelet[2769]: I0819 00:17:34.505492 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fnkw\" (UniqueName: \"kubernetes.io/projected/66174b64-b5d9-40d5-bd40-174fa668aefe-kube-api-access-6fnkw\") pod \"kube-proxy-rcxdh\" (UID: \"66174b64-b5d9-40d5-bd40-174fa668aefe\") " pod="kube-system/kube-proxy-rcxdh" Aug 19 00:17:34.505685 kubelet[2769]: I0819 00:17:34.505515 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/c11aaae3-0f23-46e7-a1a6-e478b114e746-var-lib-calico\") pod \"tigera-operator-747864d56d-w4tcs\" (UID: \"c11aaae3-0f23-46e7-a1a6-e478b114e746\") " pod="tigera-operator/tigera-operator-747864d56d-w4tcs" Aug 19 00:17:34.505685 kubelet[2769]: I0819 00:17:34.505534 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/66174b64-b5d9-40d5-bd40-174fa668aefe-xtables-lock\") pod \"kube-proxy-rcxdh\" (UID: \"66174b64-b5d9-40d5-bd40-174fa668aefe\") " pod="kube-system/kube-proxy-rcxdh" Aug 19 00:17:34.506223 kubelet[2769]: I0819 00:17:34.505550 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f4tq\" (UniqueName: \"kubernetes.io/projected/c11aaae3-0f23-46e7-a1a6-e478b114e746-kube-api-access-4f4tq\") pod \"tigera-operator-747864d56d-w4tcs\" (UID: \"c11aaae3-0f23-46e7-a1a6-e478b114e746\") " 
pod="tigera-operator/tigera-operator-747864d56d-w4tcs" Aug 19 00:17:34.738195 containerd[1513]: time="2025-08-19T00:17:34.737967044Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-w4tcs,Uid:c11aaae3-0f23-46e7-a1a6-e478b114e746,Namespace:tigera-operator,Attempt:0,}" Aug 19 00:17:34.766282 containerd[1513]: time="2025-08-19T00:17:34.766211388Z" level=info msg="connecting to shim 3f44c93945524ce12883c5273f833a94e08a454a9b0271c0e1896587b9654fcf" address="unix:///run/containerd/s/b6308fb29cdb1880db778d33982296f960d3e59d2b6c7bee34d053abe461f498" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:17:34.790655 containerd[1513]: time="2025-08-19T00:17:34.790414883Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rcxdh,Uid:66174b64-b5d9-40d5-bd40-174fa668aefe,Namespace:kube-system,Attempt:0,}" Aug 19 00:17:34.796441 systemd[1]: Started cri-containerd-3f44c93945524ce12883c5273f833a94e08a454a9b0271c0e1896587b9654fcf.scope - libcontainer container 3f44c93945524ce12883c5273f833a94e08a454a9b0271c0e1896587b9654fcf. Aug 19 00:17:34.832294 containerd[1513]: time="2025-08-19T00:17:34.832027870Z" level=info msg="connecting to shim 5c8f835e26371b65996feb74745ff4da991ecd7bf3ffdeb3b56979e8e2b657e2" address="unix:///run/containerd/s/e857365067f7ef9d9bc1810d0630c23408d4764720edca4c43c4bee02ffbcc9f" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:17:34.866906 systemd[1]: Started cri-containerd-5c8f835e26371b65996feb74745ff4da991ecd7bf3ffdeb3b56979e8e2b657e2.scope - libcontainer container 5c8f835e26371b65996feb74745ff4da991ecd7bf3ffdeb3b56979e8e2b657e2. Aug 19 00:17:34.875770 containerd[1513]: time="2025-08-19T00:17:34.875709562Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-747864d56d-w4tcs,Uid:c11aaae3-0f23-46e7-a1a6-e478b114e746,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3f44c93945524ce12883c5273f833a94e08a454a9b0271c0e1896587b9654fcf\"" Aug 19 00:17:34.879087 containerd[1513]: time="2025-08-19T00:17:34.879006762Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\"" Aug 19 00:17:34.910201 containerd[1513]: time="2025-08-19T00:17:34.910121901Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-rcxdh,Uid:66174b64-b5d9-40d5-bd40-174fa668aefe,Namespace:kube-system,Attempt:0,} returns sandbox id \"5c8f835e26371b65996feb74745ff4da991ecd7bf3ffdeb3b56979e8e2b657e2\"" Aug 19 00:17:34.918912 containerd[1513]: time="2025-08-19T00:17:34.918845487Z" level=info msg="CreateContainer within sandbox \"5c8f835e26371b65996feb74745ff4da991ecd7bf3ffdeb3b56979e8e2b657e2\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Aug 19 00:17:34.933841 containerd[1513]: time="2025-08-19T00:17:34.933758908Z" level=info msg="Container ac8ba4043a8b78cc95ef40e2f24db58ef66a4261d3cbe5ff2dab748f9d8749a9: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:17:34.943891 containerd[1513]: time="2025-08-19T00:17:34.943799631Z" level=info msg="CreateContainer within sandbox \"5c8f835e26371b65996feb74745ff4da991ecd7bf3ffdeb3b56979e8e2b657e2\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"ac8ba4043a8b78cc95ef40e2f24db58ef66a4261d3cbe5ff2dab748f9d8749a9\"" Aug 19 00:17:34.944637 containerd[1513]: time="2025-08-19T00:17:34.944570440Z" level=info msg="StartContainer for \"ac8ba4043a8b78cc95ef40e2f24db58ef66a4261d3cbe5ff2dab748f9d8749a9\"" Aug 19 00:17:34.949464 containerd[1513]: time="2025-08-19T00:17:34.949355018Z" level=info msg="connecting to shim 
ac8ba4043a8b78cc95ef40e2f24db58ef66a4261d3cbe5ff2dab748f9d8749a9" address="unix:///run/containerd/s/e857365067f7ef9d9bc1810d0630c23408d4764720edca4c43c4bee02ffbcc9f" protocol=ttrpc version=3 Aug 19 00:17:34.976512 systemd[1]: Started cri-containerd-ac8ba4043a8b78cc95ef40e2f24db58ef66a4261d3cbe5ff2dab748f9d8749a9.scope - libcontainer container ac8ba4043a8b78cc95ef40e2f24db58ef66a4261d3cbe5ff2dab748f9d8749a9. Aug 19 00:17:35.028182 containerd[1513]: time="2025-08-19T00:17:35.026630876Z" level=info msg="StartContainer for \"ac8ba4043a8b78cc95ef40e2f24db58ef66a4261d3cbe5ff2dab748f9d8749a9\" returns successfully" Aug 19 00:17:35.797626 kubelet[2769]: I0819 00:17:35.797507 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-rcxdh" podStartSLOduration=1.7972611889999999 podStartE2EDuration="1.797261189s" podCreationTimestamp="2025-08-19 00:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:17:35.273927971 +0000 UTC m=+7.230310115" watchObservedRunningTime="2025-08-19 00:17:35.797261189 +0000 UTC m=+7.753643293" Aug 19 00:17:36.833994 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2769847912.mount: Deactivated successfully. Aug 19 00:17:37.317363 containerd[1513]: time="2025-08-19T00:17:37.317256865Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:37.319090 containerd[1513]: time="2025-08-19T00:17:37.318812883Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.3: active requests=0, bytes read=22150610" Aug 19 00:17:37.320519 containerd[1513]: time="2025-08-19T00:17:37.320457462Z" level=info msg="ImageCreate event name:\"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:37.325789 containerd[1513]: time="2025-08-19T00:17:37.325726324Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:37.326675 containerd[1513]: time="2025-08-19T00:17:37.326601855Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.3\" with image id \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\", repo tag \"quay.io/tigera/operator:v1.38.3\", repo digest \"quay.io/tigera/operator@sha256:dbf1bad0def7b5955dc8e4aeee96e23ead0bc5822f6872518e685cd0ed484121\", size \"22146605\" in 2.447542092s" Aug 19 00:17:37.326675 containerd[1513]: time="2025-08-19T00:17:37.326648695Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.3\" returns image reference \"sha256:7f8a5b1dba618e907d5f7804e42b3bd7cd5766bc3b0a66da25ff2c687e356bb0\"" Aug 19 00:17:37.337467 containerd[1513]: time="2025-08-19T00:17:37.337395262Z" level=info msg="CreateContainer within sandbox \"3f44c93945524ce12883c5273f833a94e08a454a9b0271c0e1896587b9654fcf\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Aug 19 00:17:37.352876 containerd[1513]: time="2025-08-19T00:17:37.352128355Z" level=info msg="Container c494ecba111cbb37f061c98d637b3ff92fa01c46b38c6f297ee2fe2f25e47306: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:17:37.362552 containerd[1513]: time="2025-08-19T00:17:37.362504317Z" level=info msg="CreateContainer within sandbox 
\"3f44c93945524ce12883c5273f833a94e08a454a9b0271c0e1896587b9654fcf\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"c494ecba111cbb37f061c98d637b3ff92fa01c46b38c6f297ee2fe2f25e47306\"" Aug 19 00:17:37.364513 containerd[1513]: time="2025-08-19T00:17:37.364454060Z" level=info msg="StartContainer for \"c494ecba111cbb37f061c98d637b3ff92fa01c46b38c6f297ee2fe2f25e47306\"" Aug 19 00:17:37.368400 containerd[1513]: time="2025-08-19T00:17:37.368241985Z" level=info msg="connecting to shim c494ecba111cbb37f061c98d637b3ff92fa01c46b38c6f297ee2fe2f25e47306" address="unix:///run/containerd/s/b6308fb29cdb1880db778d33982296f960d3e59d2b6c7bee34d053abe461f498" protocol=ttrpc version=3 Aug 19 00:17:37.399534 systemd[1]: Started cri-containerd-c494ecba111cbb37f061c98d637b3ff92fa01c46b38c6f297ee2fe2f25e47306.scope - libcontainer container c494ecba111cbb37f061c98d637b3ff92fa01c46b38c6f297ee2fe2f25e47306. Aug 19 00:17:37.445974 containerd[1513]: time="2025-08-19T00:17:37.445503735Z" level=info msg="StartContainer for \"c494ecba111cbb37f061c98d637b3ff92fa01c46b38c6f297ee2fe2f25e47306\" returns successfully" Aug 19 00:17:40.682562 kubelet[2769]: I0819 00:17:40.682494 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-747864d56d-w4tcs" podStartSLOduration=4.231132504 podStartE2EDuration="6.682478761s" podCreationTimestamp="2025-08-19 00:17:34 +0000 UTC" firstStartedPulling="2025-08-19 00:17:34.878696718 +0000 UTC m=+6.835078822" lastFinishedPulling="2025-08-19 00:17:37.330042975 +0000 UTC m=+9.286425079" observedRunningTime="2025-08-19 00:17:38.292764477 +0000 UTC m=+10.249146621" watchObservedRunningTime="2025-08-19 00:17:40.682478761 +0000 UTC m=+12.638860865" Aug 19 00:17:43.973426 sudo[1856]: pam_unix(sudo:session): session closed for user root Aug 19 00:17:44.134735 sshd[1855]: Connection closed by 139.178.89.65 port 51198 Aug 19 00:17:44.134630 sshd-session[1852]: pam_unix(sshd:session): session closed for user core Aug 19 00:17:44.147011 systemd[1]: sshd@7-91.99.87.156:22-139.178.89.65:51198.service: Deactivated successfully. Aug 19 00:17:44.149910 systemd[1]: session-7.scope: Deactivated successfully. Aug 19 00:17:44.150166 systemd[1]: session-7.scope: Consumed 8.387s CPU time, 222.9M memory peak. Aug 19 00:17:44.151502 systemd-logind[1491]: Session 7 logged out. Waiting for processes to exit. Aug 19 00:17:44.153658 systemd-logind[1491]: Removed session 7. Aug 19 00:17:51.742299 systemd[1]: Created slice kubepods-besteffort-pode16290fe_93a5_4be4_8c7a_afd612b13123.slice - libcontainer container kubepods-besteffort-pode16290fe_93a5_4be4_8c7a_afd612b13123.slice. 
Aug 19 00:17:51.821655 kubelet[2769]: I0819 00:17:51.821594 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e16290fe-93a5-4be4-8c7a-afd612b13123-tigera-ca-bundle\") pod \"calico-typha-7c665f78b7-2n2hg\" (UID: \"e16290fe-93a5-4be4-8c7a-afd612b13123\") " pod="calico-system/calico-typha-7c665f78b7-2n2hg" Aug 19 00:17:51.822888 kubelet[2769]: I0819 00:17:51.821654 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbdld\" (UniqueName: \"kubernetes.io/projected/e16290fe-93a5-4be4-8c7a-afd612b13123-kube-api-access-jbdld\") pod \"calico-typha-7c665f78b7-2n2hg\" (UID: \"e16290fe-93a5-4be4-8c7a-afd612b13123\") " pod="calico-system/calico-typha-7c665f78b7-2n2hg" Aug 19 00:17:51.822888 kubelet[2769]: I0819 00:17:51.821710 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e16290fe-93a5-4be4-8c7a-afd612b13123-typha-certs\") pod \"calico-typha-7c665f78b7-2n2hg\" (UID: \"e16290fe-93a5-4be4-8c7a-afd612b13123\") " pod="calico-system/calico-typha-7c665f78b7-2n2hg" Aug 19 00:17:51.892187 systemd[1]: Created slice kubepods-besteffort-pod3e73fe59_04cf_4fe7_9888_7f3bfe4a14d1.slice - libcontainer container kubepods-besteffort-pod3e73fe59_04cf_4fe7_9888_7f3bfe4a14d1.slice. Aug 19 00:17:51.922621 kubelet[2769]: I0819 00:17:51.922559 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-cni-log-dir\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922621 kubelet[2769]: I0819 00:17:51.922616 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-var-run-calico\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922857 kubelet[2769]: I0819 00:17:51.922635 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-var-lib-calico\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922857 kubelet[2769]: I0819 00:17:51.922658 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-flexvol-driver-host\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922857 kubelet[2769]: I0819 00:17:51.922697 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-cni-net-dir\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922857 kubelet[2769]: I0819 00:17:51.922711 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-tigera-ca-bundle\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922857 kubelet[2769]: I0819 00:17:51.922728 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-xtables-lock\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922973 kubelet[2769]: I0819 00:17:51.922742 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-node-certs\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922973 kubelet[2769]: I0819 00:17:51.922769 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2xrb\" (UniqueName: \"kubernetes.io/projected/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-kube-api-access-z2xrb\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922973 kubelet[2769]: I0819 00:17:51.922786 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-policysync\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922973 kubelet[2769]: I0819 00:17:51.922813 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-cni-bin-dir\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:51.922973 kubelet[2769]: I0819 00:17:51.922829 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1-lib-modules\") pod \"calico-node-wwv6v\" (UID: \"3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1\") " pod="calico-system/calico-node-wwv6v" Aug 19 00:17:52.025537 kubelet[2769]: E0819 00:17:52.025498 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.025537 kubelet[2769]: W0819 00:17:52.025529 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.025690 kubelet[2769]: E0819 00:17:52.025562 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.025748 kubelet[2769]: E0819 00:17:52.025717 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.025748 kubelet[2769]: W0819 00:17:52.025730 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.025806 kubelet[2769]: E0819 00:17:52.025751 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.025895 kubelet[2769]: E0819 00:17:52.025877 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.025895 kubelet[2769]: W0819 00:17:52.025890 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.025944 kubelet[2769]: E0819 00:17:52.025899 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.026186 kubelet[2769]: E0819 00:17:52.026168 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.026186 kubelet[2769]: W0819 00:17:52.026185 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.026273 kubelet[2769]: E0819 00:17:52.026196 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.026432 kubelet[2769]: E0819 00:17:52.026414 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.026432 kubelet[2769]: W0819 00:17:52.026428 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.026515 kubelet[2769]: E0819 00:17:52.026438 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.027024 kubelet[2769]: E0819 00:17:52.026897 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.027024 kubelet[2769]: W0819 00:17:52.026918 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.027024 kubelet[2769]: E0819 00:17:52.026935 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.027273 kubelet[2769]: E0819 00:17:52.027248 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.027341 kubelet[2769]: W0819 00:17:52.027331 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.027395 kubelet[2769]: E0819 00:17:52.027384 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.029221 kubelet[2769]: E0819 00:17:52.029193 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.029435 kubelet[2769]: W0819 00:17:52.029320 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.029435 kubelet[2769]: E0819 00:17:52.029346 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.030164 kubelet[2769]: E0819 00:17:52.029780 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.030276 kubelet[2769]: W0819 00:17:52.030254 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.030336 kubelet[2769]: E0819 00:17:52.030325 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.031437 kubelet[2769]: E0819 00:17:52.031419 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.031679 kubelet[2769]: W0819 00:17:52.031576 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.031679 kubelet[2769]: E0819 00:17:52.031600 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.035183 kubelet[2769]: E0819 00:17:52.034012 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.035183 kubelet[2769]: W0819 00:17:52.034038 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.035183 kubelet[2769]: E0819 00:17:52.034058 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.040521 kubelet[2769]: E0819 00:17:52.040284 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.040521 kubelet[2769]: W0819 00:17:52.040311 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.040521 kubelet[2769]: E0819 00:17:52.040334 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.042465 kubelet[2769]: E0819 00:17:52.042360 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.042465 kubelet[2769]: W0819 00:17:52.042385 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.042465 kubelet[2769]: E0819 00:17:52.042414 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.048577 containerd[1513]: time="2025-08-19T00:17:52.048533384Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c665f78b7-2n2hg,Uid:e16290fe-93a5-4be4-8c7a-afd612b13123,Namespace:calico-system,Attempt:0,}" Aug 19 00:17:52.065174 kubelet[2769]: E0819 00:17:52.064346 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.065464 kubelet[2769]: W0819 00:17:52.065352 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.065464 kubelet[2769]: E0819 00:17:52.065394 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.098394 containerd[1513]: time="2025-08-19T00:17:52.097436682Z" level=info msg="connecting to shim 8d0dcf85a478667e1d312899e5c6b2034aaf95a6541eb2bb56a5f00406d11ec8" address="unix:///run/containerd/s/c32d46ecd096de946424ac665bda71bfa3573be7b1d9d0b2ac8f91223353ae94" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:17:52.139413 systemd[1]: Started cri-containerd-8d0dcf85a478667e1d312899e5c6b2034aaf95a6541eb2bb56a5f00406d11ec8.scope - libcontainer container 8d0dcf85a478667e1d312899e5c6b2034aaf95a6541eb2bb56a5f00406d11ec8. 
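The repeated FlexVolume probe failures above come from the kubelet invoking /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the init argument and decoding its stdout as JSON; the binary is not present yet (presumably because Calico's flexvol-driver init container has not run), so the output is empty and decoding fails. A small Go sketch of that failure mode (illustrative only; the driverStatus shape below is an assumption, not kubelet's exact type):

package main

import (
	"encoding/json"
	"fmt"
)

// Hypothetical, simplified shape of a FlexVolume driver's init response.
type driverStatus struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	var st driverStatus

	// Missing driver binary: nothing on stdout, so the probe decodes "".
	err := json.Unmarshal([]byte(""), &st)
	fmt.Println(err) // "unexpected end of JSON input", as in the log above

	// An installed driver would typically answer something like this,
	// which decodes cleanly and ends the probe errors.
	ok := `{"status":"Success","capabilities":{"attach":false}}`
	if err := json.Unmarshal([]byte(ok), &st); err == nil {
		fmt.Printf("%+v\n", st)
	}
}

Once the calico-node flexvol-driver init container installs the uds binary into that directory, these probe errors should stop.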
Aug 19 00:17:52.191034 kubelet[2769]: E0819 00:17:52.190975 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tc6qm" podUID="f3224d19-d7a0-4294-80f2-5149b99ec962" Aug 19 00:17:52.199640 containerd[1513]: time="2025-08-19T00:17:52.199533899Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wwv6v,Uid:3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1,Namespace:calico-system,Attempt:0,}" Aug 19 00:17:52.208977 kubelet[2769]: E0819 00:17:52.208870 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.208977 kubelet[2769]: W0819 00:17:52.208896 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.209739 kubelet[2769]: E0819 00:17:52.209195 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.211366 kubelet[2769]: E0819 00:17:52.211303 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.211601 kubelet[2769]: W0819 00:17:52.211326 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.211601 kubelet[2769]: E0819 00:17:52.211544 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.212103 kubelet[2769]: E0819 00:17:52.212087 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.212362 kubelet[2769]: W0819 00:17:52.212291 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.212362 kubelet[2769]: E0819 00:17:52.212316 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.213845 kubelet[2769]: E0819 00:17:52.213817 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.214347 kubelet[2769]: W0819 00:17:52.214085 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.214347 kubelet[2769]: E0819 00:17:52.214114 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.216307 kubelet[2769]: E0819 00:17:52.216282 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.216886 kubelet[2769]: W0819 00:17:52.216560 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.217117 kubelet[2769]: E0819 00:17:52.216975 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.219289 kubelet[2769]: E0819 00:17:52.218994 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.219289 kubelet[2769]: W0819 00:17:52.219177 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.219289 kubelet[2769]: E0819 00:17:52.219196 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.224202 kubelet[2769]: E0819 00:17:52.224157 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.224628 kubelet[2769]: W0819 00:17:52.224183 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.224628 kubelet[2769]: E0819 00:17:52.224498 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.225849 kubelet[2769]: E0819 00:17:52.225721 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.228249 kubelet[2769]: W0819 00:17:52.226250 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.229426 kubelet[2769]: E0819 00:17:52.229400 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.230205 kubelet[2769]: E0819 00:17:52.230036 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.230205 kubelet[2769]: W0819 00:17:52.230057 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.230205 kubelet[2769]: E0819 00:17:52.230074 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.231320 kubelet[2769]: E0819 00:17:52.231168 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.231320 kubelet[2769]: W0819 00:17:52.231228 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.231320 kubelet[2769]: E0819 00:17:52.231247 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.232452 kubelet[2769]: E0819 00:17:52.232277 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.232452 kubelet[2769]: W0819 00:17:52.232295 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.232452 kubelet[2769]: E0819 00:17:52.232313 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.232867 kubelet[2769]: E0819 00:17:52.232700 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.232867 kubelet[2769]: W0819 00:17:52.232718 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.232867 kubelet[2769]: E0819 00:17:52.232742 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.233431 kubelet[2769]: E0819 00:17:52.233259 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.233431 kubelet[2769]: W0819 00:17:52.233311 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.233431 kubelet[2769]: E0819 00:17:52.233328 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.234622 kubelet[2769]: E0819 00:17:52.234384 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.234622 kubelet[2769]: W0819 00:17:52.234405 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.234622 kubelet[2769]: E0819 00:17:52.234598 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.235236 kubelet[2769]: E0819 00:17:52.235216 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.235236 kubelet[2769]: W0819 00:17:52.235232 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.235327 kubelet[2769]: E0819 00:17:52.235245 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.235413 kubelet[2769]: E0819 00:17:52.235400 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.235413 kubelet[2769]: W0819 00:17:52.235411 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.235505 kubelet[2769]: E0819 00:17:52.235419 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.236358 kubelet[2769]: E0819 00:17:52.236224 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.236358 kubelet[2769]: W0819 00:17:52.236251 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.236358 kubelet[2769]: E0819 00:17:52.236270 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.236464 kubelet[2769]: E0819 00:17:52.236415 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.236464 kubelet[2769]: W0819 00:17:52.236422 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.236464 kubelet[2769]: E0819 00:17:52.236429 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.237037 kubelet[2769]: E0819 00:17:52.236971 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.237037 kubelet[2769]: W0819 00:17:52.237000 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.237037 kubelet[2769]: E0819 00:17:52.237011 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.237778 kubelet[2769]: E0819 00:17:52.237616 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.237778 kubelet[2769]: W0819 00:17:52.237637 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.237778 kubelet[2769]: E0819 00:17:52.237648 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.238321 kubelet[2769]: E0819 00:17:52.237988 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.238321 kubelet[2769]: W0819 00:17:52.238004 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.238321 kubelet[2769]: E0819 00:17:52.238015 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.238321 kubelet[2769]: I0819 00:17:52.238036 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f3224d19-d7a0-4294-80f2-5149b99ec962-kubelet-dir\") pod \"csi-node-driver-tc6qm\" (UID: \"f3224d19-d7a0-4294-80f2-5149b99ec962\") " pod="calico-system/csi-node-driver-tc6qm" Aug 19 00:17:52.239155 kubelet[2769]: E0819 00:17:52.238655 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.239155 kubelet[2769]: W0819 00:17:52.238679 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.239155 kubelet[2769]: E0819 00:17:52.238693 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.239155 kubelet[2769]: I0819 00:17:52.238713 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f3224d19-d7a0-4294-80f2-5149b99ec962-registration-dir\") pod \"csi-node-driver-tc6qm\" (UID: \"f3224d19-d7a0-4294-80f2-5149b99ec962\") " pod="calico-system/csi-node-driver-tc6qm" Aug 19 00:17:52.240805 kubelet[2769]: E0819 00:17:52.240772 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.240805 kubelet[2769]: W0819 00:17:52.240795 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.241078 kubelet[2769]: E0819 00:17:52.240813 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.241078 kubelet[2769]: I0819 00:17:52.240862 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/f3224d19-d7a0-4294-80f2-5149b99ec962-varrun\") pod \"csi-node-driver-tc6qm\" (UID: \"f3224d19-d7a0-4294-80f2-5149b99ec962\") " pod="calico-system/csi-node-driver-tc6qm" Aug 19 00:17:52.242283 kubelet[2769]: E0819 00:17:52.242072 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.242283 kubelet[2769]: W0819 00:17:52.242092 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.242283 kubelet[2769]: E0819 00:17:52.242108 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.244355 kubelet[2769]: E0819 00:17:52.244202 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.244355 kubelet[2769]: W0819 00:17:52.244236 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.244355 kubelet[2769]: E0819 00:17:52.244256 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.245641 kubelet[2769]: E0819 00:17:52.245619 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.245895 kubelet[2769]: W0819 00:17:52.245740 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.245895 kubelet[2769]: E0819 00:17:52.245767 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.248969 kubelet[2769]: E0819 00:17:52.248567 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.248969 kubelet[2769]: W0819 00:17:52.248592 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.248969 kubelet[2769]: E0819 00:17:52.248612 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.248969 kubelet[2769]: I0819 00:17:52.248692 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7phk4\" (UniqueName: \"kubernetes.io/projected/f3224d19-d7a0-4294-80f2-5149b99ec962-kube-api-access-7phk4\") pod \"csi-node-driver-tc6qm\" (UID: \"f3224d19-d7a0-4294-80f2-5149b99ec962\") " pod="calico-system/csi-node-driver-tc6qm" Aug 19 00:17:52.248969 kubelet[2769]: E0819 00:17:52.248859 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.248969 kubelet[2769]: W0819 00:17:52.248869 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.248969 kubelet[2769]: E0819 00:17:52.248879 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.249549 kubelet[2769]: E0819 00:17:52.249467 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.249549 kubelet[2769]: W0819 00:17:52.249483 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.249950 kubelet[2769]: E0819 00:17:52.249750 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.250312 kubelet[2769]: E0819 00:17:52.250125 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.250312 kubelet[2769]: W0819 00:17:52.250266 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.250312 kubelet[2769]: E0819 00:17:52.250284 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.251228 kubelet[2769]: E0819 00:17:52.250872 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.251228 kubelet[2769]: W0819 00:17:52.250891 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.251228 kubelet[2769]: E0819 00:17:52.250904 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.251431 kubelet[2769]: E0819 00:17:52.251402 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.251431 kubelet[2769]: W0819 00:17:52.251419 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.251952 kubelet[2769]: E0819 00:17:52.251432 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.251952 kubelet[2769]: E0819 00:17:52.251922 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.251952 kubelet[2769]: W0819 00:17:52.251939 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.251952 kubelet[2769]: E0819 00:17:52.251952 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.252772 kubelet[2769]: I0819 00:17:52.251981 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f3224d19-d7a0-4294-80f2-5149b99ec962-socket-dir\") pod \"csi-node-driver-tc6qm\" (UID: \"f3224d19-d7a0-4294-80f2-5149b99ec962\") " pod="calico-system/csi-node-driver-tc6qm" Aug 19 00:17:52.253217 kubelet[2769]: E0819 00:17:52.253171 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.253217 kubelet[2769]: W0819 00:17:52.253199 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.253217 kubelet[2769]: E0819 00:17:52.253218 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.253432 kubelet[2769]: E0819 00:17:52.253391 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.253432 kubelet[2769]: W0819 00:17:52.253408 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.253432 kubelet[2769]: E0819 00:17:52.253420 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.254542 containerd[1513]: time="2025-08-19T00:17:52.254492673Z" level=info msg="connecting to shim edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf" address="unix:///run/containerd/s/622a4c40d5f7d2eb1e752e49b0f00af816af9d7f5c43dd2528bdbeabbf95a588" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:17:52.302049 systemd[1]: Started cri-containerd-edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf.scope - libcontainer container edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf. Aug 19 00:17:52.333322 containerd[1513]: time="2025-08-19T00:17:52.333077025Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7c665f78b7-2n2hg,Uid:e16290fe-93a5-4be4-8c7a-afd612b13123,Namespace:calico-system,Attempt:0,} returns sandbox id \"8d0dcf85a478667e1d312899e5c6b2034aaf95a6541eb2bb56a5f00406d11ec8\"" Aug 19 00:17:52.336716 containerd[1513]: time="2025-08-19T00:17:52.336678448Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\"" Aug 19 00:17:52.354629 kubelet[2769]: E0819 00:17:52.354448 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.354629 kubelet[2769]: W0819 00:17:52.354545 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.355967 kubelet[2769]: E0819 00:17:52.355475 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.357088 kubelet[2769]: E0819 00:17:52.357068 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.357228 kubelet[2769]: W0819 00:17:52.357211 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.357296 kubelet[2769]: E0819 00:17:52.357284 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.357856 kubelet[2769]: E0819 00:17:52.357836 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.358165 kubelet[2769]: W0819 00:17:52.357942 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.358165 kubelet[2769]: E0819 00:17:52.357963 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.358663 kubelet[2769]: E0819 00:17:52.358533 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.358663 kubelet[2769]: W0819 00:17:52.358548 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.358960 kubelet[2769]: E0819 00:17:52.358944 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.361579 kubelet[2769]: E0819 00:17:52.361399 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.361579 kubelet[2769]: W0819 00:17:52.361420 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.361579 kubelet[2769]: E0819 00:17:52.361439 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.362034 kubelet[2769]: E0819 00:17:52.361805 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.362034 kubelet[2769]: W0819 00:17:52.361819 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.362034 kubelet[2769]: E0819 00:17:52.361831 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.362837 kubelet[2769]: E0819 00:17:52.362643 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.362837 kubelet[2769]: W0819 00:17:52.362659 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.362837 kubelet[2769]: E0819 00:17:52.362671 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.363221 kubelet[2769]: E0819 00:17:52.363205 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.363295 kubelet[2769]: W0819 00:17:52.363283 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.363897 kubelet[2769]: E0819 00:17:52.363671 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.364262 kubelet[2769]: E0819 00:17:52.364245 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.364509 kubelet[2769]: W0819 00:17:52.364412 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.364509 kubelet[2769]: E0819 00:17:52.364430 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.365263 kubelet[2769]: E0819 00:17:52.365247 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.365484 kubelet[2769]: W0819 00:17:52.365360 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.365484 kubelet[2769]: E0819 00:17:52.365465 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.366492 kubelet[2769]: E0819 00:17:52.366417 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.366492 kubelet[2769]: W0819 00:17:52.366433 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.366492 kubelet[2769]: E0819 00:17:52.366446 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.367331 kubelet[2769]: E0819 00:17:52.367287 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.367331 kubelet[2769]: W0819 00:17:52.367300 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.367331 kubelet[2769]: E0819 00:17:52.367312 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.368445 kubelet[2769]: E0819 00:17:52.368302 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.368445 kubelet[2769]: W0819 00:17:52.368321 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.368445 kubelet[2769]: E0819 00:17:52.368334 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.369223 kubelet[2769]: E0819 00:17:52.369205 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.369494 kubelet[2769]: W0819 00:17:52.369459 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.369494 kubelet[2769]: E0819 00:17:52.369480 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.370670 kubelet[2769]: E0819 00:17:52.370402 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.370670 kubelet[2769]: W0819 00:17:52.370418 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.370670 kubelet[2769]: E0819 00:17:52.370430 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.371329 kubelet[2769]: E0819 00:17:52.371254 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.371329 kubelet[2769]: W0819 00:17:52.371291 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.371329 kubelet[2769]: E0819 00:17:52.371316 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.372173 kubelet[2769]: E0819 00:17:52.372107 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.372498 kubelet[2769]: W0819 00:17:52.372415 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.372498 kubelet[2769]: E0819 00:17:52.372433 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.373730 kubelet[2769]: E0819 00:17:52.373656 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.373730 kubelet[2769]: W0819 00:17:52.373671 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.373730 kubelet[2769]: E0819 00:17:52.373683 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.374481 kubelet[2769]: E0819 00:17:52.374432 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.374481 kubelet[2769]: W0819 00:17:52.374446 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.374481 kubelet[2769]: E0819 00:17:52.374459 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.375476 kubelet[2769]: E0819 00:17:52.375407 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.375476 kubelet[2769]: W0819 00:17:52.375421 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.375476 kubelet[2769]: E0819 00:17:52.375433 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.376056 kubelet[2769]: E0819 00:17:52.376039 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.376182 kubelet[2769]: W0819 00:17:52.376123 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.376182 kubelet[2769]: E0819 00:17:52.376160 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.377421 kubelet[2769]: E0819 00:17:52.377285 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.377421 kubelet[2769]: W0819 00:17:52.377304 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.377421 kubelet[2769]: E0819 00:17:52.377316 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.377844 kubelet[2769]: E0819 00:17:52.377786 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.377844 kubelet[2769]: W0819 00:17:52.377810 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.377844 kubelet[2769]: E0819 00:17:52.377822 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:52.378672 kubelet[2769]: E0819 00:17:52.378647 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.379172 kubelet[2769]: W0819 00:17:52.379102 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.379436 kubelet[2769]: E0819 00:17:52.379260 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.379680 kubelet[2769]: E0819 00:17:52.379666 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.379769 kubelet[2769]: W0819 00:17:52.379754 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.379871 kubelet[2769]: E0819 00:17:52.379834 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:52.418878 containerd[1513]: time="2025-08-19T00:17:52.418759950Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wwv6v,Uid:3e73fe59-04cf-4fe7-9888-7f3bfe4a14d1,Namespace:calico-system,Attempt:0,} returns sandbox id \"edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf\"" Aug 19 00:17:52.427499 kubelet[2769]: E0819 00:17:52.427398 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:52.427499 kubelet[2769]: W0819 00:17:52.427423 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:52.427499 kubelet[2769]: E0819 00:17:52.427447 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:53.891402 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1989435872.mount: Deactivated successfully. 
Aug 19 00:17:54.213029 kubelet[2769]: E0819 00:17:54.212871 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tc6qm" podUID="f3224d19-d7a0-4294-80f2-5149b99ec962" Aug 19 00:17:54.998252 containerd[1513]: time="2025-08-19T00:17:54.998183229Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:55.000874 containerd[1513]: time="2025-08-19T00:17:55.000814201Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.2: active requests=0, bytes read=33087207" Aug 19 00:17:55.003020 containerd[1513]: time="2025-08-19T00:17:55.002940924Z" level=info msg="ImageCreate event name:\"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:55.007219 containerd[1513]: time="2025-08-19T00:17:55.007107536Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:55.008790 containerd[1513]: time="2025-08-19T00:17:55.008722808Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.2\" with image id \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:da29d745efe5eb7d25f765d3aa439f3fe60710a458efe39c285e58b02bd961af\", size \"33087061\" in 2.671777975s" Aug 19 00:17:55.008790 containerd[1513]: time="2025-08-19T00:17:55.008773285Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.2\" returns image reference \"sha256:bd819526ff844d29b60cd75e846a1f55306016ff269d881d52a9b6c7b2eef0b2\"" Aug 19 00:17:55.010969 containerd[1513]: time="2025-08-19T00:17:55.010889209Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\"" Aug 19 00:17:55.033454 containerd[1513]: time="2025-08-19T00:17:55.033325623Z" level=info msg="CreateContainer within sandbox \"8d0dcf85a478667e1d312899e5c6b2034aaf95a6541eb2bb56a5f00406d11ec8\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Aug 19 00:17:55.048028 containerd[1513]: time="2025-08-19T00:17:55.047434891Z" level=info msg="Container fe3a69408416b5a60293c36e15dcea8cdf685f091724f1f3bb6ba657aef2ff6e: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:17:55.055657 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2374355557.mount: Deactivated successfully. 
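Editor's note on the timing figures above and in the pod_startup_latency_tracker entry a few lines below: the PullImage request for ghcr.io/flatcar/calico/typha:v3.30.2 is logged at 00:17:52.336678448 and the "Pulled image" completion at 00:17:55.008722808, and the 2.671777975 s containerd reports lines up with that ~2.672 s wall-clock gap. The kubelet's tracker entry at 00:17:55.368888554 then subtracts its own view of the same pull (the firstStartedPulling to lastFinishedPulling window, 2.673600753 s) from podStartE2EDuration (4.368888554 s), which reproduces the logged podStartSLOduration of 1.695287801 s. The snippet below is illustrative arithmetic only, not kubelet code, using the m=+ monotonic offsets copied verbatim from that entry.

```python
# Reproduce the startup figures from the pod_startup_latency_tracker entry for
# calico-typha-7c665f78b7-2n2hg (offsets copied verbatim from the log line).
first_started_pulling = 24.292764850   # firstStartedPulling  (m=+ seconds)
last_finished_pulling = 26.966365603   # lastFinishedPulling  (m=+ seconds)
pod_start_e2e = 4.368888554            # podStartE2EDuration  (seconds)

pull_window = last_finished_pulling - first_started_pulling   # 2.673600753 s spent pulling images
slo_duration = pod_start_e2e - pull_window                    # 1.695287801 s == podStartSLOduration

print(f"image pull window  : {pull_window:.9f} s")
print(f"podStartSLOduration: {slo_duration:.9f} s")
```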
Aug 19 00:17:55.061812 containerd[1513]: time="2025-08-19T00:17:55.061703711Z" level=info msg="CreateContainer within sandbox \"8d0dcf85a478667e1d312899e5c6b2034aaf95a6541eb2bb56a5f00406d11ec8\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"fe3a69408416b5a60293c36e15dcea8cdf685f091724f1f3bb6ba657aef2ff6e\"" Aug 19 00:17:55.064544 containerd[1513]: time="2025-08-19T00:17:55.064422122Z" level=info msg="StartContainer for \"fe3a69408416b5a60293c36e15dcea8cdf685f091724f1f3bb6ba657aef2ff6e\"" Aug 19 00:17:55.067383 containerd[1513]: time="2025-08-19T00:17:55.067107575Z" level=info msg="connecting to shim fe3a69408416b5a60293c36e15dcea8cdf685f091724f1f3bb6ba657aef2ff6e" address="unix:///run/containerd/s/c32d46ecd096de946424ac665bda71bfa3573be7b1d9d0b2ac8f91223353ae94" protocol=ttrpc version=3 Aug 19 00:17:55.100625 systemd[1]: Started cri-containerd-fe3a69408416b5a60293c36e15dcea8cdf685f091724f1f3bb6ba657aef2ff6e.scope - libcontainer container fe3a69408416b5a60293c36e15dcea8cdf685f091724f1f3bb6ba657aef2ff6e. Aug 19 00:17:55.165960 containerd[1513]: time="2025-08-19T00:17:55.165909773Z" level=info msg="StartContainer for \"fe3a69408416b5a60293c36e15dcea8cdf685f091724f1f3bb6ba657aef2ff6e\" returns successfully" Aug 19 00:17:55.362455 kubelet[2769]: E0819 00:17:55.362394 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.362455 kubelet[2769]: W0819 00:17:55.362421 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.362455 kubelet[2769]: E0819 00:17:55.362442 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.363503 kubelet[2769]: E0819 00:17:55.363074 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.363503 kubelet[2769]: W0819 00:17:55.363091 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.363503 kubelet[2769]: E0819 00:17:55.363152 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.363503 kubelet[2769]: E0819 00:17:55.363335 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.363503 kubelet[2769]: W0819 00:17:55.363343 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.363503 kubelet[2769]: E0819 00:17:55.363352 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:55.364608 kubelet[2769]: E0819 00:17:55.364496 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.364608 kubelet[2769]: W0819 00:17:55.364521 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.364608 kubelet[2769]: E0819 00:17:55.364595 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.366613 kubelet[2769]: E0819 00:17:55.366555 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.366613 kubelet[2769]: W0819 00:17:55.366579 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.366613 kubelet[2769]: E0819 00:17:55.366599 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.367029 kubelet[2769]: E0819 00:17:55.366797 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.367029 kubelet[2769]: W0819 00:17:55.366805 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.367029 kubelet[2769]: E0819 00:17:55.366814 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.368288 kubelet[2769]: E0819 00:17:55.368146 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.368626 kubelet[2769]: W0819 00:17:55.368364 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.368626 kubelet[2769]: E0819 00:17:55.368389 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:55.369142 kubelet[2769]: I0819 00:17:55.368908 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7c665f78b7-2n2hg" podStartSLOduration=1.695287801 podStartE2EDuration="4.368888554s" podCreationTimestamp="2025-08-19 00:17:51 +0000 UTC" firstStartedPulling="2025-08-19 00:17:52.336382746 +0000 UTC m=+24.292764850" lastFinishedPulling="2025-08-19 00:17:55.009983419 +0000 UTC m=+26.966365603" observedRunningTime="2025-08-19 00:17:55.365770124 +0000 UTC m=+27.322152228" watchObservedRunningTime="2025-08-19 00:17:55.368888554 +0000 UTC m=+27.325270738" Aug 19 00:17:55.369324 kubelet[2769]: E0819 00:17:55.369303 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.369690 kubelet[2769]: W0819 00:17:55.369378 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.369690 kubelet[2769]: E0819 00:17:55.369396 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.369997 kubelet[2769]: E0819 00:17:55.369968 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.370120 kubelet[2769]: W0819 00:17:55.370092 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.370896 kubelet[2769]: E0819 00:17:55.370877 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.371634 kubelet[2769]: E0819 00:17:55.371453 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.371634 kubelet[2769]: W0819 00:17:55.371469 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.371634 kubelet[2769]: E0819 00:17:55.371484 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.371812 kubelet[2769]: E0819 00:17:55.371801 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.374229 kubelet[2769]: W0819 00:17:55.374192 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.374435 kubelet[2769]: E0819 00:17:55.374300 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:55.375426 kubelet[2769]: E0819 00:17:55.375363 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.375426 kubelet[2769]: W0819 00:17:55.375383 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.375426 kubelet[2769]: E0819 00:17:55.375401 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.375890 kubelet[2769]: E0819 00:17:55.375830 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.375890 kubelet[2769]: W0819 00:17:55.375845 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.375890 kubelet[2769]: E0819 00:17:55.375857 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.376305 kubelet[2769]: E0819 00:17:55.376273 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.376305 kubelet[2769]: W0819 00:17:55.376287 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.376451 kubelet[2769]: E0819 00:17:55.376400 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.376668 kubelet[2769]: E0819 00:17:55.376656 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.376816 kubelet[2769]: W0819 00:17:55.376743 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.376816 kubelet[2769]: E0819 00:17:55.376758 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.392921 kubelet[2769]: E0819 00:17:55.392883 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.392921 kubelet[2769]: W0819 00:17:55.392910 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.392921 kubelet[2769]: E0819 00:17:55.392934 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:55.393282 kubelet[2769]: E0819 00:17:55.393265 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.393282 kubelet[2769]: W0819 00:17:55.393281 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.394225 kubelet[2769]: E0819 00:17:55.393292 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.394225 kubelet[2769]: E0819 00:17:55.393695 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.394225 kubelet[2769]: W0819 00:17:55.393708 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.394225 kubelet[2769]: E0819 00:17:55.393719 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.394225 kubelet[2769]: E0819 00:17:55.394208 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.394225 kubelet[2769]: W0819 00:17:55.394220 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.394482 kubelet[2769]: E0819 00:17:55.394232 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.394603 kubelet[2769]: E0819 00:17:55.394582 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.394603 kubelet[2769]: W0819 00:17:55.394600 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.394706 kubelet[2769]: E0819 00:17:55.394613 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.395293 kubelet[2769]: E0819 00:17:55.395274 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.395293 kubelet[2769]: W0819 00:17:55.395290 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.395426 kubelet[2769]: E0819 00:17:55.395304 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:55.395628 kubelet[2769]: E0819 00:17:55.395609 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.395698 kubelet[2769]: W0819 00:17:55.395686 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.395757 kubelet[2769]: E0819 00:17:55.395745 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.396103 kubelet[2769]: E0819 00:17:55.396084 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.396825 kubelet[2769]: W0819 00:17:55.396798 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.396920 kubelet[2769]: E0819 00:17:55.396905 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.397232 kubelet[2769]: E0819 00:17:55.397218 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.397371 kubelet[2769]: W0819 00:17:55.397304 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.397371 kubelet[2769]: E0819 00:17:55.397321 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.397711 kubelet[2769]: E0819 00:17:55.397587 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.397711 kubelet[2769]: W0819 00:17:55.397605 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.397711 kubelet[2769]: E0819 00:17:55.397616 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Aug 19 00:17:55.399183 kubelet[2769]: E0819 00:17:55.398231 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.399504 kubelet[2769]: W0819 00:17:55.399293 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.399504 kubelet[2769]: E0819 00:17:55.399321 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Aug 19 00:17:55.399809 kubelet[2769]: E0819 00:17:55.399794 2769 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Aug 19 00:17:55.399955 kubelet[2769]: W0819 00:17:55.399878 2769 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Aug 19 00:17:55.399955 kubelet[2769]: E0819 00:17:55.399897 2769 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
[the driver-call.go:262 / driver-call.go:149 / plugins.go:703 triplet above, for the nodeagent~uds FlexVolume directory, repeats with only the timestamps changing roughly six more times between Aug 19 00:17:55.400 and 00:17:55.405]
Aug 19 00:17:56.213161 kubelet[2769]: E0819 00:17:56.213071 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tc6qm" podUID="f3224d19-d7a0-4294-80f2-5149b99ec962"
Aug 19 00:17:56.333334 kubelet[2769]: I0819 00:17:56.332458 2769 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
[the same FlexVolume probe triplet then repeats roughly three dozen more times between Aug 19 00:17:56.384 and 00:17:56.408]
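The three-way error pattern above is a single failure reported at three call sites: the kubelet's FlexVolume prober execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds init, the binary does not exist on this node, so the call produces no stdout, and decoding that empty output as the driver's JSON status fails with "unexpected end of JSON input". A minimal Go sketch of the failure mode, assuming an illustrative driverStatus type rather than the kubelet's actual one:

    package main

    import (
        "encoding/json"
        "fmt"
        "os/exec"
    )

    // driverStatus stands in for the JSON status object a FlexVolume
    // driver is expected to print on stdout for the "init" call.
    type driverStatus struct {
        Status  string `json:"status"`
        Message string `json:"message,omitempty"`
    }

    func main() {
        // Call the driver the way the prober does; on this node the binary
        // is missing, so err is non-nil and out stays empty.
        out, err := exec.Command(
            "/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds",
            "init",
        ).Output()
        if err != nil {
            fmt.Println("FlexVolume: driver call failed:", err)
        }

        // Unmarshalling the empty output reproduces the driver-call.go:262 line.
        var st driverStatus
        if err := json.Unmarshal(out, &st); err != nil {
            fmt.Println("Failed to unmarshal output for command: init:", err)
        }
    }

The flexvol-driver container created just below is what normally installs a driver under that nodeagent~uds directory, which is presumably why the triplet does not recur after 00:17:57 in this log.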
Aug 19 00:17:57.281298 containerd[1513]: time="2025-08-19T00:17:57.281202523Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:57.283703 containerd[1513]: time="2025-08-19T00:17:57.283623319Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2: active requests=0, bytes read=4266981" Aug 19 00:17:57.284763 containerd[1513]: time="2025-08-19T00:17:57.284639387Z" level=info msg="ImageCreate event name:\"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:57.289384 containerd[1513]: time="2025-08-19T00:17:57.289306708Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:17:57.291565 containerd[1513]: time="2025-08-19T00:17:57.291500395Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" with image id \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:972be127eaecd7d1a2d5393b8d14f1ae8f88550bee83e0519e9590c7e15eb41b\", size \"5636182\" in 2.280570188s" Aug 19 00:17:57.291747 containerd[1513]: time="2025-08-19T00:17:57.291727983Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.2\" returns image reference \"sha256:53f638101e3d73f7dd5e42dc42fb3d94ae1978e8958677222c3de6ec1d8c3d4f\"" Aug 19 00:17:57.297367 containerd[1513]: time="2025-08-19T00:17:57.297298378Z" level=info msg="CreateContainer within sandbox \"edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Aug 19 00:17:57.311306 containerd[1513]: time="2025-08-19T00:17:57.311256701Z" level=info msg="Container 97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:17:57.331521 containerd[1513]: time="2025-08-19T00:17:57.331458425Z" level=info msg="CreateContainer within sandbox \"edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1\"" Aug 19 00:17:57.332840 containerd[1513]: time="2025-08-19T00:17:57.332542129Z" level=info msg="StartContainer for \"97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1\"" Aug 19 00:17:57.337013 containerd[1513]: time="2025-08-19T00:17:57.335939195Z" level=info msg="connecting to
shim 97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1" address="unix:///run/containerd/s/622a4c40d5f7d2eb1e752e49b0f00af816af9d7f5c43dd2528bdbeabbf95a588" protocol=ttrpc version=3 Aug 19 00:17:57.366373 systemd[1]: Started cri-containerd-97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1.scope - libcontainer container 97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1. Aug 19 00:17:57.419971 containerd[1513]: time="2025-08-19T00:17:57.419916566Z" level=info msg="StartContainer for \"97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1\" returns successfully" Aug 19 00:17:57.431633 systemd[1]: cri-containerd-97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1.scope: Deactivated successfully. Aug 19 00:17:57.436696 containerd[1513]: time="2025-08-19T00:17:57.436558953Z" level=info msg="received exit event container_id:\"97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1\" id:\"97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1\" pid:3480 exited_at:{seconds:1755562677 nanos:435798312}" Aug 19 00:17:57.436933 containerd[1513]: time="2025-08-19T00:17:57.436879016Z" level=info msg="TaskExit event in podsandbox handler container_id:\"97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1\" id:\"97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1\" pid:3480 exited_at:{seconds:1755562677 nanos:435798312}" Aug 19 00:17:57.465982 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-97fbd8aa2251b11ad24b9e281b6a58cc5117ec70c35e9319278283759a13cbe1-rootfs.mount: Deactivated successfully. Aug 19 00:17:58.214630 kubelet[2769]: E0819 00:17:58.214466 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tc6qm" podUID="f3224d19-d7a0-4294-80f2-5149b99ec962" Aug 19 00:17:58.348851 containerd[1513]: time="2025-08-19T00:17:58.348631717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\"" Aug 19 00:18:00.215110 kubelet[2769]: E0819 00:18:00.215047 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tc6qm" podUID="f3224d19-d7a0-4294-80f2-5149b99ec962" Aug 19 00:18:02.055523 containerd[1513]: time="2025-08-19T00:18:02.055435061Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:02.057085 containerd[1513]: time="2025-08-19T00:18:02.057004512Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.2: active requests=0, bytes read=65888320" Aug 19 00:18:02.058511 containerd[1513]: time="2025-08-19T00:18:02.058414050Z" level=info msg="ImageCreate event name:\"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:02.061867 containerd[1513]: time="2025-08-19T00:18:02.061783663Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:02.063005 containerd[1513]: 
time="2025-08-19T00:18:02.062937413Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.2\" with image id \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:50686775cc60acb78bd92a66fa2d84e1700b2d8e43a718fbadbf35e59baefb4d\", size \"67257561\" in 3.714144624s" Aug 19 00:18:02.063005 containerd[1513]: time="2025-08-19T00:18:02.062980491Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.2\" returns image reference \"sha256:f6e344d58b3c5524e767c7d1dd4cb29c85ce820b0f3005a603532b6a22db5588\"" Aug 19 00:18:02.069961 containerd[1513]: time="2025-08-19T00:18:02.069890948Z" level=info msg="CreateContainer within sandbox \"edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Aug 19 00:18:02.084374 containerd[1513]: time="2025-08-19T00:18:02.083554711Z" level=info msg="Container 251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:02.100172 containerd[1513]: time="2025-08-19T00:18:02.100084107Z" level=info msg="CreateContainer within sandbox \"edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f\"" Aug 19 00:18:02.101295 containerd[1513]: time="2025-08-19T00:18:02.101041906Z" level=info msg="StartContainer for \"251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f\"" Aug 19 00:18:02.103658 containerd[1513]: time="2025-08-19T00:18:02.103553316Z" level=info msg="connecting to shim 251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f" address="unix:///run/containerd/s/622a4c40d5f7d2eb1e752e49b0f00af816af9d7f5c43dd2528bdbeabbf95a588" protocol=ttrpc version=3 Aug 19 00:18:02.133507 systemd[1]: Started cri-containerd-251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f.scope - libcontainer container 251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f. Aug 19 00:18:02.186719 containerd[1513]: time="2025-08-19T00:18:02.186670679Z" level=info msg="StartContainer for \"251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f\" returns successfully" Aug 19 00:18:02.214587 kubelet[2769]: E0819 00:18:02.214499 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tc6qm" podUID="f3224d19-d7a0-4294-80f2-5149b99ec962" Aug 19 00:18:02.793638 containerd[1513]: time="2025-08-19T00:18:02.793557289Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Aug 19 00:18:02.796984 systemd[1]: cri-containerd-251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f.scope: Deactivated successfully. Aug 19 00:18:02.797489 systemd[1]: cri-containerd-251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f.scope: Consumed 556ms CPU time, 190.7M memory peak, 165.8M written to disk. 
Aug 19 00:18:02.800796 containerd[1513]: time="2025-08-19T00:18:02.800595261Z" level=info msg="received exit event container_id:\"251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f\" id:\"251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f\" pid:3538 exited_at:{seconds:1755562682 nanos:800053885}" Aug 19 00:18:02.801958 containerd[1513]: time="2025-08-19T00:18:02.801116838Z" level=info msg="TaskExit event in podsandbox handler container_id:\"251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f\" id:\"251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f\" pid:3538 exited_at:{seconds:1755562682 nanos:800053885}" Aug 19 00:18:02.834847 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-251b05b743e0a59dee678c73a5f632a577bd9e094b1bd2ab1021071382b68f8f-rootfs.mount: Deactivated successfully. Aug 19 00:18:02.835716 kubelet[2769]: I0819 00:18:02.835543 2769 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Aug 19 00:18:02.904327 kubelet[2769]: E0819 00:18:02.903797 2769 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"coredns\" is forbidden: User \"system:node:ci-4426-0-0-8-661ee896d9\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4426-0-0-8-661ee896d9' and this object" logger="UnhandledError" reflector="object-\"kube-system\"/\"coredns\"" type="*v1.ConfigMap" Aug 19 00:18:02.904327 kubelet[2769]: I0819 00:18:02.903867 2769 status_manager.go:895] "Failed to get status for pod" podUID="1e621243-8897-4395-9a28-32253cceeffb" pod="kube-system/coredns-674b8bbfcf-gzg9f" err="pods \"coredns-674b8bbfcf-gzg9f\" is forbidden: User \"system:node:ci-4426-0-0-8-661ee896d9\" cannot get resource \"pods\" in API group \"\" in the namespace \"kube-system\": no relationship found between node 'ci-4426-0-0-8-661ee896d9' and this object" Aug 19 00:18:02.914919 systemd[1]: Created slice kubepods-burstable-pod1e621243_8897_4395_9a28_32253cceeffb.slice - libcontainer container kubepods-burstable-pod1e621243_8897_4395_9a28_32253cceeffb.slice. Aug 19 00:18:02.932850 systemd[1]: Created slice kubepods-besteffort-pod27ede6b7_98b7_419b_84aa_572c3e073e65.slice - libcontainer container kubepods-besteffort-pod27ede6b7_98b7_419b_84aa_572c3e073e65.slice. Aug 19 00:18:02.946718 systemd[1]: Created slice kubepods-besteffort-pod311af456_51ba_4192_9f4f_8560e7648f9f.slice - libcontainer container kubepods-besteffort-pod311af456_51ba_4192_9f4f_8560e7648f9f.slice. 
Aug 19 00:18:02.953515 kubelet[2769]: I0819 00:18:02.952800 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c01e7ff3-6abc-4602-b953-fe846dc88f55-config\") pod \"goldmane-768f4c5c69-tp2sl\" (UID: \"c01e7ff3-6abc-4602-b953-fe846dc88f55\") " pod="calico-system/goldmane-768f4c5c69-tp2sl" Aug 19 00:18:02.953515 kubelet[2769]: I0819 00:18:02.952847 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wxjn\" (UniqueName: \"kubernetes.io/projected/c01e7ff3-6abc-4602-b953-fe846dc88f55-kube-api-access-9wxjn\") pod \"goldmane-768f4c5c69-tp2sl\" (UID: \"c01e7ff3-6abc-4602-b953-fe846dc88f55\") " pod="calico-system/goldmane-768f4c5c69-tp2sl" Aug 19 00:18:02.953515 kubelet[2769]: I0819 00:18:02.952866 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5hmk\" (UniqueName: \"kubernetes.io/projected/1e621243-8897-4395-9a28-32253cceeffb-kube-api-access-q5hmk\") pod \"coredns-674b8bbfcf-gzg9f\" (UID: \"1e621243-8897-4395-9a28-32253cceeffb\") " pod="kube-system/coredns-674b8bbfcf-gzg9f" Aug 19 00:18:02.953515 kubelet[2769]: I0819 00:18:02.952884 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311af456-51ba-4192-9f4f-8560e7648f9f-whisker-ca-bundle\") pod \"whisker-7f9f9ff74b-qh8wr\" (UID: \"311af456-51ba-4192-9f4f-8560e7648f9f\") " pod="calico-system/whisker-7f9f9ff74b-qh8wr" Aug 19 00:18:02.953515 kubelet[2769]: I0819 00:18:02.952902 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9h8xk\" (UniqueName: \"kubernetes.io/projected/e16b483e-f67c-4447-aac6-6ca42f039a20-kube-api-access-9h8xk\") pod \"coredns-674b8bbfcf-zqxrl\" (UID: \"e16b483e-f67c-4447-aac6-6ca42f039a20\") " pod="kube-system/coredns-674b8bbfcf-zqxrl" Aug 19 00:18:02.953741 kubelet[2769]: I0819 00:18:02.952922 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/c01e7ff3-6abc-4602-b953-fe846dc88f55-goldmane-key-pair\") pod \"goldmane-768f4c5c69-tp2sl\" (UID: \"c01e7ff3-6abc-4602-b953-fe846dc88f55\") " pod="calico-system/goldmane-768f4c5c69-tp2sl" Aug 19 00:18:02.953741 kubelet[2769]: I0819 00:18:02.952937 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb6h7\" (UniqueName: \"kubernetes.io/projected/311af456-51ba-4192-9f4f-8560e7648f9f-kube-api-access-vb6h7\") pod \"whisker-7f9f9ff74b-qh8wr\" (UID: \"311af456-51ba-4192-9f4f-8560e7648f9f\") " pod="calico-system/whisker-7f9f9ff74b-qh8wr" Aug 19 00:18:02.953741 kubelet[2769]: I0819 00:18:02.952955 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/311af456-51ba-4192-9f4f-8560e7648f9f-whisker-backend-key-pair\") pod \"whisker-7f9f9ff74b-qh8wr\" (UID: \"311af456-51ba-4192-9f4f-8560e7648f9f\") " pod="calico-system/whisker-7f9f9ff74b-qh8wr" Aug 19 00:18:02.953741 kubelet[2769]: I0819 00:18:02.952969 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e16b483e-f67c-4447-aac6-6ca42f039a20-config-volume\") pod 
\"coredns-674b8bbfcf-zqxrl\" (UID: \"e16b483e-f67c-4447-aac6-6ca42f039a20\") " pod="kube-system/coredns-674b8bbfcf-zqxrl" Aug 19 00:18:02.953741 kubelet[2769]: I0819 00:18:02.952987 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5qwk\" (UniqueName: \"kubernetes.io/projected/89ca3d58-71bc-4f2a-83d4-594ab461a0d5-kube-api-access-w5qwk\") pod \"calico-apiserver-667fd66bbf-28jqt\" (UID: \"89ca3d58-71bc-4f2a-83d4-594ab461a0d5\") " pod="calico-apiserver/calico-apiserver-667fd66bbf-28jqt" Aug 19 00:18:02.953852 kubelet[2769]: I0819 00:18:02.953005 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/00acaa65-0ca5-43db-a1b6-460c4583ad3c-calico-apiserver-certs\") pod \"calico-apiserver-667fd66bbf-57zdj\" (UID: \"00acaa65-0ca5-43db-a1b6-460c4583ad3c\") " pod="calico-apiserver/calico-apiserver-667fd66bbf-57zdj" Aug 19 00:18:02.953852 kubelet[2769]: I0819 00:18:02.953025 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c01e7ff3-6abc-4602-b953-fe846dc88f55-goldmane-ca-bundle\") pod \"goldmane-768f4c5c69-tp2sl\" (UID: \"c01e7ff3-6abc-4602-b953-fe846dc88f55\") " pod="calico-system/goldmane-768f4c5c69-tp2sl" Aug 19 00:18:02.953852 kubelet[2769]: I0819 00:18:02.953041 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1e621243-8897-4395-9a28-32253cceeffb-config-volume\") pod \"coredns-674b8bbfcf-gzg9f\" (UID: \"1e621243-8897-4395-9a28-32253cceeffb\") " pod="kube-system/coredns-674b8bbfcf-gzg9f" Aug 19 00:18:02.953852 kubelet[2769]: I0819 00:18:02.953057 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/27ede6b7-98b7-419b-84aa-572c3e073e65-tigera-ca-bundle\") pod \"calico-kube-controllers-75649cc89b-2q5fl\" (UID: \"27ede6b7-98b7-419b-84aa-572c3e073e65\") " pod="calico-system/calico-kube-controllers-75649cc89b-2q5fl" Aug 19 00:18:02.953852 kubelet[2769]: I0819 00:18:02.953075 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2fq4\" (UniqueName: \"kubernetes.io/projected/00acaa65-0ca5-43db-a1b6-460c4583ad3c-kube-api-access-s2fq4\") pod \"calico-apiserver-667fd66bbf-57zdj\" (UID: \"00acaa65-0ca5-43db-a1b6-460c4583ad3c\") " pod="calico-apiserver/calico-apiserver-667fd66bbf-57zdj" Aug 19 00:18:02.954647 kubelet[2769]: I0819 00:18:02.953090 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpmmf\" (UniqueName: \"kubernetes.io/projected/27ede6b7-98b7-419b-84aa-572c3e073e65-kube-api-access-mpmmf\") pod \"calico-kube-controllers-75649cc89b-2q5fl\" (UID: \"27ede6b7-98b7-419b-84aa-572c3e073e65\") " pod="calico-system/calico-kube-controllers-75649cc89b-2q5fl" Aug 19 00:18:02.954647 kubelet[2769]: I0819 00:18:02.953114 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/89ca3d58-71bc-4f2a-83d4-594ab461a0d5-calico-apiserver-certs\") pod \"calico-apiserver-667fd66bbf-28jqt\" (UID: \"89ca3d58-71bc-4f2a-83d4-594ab461a0d5\") " 
pod="calico-apiserver/calico-apiserver-667fd66bbf-28jqt" Aug 19 00:18:02.960510 systemd[1]: Created slice kubepods-besteffort-pod89ca3d58_71bc_4f2a_83d4_594ab461a0d5.slice - libcontainer container kubepods-besteffort-pod89ca3d58_71bc_4f2a_83d4_594ab461a0d5.slice. Aug 19 00:18:02.977200 systemd[1]: Created slice kubepods-burstable-pode16b483e_f67c_4447_aac6_6ca42f039a20.slice - libcontainer container kubepods-burstable-pode16b483e_f67c_4447_aac6_6ca42f039a20.slice. Aug 19 00:18:02.984873 systemd[1]: Created slice kubepods-besteffort-pod00acaa65_0ca5_43db_a1b6_460c4583ad3c.slice - libcontainer container kubepods-besteffort-pod00acaa65_0ca5_43db_a1b6_460c4583ad3c.slice. Aug 19 00:18:02.997077 systemd[1]: Created slice kubepods-besteffort-podc01e7ff3_6abc_4602_b953_fe846dc88f55.slice - libcontainer container kubepods-besteffort-podc01e7ff3_6abc_4602_b953_fe846dc88f55.slice. Aug 19 00:18:03.245634 containerd[1513]: time="2025-08-19T00:18:03.244834080Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75649cc89b-2q5fl,Uid:27ede6b7-98b7-419b-84aa-572c3e073e65,Namespace:calico-system,Attempt:0,}" Aug 19 00:18:03.256345 containerd[1513]: time="2025-08-19T00:18:03.256205998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f9f9ff74b-qh8wr,Uid:311af456-51ba-4192-9f4f-8560e7648f9f,Namespace:calico-system,Attempt:0,}" Aug 19 00:18:03.270508 containerd[1513]: time="2025-08-19T00:18:03.270398037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-667fd66bbf-28jqt,Uid:89ca3d58-71bc-4f2a-83d4-594ab461a0d5,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:18:03.294926 containerd[1513]: time="2025-08-19T00:18:03.294330743Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-667fd66bbf-57zdj,Uid:00acaa65-0ca5-43db-a1b6-460c4583ad3c,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:18:03.302911 containerd[1513]: time="2025-08-19T00:18:03.302782185Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tp2sl,Uid:c01e7ff3-6abc-4602-b953-fe846dc88f55,Namespace:calico-system,Attempt:0,}" Aug 19 00:18:03.381888 containerd[1513]: time="2025-08-19T00:18:03.381480370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\"" Aug 19 00:18:03.418825 containerd[1513]: time="2025-08-19T00:18:03.418777829Z" level=error msg="Failed to destroy network for sandbox \"8f2ee6e9beb16043977cb7a5e98a0f0088494764f411e60a836db42a93cbdd13\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.432755 containerd[1513]: time="2025-08-19T00:18:03.432339975Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75649cc89b-2q5fl,Uid:27ede6b7-98b7-419b-84aa-572c3e073e65,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f2ee6e9beb16043977cb7a5e98a0f0088494764f411e60a836db42a93cbdd13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.436771 kubelet[2769]: E0819 00:18:03.436680 2769 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f2ee6e9beb16043977cb7a5e98a0f0088494764f411e60a836db42a93cbdd13\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.437911 kubelet[2769]: E0819 00:18:03.437254 2769 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f2ee6e9beb16043977cb7a5e98a0f0088494764f411e60a836db42a93cbdd13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75649cc89b-2q5fl" Aug 19 00:18:03.437911 kubelet[2769]: E0819 00:18:03.437293 2769 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8f2ee6e9beb16043977cb7a5e98a0f0088494764f411e60a836db42a93cbdd13\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-75649cc89b-2q5fl" Aug 19 00:18:03.437911 kubelet[2769]: E0819 00:18:03.437350 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-75649cc89b-2q5fl_calico-system(27ede6b7-98b7-419b-84aa-572c3e073e65)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-75649cc89b-2q5fl_calico-system(27ede6b7-98b7-419b-84aa-572c3e073e65)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8f2ee6e9beb16043977cb7a5e98a0f0088494764f411e60a836db42a93cbdd13\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-75649cc89b-2q5fl" podUID="27ede6b7-98b7-419b-84aa-572c3e073e65" Aug 19 00:18:03.447204 containerd[1513]: time="2025-08-19T00:18:03.447023112Z" level=error msg="Failed to destroy network for sandbox \"079cb337dd165f5f9004a57be6a651c39099844b0aa37a44b35054afa3d8cad3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.451525 containerd[1513]: time="2025-08-19T00:18:03.450976465Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-7f9f9ff74b-qh8wr,Uid:311af456-51ba-4192-9f4f-8560e7648f9f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"079cb337dd165f5f9004a57be6a651c39099844b0aa37a44b35054afa3d8cad3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.451695 kubelet[2769]: E0819 00:18:03.451493 2769 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"079cb337dd165f5f9004a57be6a651c39099844b0aa37a44b35054afa3d8cad3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.451695 kubelet[2769]: E0819 00:18:03.451644 2769 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"079cb337dd165f5f9004a57be6a651c39099844b0aa37a44b35054afa3d8cad3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f9f9ff74b-qh8wr" Aug 19 00:18:03.452374 kubelet[2769]: E0819 00:18:03.451763 2769 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"079cb337dd165f5f9004a57be6a651c39099844b0aa37a44b35054afa3d8cad3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-7f9f9ff74b-qh8wr" Aug 19 00:18:03.452374 kubelet[2769]: E0819 00:18:03.451831 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-7f9f9ff74b-qh8wr_calico-system(311af456-51ba-4192-9f4f-8560e7648f9f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-7f9f9ff74b-qh8wr_calico-system(311af456-51ba-4192-9f4f-8560e7648f9f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"079cb337dd165f5f9004a57be6a651c39099844b0aa37a44b35054afa3d8cad3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-7f9f9ff74b-qh8wr" podUID="311af456-51ba-4192-9f4f-8560e7648f9f" Aug 19 00:18:03.467778 containerd[1513]: time="2025-08-19T00:18:03.467732115Z" level=error msg="Failed to destroy network for sandbox \"fb06d20ae3a90ac85aa490dd496b4afc36d94f84543a01446142b8e5552352a3\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.469540 containerd[1513]: time="2025-08-19T00:18:03.469467721Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-667fd66bbf-28jqt,Uid:89ca3d58-71bc-4f2a-83d4-594ab461a0d5,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb06d20ae3a90ac85aa490dd496b4afc36d94f84543a01446142b8e5552352a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.470285 kubelet[2769]: E0819 00:18:03.470247 2769 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb06d20ae3a90ac85aa490dd496b4afc36d94f84543a01446142b8e5552352a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.470459 kubelet[2769]: E0819 00:18:03.470412 2769 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb06d20ae3a90ac85aa490dd496b4afc36d94f84543a01446142b8e5552352a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-apiserver/calico-apiserver-667fd66bbf-28jqt" Aug 19 00:18:03.470519 kubelet[2769]: E0819 00:18:03.470439 2769 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"fb06d20ae3a90ac85aa490dd496b4afc36d94f84543a01446142b8e5552352a3\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-667fd66bbf-28jqt" Aug 19 00:18:03.470741 kubelet[2769]: E0819 00:18:03.470631 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-667fd66bbf-28jqt_calico-apiserver(89ca3d58-71bc-4f2a-83d4-594ab461a0d5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-667fd66bbf-28jqt_calico-apiserver(89ca3d58-71bc-4f2a-83d4-594ab461a0d5)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"fb06d20ae3a90ac85aa490dd496b4afc36d94f84543a01446142b8e5552352a3\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-667fd66bbf-28jqt" podUID="89ca3d58-71bc-4f2a-83d4-594ab461a0d5" Aug 19 00:18:03.484924 containerd[1513]: time="2025-08-19T00:18:03.484874228Z" level=error msg="Failed to destroy network for sandbox \"4d3fffbbd1806ecfc2e247c3dee4b3cf4733e488dfb07878ea454725668f4d45\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.487365 containerd[1513]: time="2025-08-19T00:18:03.487282446Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tp2sl,Uid:c01e7ff3-6abc-4602-b953-fe846dc88f55,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d3fffbbd1806ecfc2e247c3dee4b3cf4733e488dfb07878ea454725668f4d45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.488682 kubelet[2769]: E0819 00:18:03.488549 2769 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d3fffbbd1806ecfc2e247c3dee4b3cf4733e488dfb07878ea454725668f4d45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.489505 kubelet[2769]: E0819 00:18:03.489203 2769 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4d3fffbbd1806ecfc2e247c3dee4b3cf4733e488dfb07878ea454725668f4d45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-tp2sl" Aug 19 00:18:03.489505 kubelet[2769]: E0819 00:18:03.489246 2769 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"4d3fffbbd1806ecfc2e247c3dee4b3cf4733e488dfb07878ea454725668f4d45\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-768f4c5c69-tp2sl" Aug 19 00:18:03.489505 kubelet[2769]: E0819 00:18:03.489324 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-768f4c5c69-tp2sl_calico-system(c01e7ff3-6abc-4602-b953-fe846dc88f55)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-768f4c5c69-tp2sl_calico-system(c01e7ff3-6abc-4602-b953-fe846dc88f55)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4d3fffbbd1806ecfc2e247c3dee4b3cf4733e488dfb07878ea454725668f4d45\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-768f4c5c69-tp2sl" podUID="c01e7ff3-6abc-4602-b953-fe846dc88f55" Aug 19 00:18:03.504514 containerd[1513]: time="2025-08-19T00:18:03.503692191Z" level=error msg="Failed to destroy network for sandbox \"7d8daa3eecee5bea4ca4a28a62e2ae0b5d3700449e90002710e15195b80e754d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.505623 containerd[1513]: time="2025-08-19T00:18:03.505570751Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-667fd66bbf-57zdj,Uid:00acaa65-0ca5-43db-a1b6-460c4583ad3c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d8daa3eecee5bea4ca4a28a62e2ae0b5d3700449e90002710e15195b80e754d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.508487 kubelet[2769]: E0819 00:18:03.505991 2769 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d8daa3eecee5bea4ca4a28a62e2ae0b5d3700449e90002710e15195b80e754d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:03.508487 kubelet[2769]: E0819 00:18:03.506055 2769 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d8daa3eecee5bea4ca4a28a62e2ae0b5d3700449e90002710e15195b80e754d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-667fd66bbf-57zdj" Aug 19 00:18:03.508487 kubelet[2769]: E0819 00:18:03.506079 2769 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7d8daa3eecee5bea4ca4a28a62e2ae0b5d3700449e90002710e15195b80e754d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-667fd66bbf-57zdj" Aug 19 
00:18:03.508611 kubelet[2769]: E0819 00:18:03.506330 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-667fd66bbf-57zdj_calico-apiserver(00acaa65-0ca5-43db-a1b6-460c4583ad3c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-667fd66bbf-57zdj_calico-apiserver(00acaa65-0ca5-43db-a1b6-460c4583ad3c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7d8daa3eecee5bea4ca4a28a62e2ae0b5d3700449e90002710e15195b80e754d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-667fd66bbf-57zdj" podUID="00acaa65-0ca5-43db-a1b6-460c4583ad3c" Aug 19 00:18:04.055153 kubelet[2769]: E0819 00:18:04.054972 2769 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Aug 19 00:18:04.055356 kubelet[2769]: E0819 00:18:04.055327 2769 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e16b483e-f67c-4447-aac6-6ca42f039a20-config-volume podName:e16b483e-f67c-4447-aac6-6ca42f039a20 nodeName:}" failed. No retries permitted until 2025-08-19 00:18:04.555300809 +0000 UTC m=+36.511682913 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/e16b483e-f67c-4447-aac6-6ca42f039a20-config-volume") pod "coredns-674b8bbfcf-zqxrl" (UID: "e16b483e-f67c-4447-aac6-6ca42f039a20") : failed to sync configmap cache: timed out waiting for the condition Aug 19 00:18:04.057339 kubelet[2769]: E0819 00:18:04.057196 2769 configmap.go:193] Couldn't get configMap kube-system/coredns: failed to sync configmap cache: timed out waiting for the condition Aug 19 00:18:04.057339 kubelet[2769]: E0819 00:18:04.057281 2769 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1e621243-8897-4395-9a28-32253cceeffb-config-volume podName:1e621243-8897-4395-9a28-32253cceeffb nodeName:}" failed. No retries permitted until 2025-08-19 00:18:04.557262729 +0000 UTC m=+36.513644833 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/1e621243-8897-4395-9a28-32253cceeffb-config-volume") pod "coredns-674b8bbfcf-gzg9f" (UID: "1e621243-8897-4395-9a28-32253cceeffb") : failed to sync configmap cache: timed out waiting for the condition Aug 19 00:18:04.224091 systemd[1]: Created slice kubepods-besteffort-podf3224d19_d7a0_4294_80f2_5149b99ec962.slice - libcontainer container kubepods-besteffort-podf3224d19_d7a0_4294_80f2_5149b99ec962.slice. Aug 19 00:18:04.229879 containerd[1513]: time="2025-08-19T00:18:04.229805846Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tc6qm,Uid:f3224d19-d7a0-4294-80f2-5149b99ec962,Namespace:calico-system,Attempt:0,}" Aug 19 00:18:04.304608 containerd[1513]: time="2025-08-19T00:18:04.304515820Z" level=error msg="Failed to destroy network for sandbox \"7cdc8e014b51d5d8429f6efd867e9fb7b6676fe3531369a8817a04d800510b31\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:04.308357 systemd[1]: run-netns-cni\x2d618f1e70\x2d1dc5\x2dafeb\x2d1ca0\x2dd1aa5ea63a12.mount: Deactivated successfully. 
Aug 19 00:18:04.311899 containerd[1513]: time="2025-08-19T00:18:04.311771802Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tc6qm,Uid:f3224d19-d7a0-4294-80f2-5149b99ec962,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7cdc8e014b51d5d8429f6efd867e9fb7b6676fe3531369a8817a04d800510b31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:04.312339 kubelet[2769]: E0819 00:18:04.312240 2769 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7cdc8e014b51d5d8429f6efd867e9fb7b6676fe3531369a8817a04d800510b31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:04.312339 kubelet[2769]: E0819 00:18:04.312327 2769 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7cdc8e014b51d5d8429f6efd867e9fb7b6676fe3531369a8817a04d800510b31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tc6qm" Aug 19 00:18:04.312578 kubelet[2769]: E0819 00:18:04.312350 2769 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7cdc8e014b51d5d8429f6efd867e9fb7b6676fe3531369a8817a04d800510b31\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tc6qm" Aug 19 00:18:04.312578 kubelet[2769]: E0819 00:18:04.312473 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tc6qm_calico-system(f3224d19-d7a0-4294-80f2-5149b99ec962)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tc6qm_calico-system(f3224d19-d7a0-4294-80f2-5149b99ec962)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7cdc8e014b51d5d8429f6efd867e9fb7b6676fe3531369a8817a04d800510b31\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tc6qm" podUID="f3224d19-d7a0-4294-80f2-5149b99ec962" Aug 19 00:18:04.724263 containerd[1513]: time="2025-08-19T00:18:04.723997322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gzg9f,Uid:1e621243-8897-4395-9a28-32253cceeffb,Namespace:kube-system,Attempt:0,}" Aug 19 00:18:04.783798 containerd[1513]: time="2025-08-19T00:18:04.783737109Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zqxrl,Uid:e16b483e-f67c-4447-aac6-6ca42f039a20,Namespace:kube-system,Attempt:0,}" Aug 19 00:18:04.797514 containerd[1513]: time="2025-08-19T00:18:04.797236115Z" level=error msg="Failed to destroy network for sandbox \"da776935945808626ea009897e9dad64d3dd93552af793d52b79bbc913a40545\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:04.802117 containerd[1513]: time="2025-08-19T00:18:04.800317349Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gzg9f,Uid:1e621243-8897-4395-9a28-32253cceeffb,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"da776935945808626ea009897e9dad64d3dd93552af793d52b79bbc913a40545\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:04.803664 kubelet[2769]: E0819 00:18:04.801296 2769 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da776935945808626ea009897e9dad64d3dd93552af793d52b79bbc913a40545\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:04.803664 kubelet[2769]: E0819 00:18:04.801377 2769 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da776935945808626ea009897e9dad64d3dd93552af793d52b79bbc913a40545\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gzg9f" Aug 19 00:18:04.803664 kubelet[2769]: E0819 00:18:04.801410 2769 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"da776935945808626ea009897e9dad64d3dd93552af793d52b79bbc913a40545\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-gzg9f" Aug 19 00:18:04.804621 kubelet[2769]: E0819 00:18:04.801506 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-gzg9f_kube-system(1e621243-8897-4395-9a28-32253cceeffb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-gzg9f_kube-system(1e621243-8897-4395-9a28-32253cceeffb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"da776935945808626ea009897e9dad64d3dd93552af793d52b79bbc913a40545\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-gzg9f" podUID="1e621243-8897-4395-9a28-32253cceeffb" Aug 19 00:18:04.859456 containerd[1513]: time="2025-08-19T00:18:04.859368685Z" level=error msg="Failed to destroy network for sandbox \"18ba3d7b92be4b93e718a8d40732ce10df3e070826ac69baed356ed5d44e87d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:04.861713 containerd[1513]: time="2025-08-19T00:18:04.861632912Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zqxrl,Uid:e16b483e-f67c-4447-aac6-6ca42f039a20,Namespace:kube-system,Attempt:0,} failed, error" 
error="rpc error: code = Unknown desc = failed to setup network for sandbox \"18ba3d7b92be4b93e718a8d40732ce10df3e070826ac69baed356ed5d44e87d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:04.862041 kubelet[2769]: E0819 00:18:04.861939 2769 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18ba3d7b92be4b93e718a8d40732ce10df3e070826ac69baed356ed5d44e87d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Aug 19 00:18:04.862041 kubelet[2769]: E0819 00:18:04.862009 2769 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18ba3d7b92be4b93e718a8d40732ce10df3e070826ac69baed356ed5d44e87d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zqxrl" Aug 19 00:18:04.862434 kubelet[2769]: E0819 00:18:04.862215 2769 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"18ba3d7b92be4b93e718a8d40732ce10df3e070826ac69baed356ed5d44e87d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-zqxrl" Aug 19 00:18:04.862434 kubelet[2769]: E0819 00:18:04.862321 2769 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-zqxrl_kube-system(e16b483e-f67c-4447-aac6-6ca42f039a20)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-zqxrl_kube-system(e16b483e-f67c-4447-aac6-6ca42f039a20)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"18ba3d7b92be4b93e718a8d40732ce10df3e070826ac69baed356ed5d44e87d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-zqxrl" podUID="e16b483e-f67c-4447-aac6-6ca42f039a20" Aug 19 00:18:05.104125 systemd[1]: run-netns-cni\x2d5713a832\x2d6acf\x2dd756\x2dec63\x2d40321043ae56.mount: Deactivated successfully. Aug 19 00:18:05.104264 systemd[1]: run-netns-cni\x2d5ca136ec\x2dd7ce\x2d7310\x2d2e1f\x2d54effd897d0a.mount: Deactivated successfully. Aug 19 00:18:10.844124 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount209988031.mount: Deactivated successfully. 
Aug 19 00:18:10.879556 containerd[1513]: time="2025-08-19T00:18:10.879474842Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:10.880874 containerd[1513]: time="2025-08-19T00:18:10.880783238Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.2: active requests=0, bytes read=152544909" Aug 19 00:18:10.883745 containerd[1513]: time="2025-08-19T00:18:10.883635741Z" level=info msg="ImageCreate event name:\"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:10.886731 containerd[1513]: time="2025-08-19T00:18:10.886601481Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:10.887567 containerd[1513]: time="2025-08-19T00:18:10.887515970Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.2\" with image id \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e94d49349cc361ef2216d27dda4a097278984d778279f66e79b0616c827c6760\", size \"152544771\" in 7.505707853s" Aug 19 00:18:10.887567 containerd[1513]: time="2025-08-19T00:18:10.887565488Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.2\" returns image reference \"sha256:1c6ddca599ddd18c061e797a7830b0aea985f8b023c5e43d815a9ed1088893a9\"" Aug 19 00:18:10.911843 containerd[1513]: time="2025-08-19T00:18:10.911780307Z" level=info msg="CreateContainer within sandbox \"edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Aug 19 00:18:10.927170 containerd[1513]: time="2025-08-19T00:18:10.923365955Z" level=info msg="Container 62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:10.938148 containerd[1513]: time="2025-08-19T00:18:10.938060816Z" level=info msg="CreateContainer within sandbox \"edec3b1bdb4c5da9fcdb0b3cacb32ddd91197c711a149e9612590807ef60e6cf\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\"" Aug 19 00:18:10.938989 containerd[1513]: time="2025-08-19T00:18:10.938902588Z" level=info msg="StartContainer for \"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\"" Aug 19 00:18:10.941102 containerd[1513]: time="2025-08-19T00:18:10.941051835Z" level=info msg="connecting to shim 62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4" address="unix:///run/containerd/s/622a4c40d5f7d2eb1e752e49b0f00af816af9d7f5c43dd2528bdbeabbf95a588" protocol=ttrpc version=3 Aug 19 00:18:10.994397 systemd[1]: Started cri-containerd-62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4.scope - libcontainer container 62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4. Aug 19 00:18:11.059419 containerd[1513]: time="2025-08-19T00:18:11.059352767Z" level=info msg="StartContainer for \"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\" returns successfully" Aug 19 00:18:11.230765 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Aug 19 00:18:11.231073 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . 
All Rights Reserved. Aug 19 00:18:11.420677 kubelet[2769]: I0819 00:18:11.420269 2769 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311af456-51ba-4192-9f4f-8560e7648f9f-whisker-ca-bundle\") pod \"311af456-51ba-4192-9f4f-8560e7648f9f\" (UID: \"311af456-51ba-4192-9f4f-8560e7648f9f\") " Aug 19 00:18:11.420677 kubelet[2769]: I0819 00:18:11.420332 2769 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/311af456-51ba-4192-9f4f-8560e7648f9f-whisker-backend-key-pair\") pod \"311af456-51ba-4192-9f4f-8560e7648f9f\" (UID: \"311af456-51ba-4192-9f4f-8560e7648f9f\") " Aug 19 00:18:11.420677 kubelet[2769]: I0819 00:18:11.420366 2769 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb6h7\" (UniqueName: \"kubernetes.io/projected/311af456-51ba-4192-9f4f-8560e7648f9f-kube-api-access-vb6h7\") pod \"311af456-51ba-4192-9f4f-8560e7648f9f\" (UID: \"311af456-51ba-4192-9f4f-8560e7648f9f\") " Aug 19 00:18:11.426893 kubelet[2769]: I0819 00:18:11.425775 2769 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/311af456-51ba-4192-9f4f-8560e7648f9f-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "311af456-51ba-4192-9f4f-8560e7648f9f" (UID: "311af456-51ba-4192-9f4f-8560e7648f9f"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Aug 19 00:18:11.435411 kubelet[2769]: I0819 00:18:11.435343 2769 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/311af456-51ba-4192-9f4f-8560e7648f9f-kube-api-access-vb6h7" (OuterVolumeSpecName: "kube-api-access-vb6h7") pod "311af456-51ba-4192-9f4f-8560e7648f9f" (UID: "311af456-51ba-4192-9f4f-8560e7648f9f"). InnerVolumeSpecName "kube-api-access-vb6h7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Aug 19 00:18:11.437366 kubelet[2769]: I0819 00:18:11.437320 2769 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/311af456-51ba-4192-9f4f-8560e7648f9f-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "311af456-51ba-4192-9f4f-8560e7648f9f" (UID: "311af456-51ba-4192-9f4f-8560e7648f9f"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Aug 19 00:18:11.456198 systemd[1]: Removed slice kubepods-besteffort-pod311af456_51ba_4192_9f4f_8560e7648f9f.slice - libcontainer container kubepods-besteffort-pod311af456_51ba_4192_9f4f_8560e7648f9f.slice. 
Aug 19 00:18:11.497450 kubelet[2769]: I0819 00:18:11.496155 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-wwv6v" podStartSLOduration=2.027484116 podStartE2EDuration="20.496114788s" podCreationTimestamp="2025-08-19 00:17:51 +0000 UTC" firstStartedPulling="2025-08-19 00:17:52.420572161 +0000 UTC m=+24.376954265" lastFinishedPulling="2025-08-19 00:18:10.889202793 +0000 UTC m=+42.845584937" observedRunningTime="2025-08-19 00:18:11.477656354 +0000 UTC m=+43.434038458" watchObservedRunningTime="2025-08-19 00:18:11.496114788 +0000 UTC m=+43.452496852" Aug 19 00:18:11.522653 kubelet[2769]: I0819 00:18:11.521449 2769 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/311af456-51ba-4192-9f4f-8560e7648f9f-whisker-backend-key-pair\") on node \"ci-4426-0-0-8-661ee896d9\" DevicePath \"\"" Aug 19 00:18:11.523279 kubelet[2769]: I0819 00:18:11.523251 2769 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vb6h7\" (UniqueName: \"kubernetes.io/projected/311af456-51ba-4192-9f4f-8560e7648f9f-kube-api-access-vb6h7\") on node \"ci-4426-0-0-8-661ee896d9\" DevicePath \"\"" Aug 19 00:18:11.523690 kubelet[2769]: I0819 00:18:11.523636 2769 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/311af456-51ba-4192-9f4f-8560e7648f9f-whisker-ca-bundle\") on node \"ci-4426-0-0-8-661ee896d9\" DevicePath \"\"" Aug 19 00:18:11.604209 systemd[1]: Created slice kubepods-besteffort-pod8d1bdf46_564a_45bf_ba8f_978b7a185835.slice - libcontainer container kubepods-besteffort-pod8d1bdf46_564a_45bf_ba8f_978b7a185835.slice. Aug 19 00:18:11.625569 kubelet[2769]: I0819 00:18:11.624699 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc8rz\" (UniqueName: \"kubernetes.io/projected/8d1bdf46-564a-45bf-ba8f-978b7a185835-kube-api-access-xc8rz\") pod \"whisker-886d9d6c4-bvq6m\" (UID: \"8d1bdf46-564a-45bf-ba8f-978b7a185835\") " pod="calico-system/whisker-886d9d6c4-bvq6m" Aug 19 00:18:11.625934 kubelet[2769]: I0819 00:18:11.625910 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8d1bdf46-564a-45bf-ba8f-978b7a185835-whisker-ca-bundle\") pod \"whisker-886d9d6c4-bvq6m\" (UID: \"8d1bdf46-564a-45bf-ba8f-978b7a185835\") " pod="calico-system/whisker-886d9d6c4-bvq6m" Aug 19 00:18:11.626066 kubelet[2769]: I0819 00:18:11.626050 2769 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/8d1bdf46-564a-45bf-ba8f-978b7a185835-whisker-backend-key-pair\") pod \"whisker-886d9d6c4-bvq6m\" (UID: \"8d1bdf46-564a-45bf-ba8f-978b7a185835\") " pod="calico-system/whisker-886d9d6c4-bvq6m" Aug 19 00:18:11.816195 containerd[1513]: time="2025-08-19T00:18:11.816143841Z" level=info msg="TaskExit event in podsandbox handler container_id:\"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\" id:\"a70ebd0d309c9fef04b5280f50f63076be4720eed54b70aa1d414b514b612b50\" pid:3853 exit_status:1 exited_at:{seconds:1755562691 nanos:815332907}" Aug 19 00:18:11.853721 systemd[1]: var-lib-kubelet-pods-311af456\x2d51ba\x2d4192\x2d9f4f\x2d8560e7648f9f-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dvb6h7.mount: Deactivated successfully. 
Aug 19 00:18:11.853940 systemd[1]: var-lib-kubelet-pods-311af456\x2d51ba\x2d4192\x2d9f4f\x2d8560e7648f9f-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Aug 19 00:18:11.912891 containerd[1513]: time="2025-08-19T00:18:11.912799187Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-886d9d6c4-bvq6m,Uid:8d1bdf46-564a-45bf-ba8f-978b7a185835,Namespace:calico-system,Attempt:0,}" Aug 19 00:18:12.052515 kubelet[2769]: I0819 00:18:12.052386 2769 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:18:12.146761 systemd-networkd[1412]: cali447abad99e3: Link UP Aug 19 00:18:12.149638 systemd-networkd[1412]: cali447abad99e3: Gained carrier Aug 19 00:18:12.171483 containerd[1513]: 2025-08-19 00:18:11.948 [INFO][3880] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Aug 19 00:18:12.171483 containerd[1513]: 2025-08-19 00:18:11.997 [INFO][3880] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0 whisker-886d9d6c4- calico-system 8d1bdf46-564a-45bf-ba8f-978b7a185835 882 0 2025-08-19 00:18:11 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:886d9d6c4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4426-0-0-8-661ee896d9 whisker-886d9d6c4-bvq6m eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali447abad99e3 [] [] }} ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Namespace="calico-system" Pod="whisker-886d9d6c4-bvq6m" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-" Aug 19 00:18:12.171483 containerd[1513]: 2025-08-19 00:18:11.997 [INFO][3880] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Namespace="calico-system" Pod="whisker-886d9d6c4-bvq6m" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" Aug 19 00:18:12.171483 containerd[1513]: 2025-08-19 00:18:12.058 [INFO][3891] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" HandleID="k8s-pod-network.108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Workload="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" Aug 19 00:18:12.171999 containerd[1513]: 2025-08-19 00:18:12.059 [INFO][3891] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" HandleID="k8s-pod-network.108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Workload="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400032a180), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-0-0-8-661ee896d9", "pod":"whisker-886d9d6c4-bvq6m", "timestamp":"2025-08-19 00:18:12.058640219 +0000 UTC"}, Hostname:"ci-4426-0-0-8-661ee896d9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:18:12.171999 containerd[1513]: 2025-08-19 00:18:12.059 [INFO][3891] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM 
lock. Aug 19 00:18:12.171999 containerd[1513]: 2025-08-19 00:18:12.060 [INFO][3891] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:18:12.171999 containerd[1513]: 2025-08-19 00:18:12.060 [INFO][3891] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-8-661ee896d9' Aug 19 00:18:12.171999 containerd[1513]: 2025-08-19 00:18:12.074 [INFO][3891] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:12.171999 containerd[1513]: 2025-08-19 00:18:12.085 [INFO][3891] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:12.171999 containerd[1513]: 2025-08-19 00:18:12.099 [INFO][3891] ipam/ipam.go 511: Trying affinity for 192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:12.171999 containerd[1513]: 2025-08-19 00:18:12.103 [INFO][3891] ipam/ipam.go 158: Attempting to load block cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:12.171999 containerd[1513]: 2025-08-19 00:18:12.107 [INFO][3891] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:12.172385 containerd[1513]: 2025-08-19 00:18:12.108 [INFO][3891] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.83.128/26 handle="k8s-pod-network.108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:12.172385 containerd[1513]: 2025-08-19 00:18:12.110 [INFO][3891] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890 Aug 19 00:18:12.172385 containerd[1513]: 2025-08-19 00:18:12.119 [INFO][3891] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.83.128/26 handle="k8s-pod-network.108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:12.172385 containerd[1513]: 2025-08-19 00:18:12.128 [INFO][3891] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.83.129/26] block=192.168.83.128/26 handle="k8s-pod-network.108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:12.172385 containerd[1513]: 2025-08-19 00:18:12.128 [INFO][3891] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.83.129/26] handle="k8s-pod-network.108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:12.172385 containerd[1513]: 2025-08-19 00:18:12.128 [INFO][3891] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:18:12.172385 containerd[1513]: 2025-08-19 00:18:12.128 [INFO][3891] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.129/26] IPv6=[] ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" HandleID="k8s-pod-network.108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Workload="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" Aug 19 00:18:12.173088 containerd[1513]: 2025-08-19 00:18:12.134 [INFO][3880] cni-plugin/k8s.go 418: Populated endpoint ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Namespace="calico-system" Pod="whisker-886d9d6c4-bvq6m" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0", GenerateName:"whisker-886d9d6c4-", Namespace:"calico-system", SelfLink:"", UID:"8d1bdf46-564a-45bf-ba8f-978b7a185835", ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 18, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"886d9d6c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"", Pod:"whisker-886d9d6c4-bvq6m", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.83.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali447abad99e3", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:12.173088 containerd[1513]: 2025-08-19 00:18:12.134 [INFO][3880] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.83.129/32] ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Namespace="calico-system" Pod="whisker-886d9d6c4-bvq6m" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" Aug 19 00:18:12.173229 containerd[1513]: 2025-08-19 00:18:12.134 [INFO][3880] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali447abad99e3 ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Namespace="calico-system" Pod="whisker-886d9d6c4-bvq6m" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" Aug 19 00:18:12.173229 containerd[1513]: 2025-08-19 00:18:12.150 [INFO][3880] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Namespace="calico-system" Pod="whisker-886d9d6c4-bvq6m" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" Aug 19 00:18:12.173321 containerd[1513]: 2025-08-19 00:18:12.151 [INFO][3880] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Namespace="calico-system" 
Pod="whisker-886d9d6c4-bvq6m" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0", GenerateName:"whisker-886d9d6c4-", Namespace:"calico-system", SelfLink:"", UID:"8d1bdf46-564a-45bf-ba8f-978b7a185835", ResourceVersion:"882", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 18, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"886d9d6c4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890", Pod:"whisker-886d9d6c4-bvq6m", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.83.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali447abad99e3", MAC:"0e:b9:0f:da:90:f0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:12.173403 containerd[1513]: 2025-08-19 00:18:12.168 [INFO][3880] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" Namespace="calico-system" Pod="whisker-886d9d6c4-bvq6m" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-whisker--886d9d6c4--bvq6m-eth0" Aug 19 00:18:12.223243 kubelet[2769]: I0819 00:18:12.223186 2769 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="311af456-51ba-4192-9f4f-8560e7648f9f" path="/var/lib/kubelet/pods/311af456-51ba-4192-9f4f-8560e7648f9f/volumes" Aug 19 00:18:12.228160 containerd[1513]: time="2025-08-19T00:18:12.227980154Z" level=info msg="connecting to shim 108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890" address="unix:///run/containerd/s/c30581a2fa832ddb2b6116d75f2ec646a5e564df6c092d3b80369f833b8df33a" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:18:12.265399 systemd[1]: Started cri-containerd-108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890.scope - libcontainer container 108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890. 
Aug 19 00:18:12.319510 containerd[1513]: time="2025-08-19T00:18:12.319280490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-886d9d6c4-bvq6m,Uid:8d1bdf46-564a-45bf-ba8f-978b7a185835,Namespace:calico-system,Attempt:0,} returns sandbox id \"108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890\"" Aug 19 00:18:12.323566 containerd[1513]: time="2025-08-19T00:18:12.323505956Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\"" Aug 19 00:18:12.552437 containerd[1513]: time="2025-08-19T00:18:12.552360239Z" level=info msg="TaskExit event in podsandbox handler container_id:\"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\" id:\"4ee5505035eb631442a0b3efdb97dc0e1b5e90a0efa246cb118ae4248feb163e\" pid:3970 exit_status:1 exited_at:{seconds:1755562692 nanos:551746418}" Aug 19 00:18:13.269460 systemd-networkd[1412]: cali447abad99e3: Gained IPv6LL Aug 19 00:18:13.584576 containerd[1513]: time="2025-08-19T00:18:13.584493160Z" level=info msg="TaskExit event in podsandbox handler container_id:\"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\" id:\"5496f92f16f426c9d2d0bf61398b71546edc433d84c3cc660944a953ab821acc\" pid:4120 exit_status:1 exited_at:{seconds:1755562693 nanos:581512292}" Aug 19 00:18:13.647520 systemd-networkd[1412]: vxlan.calico: Link UP Aug 19 00:18:13.647527 systemd-networkd[1412]: vxlan.calico: Gained carrier Aug 19 00:18:14.604947 containerd[1513]: time="2025-08-19T00:18:14.604244535Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:14.607491 containerd[1513]: time="2025-08-19T00:18:14.607447480Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.2: active requests=0, bytes read=4605614" Aug 19 00:18:14.608590 containerd[1513]: time="2025-08-19T00:18:14.608545007Z" level=info msg="ImageCreate event name:\"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:14.613791 containerd[1513]: time="2025-08-19T00:18:14.613568937Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:14.618066 containerd[1513]: time="2025-08-19T00:18:14.617900528Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker:v3.30.2\" with image id \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\", repo tag \"ghcr.io/flatcar/calico/whisker:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker@sha256:31346d4524252a3b0d2a1d289c4985b8402b498b5ce82a12e682096ab7446678\", size \"5974847\" in 2.294347973s" Aug 19 00:18:14.618066 containerd[1513]: time="2025-08-19T00:18:14.617956926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.2\" returns image reference \"sha256:309942601a9ca6c4e92bcd09162824fef1c137a5c5d92fbbb45be0f29bfd1817\"" Aug 19 00:18:14.625520 containerd[1513]: time="2025-08-19T00:18:14.625454622Z" level=info msg="CreateContainer within sandbox \"108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890\" for container &ContainerMetadata{Name:whisker,Attempt:0,}" Aug 19 00:18:14.636168 containerd[1513]: time="2025-08-19T00:18:14.635709156Z" level=info msg="Container 0d8baa18ab7175373551fb106878f9d2b946f29f6278dced95cfcf576a3b26da: CDI devices from CRI Config.CDIDevices: []" Aug 19 
00:18:14.646622 containerd[1513]: time="2025-08-19T00:18:14.646565233Z" level=info msg="CreateContainer within sandbox \"108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890\" for &ContainerMetadata{Name:whisker,Attempt:0,} returns container id \"0d8baa18ab7175373551fb106878f9d2b946f29f6278dced95cfcf576a3b26da\"" Aug 19 00:18:14.649269 containerd[1513]: time="2025-08-19T00:18:14.648780326Z" level=info msg="StartContainer for \"0d8baa18ab7175373551fb106878f9d2b946f29f6278dced95cfcf576a3b26da\"" Aug 19 00:18:14.651216 containerd[1513]: time="2025-08-19T00:18:14.651053499Z" level=info msg="connecting to shim 0d8baa18ab7175373551fb106878f9d2b946f29f6278dced95cfcf576a3b26da" address="unix:///run/containerd/s/c30581a2fa832ddb2b6116d75f2ec646a5e564df6c092d3b80369f833b8df33a" protocol=ttrpc version=3 Aug 19 00:18:14.689432 systemd[1]: Started cri-containerd-0d8baa18ab7175373551fb106878f9d2b946f29f6278dced95cfcf576a3b26da.scope - libcontainer container 0d8baa18ab7175373551fb106878f9d2b946f29f6278dced95cfcf576a3b26da. Aug 19 00:18:14.755738 containerd[1513]: time="2025-08-19T00:18:14.755673378Z" level=info msg="StartContainer for \"0d8baa18ab7175373551fb106878f9d2b946f29f6278dced95cfcf576a3b26da\" returns successfully" Aug 19 00:18:14.757916 containerd[1513]: time="2025-08-19T00:18:14.757857632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\"" Aug 19 00:18:14.868450 systemd-networkd[1412]: vxlan.calico: Gained IPv6LL Aug 19 00:18:15.215251 containerd[1513]: time="2025-08-19T00:18:15.214782122Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-667fd66bbf-28jqt,Uid:89ca3d58-71bc-4f2a-83d4-594ab461a0d5,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:18:15.215464 containerd[1513]: time="2025-08-19T00:18:15.215210309Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tp2sl,Uid:c01e7ff3-6abc-4602-b953-fe846dc88f55,Namespace:calico-system,Attempt:0,}" Aug 19 00:18:15.414366 systemd-networkd[1412]: cali8cb90618027: Link UP Aug 19 00:18:15.418288 systemd-networkd[1412]: cali8cb90618027: Gained carrier Aug 19 00:18:15.443229 containerd[1513]: 2025-08-19 00:18:15.286 [INFO][4242] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0 calico-apiserver-667fd66bbf- calico-apiserver 89ca3d58-71bc-4f2a-83d4-594ab461a0d5 822 0 2025-08-19 00:17:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:667fd66bbf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426-0-0-8-661ee896d9 calico-apiserver-667fd66bbf-28jqt eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8cb90618027 [] [] }} ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-28jqt" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-" Aug 19 00:18:15.443229 containerd[1513]: 2025-08-19 00:18:15.286 [INFO][4242] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-28jqt" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" 
Aug 19 00:18:15.443229 containerd[1513]: 2025-08-19 00:18:15.335 [INFO][4268] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" HandleID="k8s-pod-network.c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Workload="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" Aug 19 00:18:15.443747 containerd[1513]: 2025-08-19 00:18:15.335 [INFO][4268] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" HandleID="k8s-pod-network.c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Workload="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426-0-0-8-661ee896d9", "pod":"calico-apiserver-667fd66bbf-28jqt", "timestamp":"2025-08-19 00:18:15.335465395 +0000 UTC"}, Hostname:"ci-4426-0-0-8-661ee896d9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:18:15.443747 containerd[1513]: 2025-08-19 00:18:15.335 [INFO][4268] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:18:15.443747 containerd[1513]: 2025-08-19 00:18:15.335 [INFO][4268] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:18:15.443747 containerd[1513]: 2025-08-19 00:18:15.335 [INFO][4268] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-8-661ee896d9' Aug 19 00:18:15.443747 containerd[1513]: 2025-08-19 00:18:15.351 [INFO][4268] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.443747 containerd[1513]: 2025-08-19 00:18:15.359 [INFO][4268] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.443747 containerd[1513]: 2025-08-19 00:18:15.368 [INFO][4268] ipam/ipam.go 511: Trying affinity for 192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.443747 containerd[1513]: 2025-08-19 00:18:15.373 [INFO][4268] ipam/ipam.go 158: Attempting to load block cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.443747 containerd[1513]: 2025-08-19 00:18:15.377 [INFO][4268] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.444301 containerd[1513]: 2025-08-19 00:18:15.377 [INFO][4268] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.83.128/26 handle="k8s-pod-network.c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.444301 containerd[1513]: 2025-08-19 00:18:15.379 [INFO][4268] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84 Aug 19 00:18:15.444301 containerd[1513]: 2025-08-19 00:18:15.387 [INFO][4268] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.83.128/26 handle="k8s-pod-network.c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.444301 containerd[1513]: 2025-08-19 00:18:15.397 [INFO][4268] ipam/ipam.go 
1256: Successfully claimed IPs: [192.168.83.130/26] block=192.168.83.128/26 handle="k8s-pod-network.c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.444301 containerd[1513]: 2025-08-19 00:18:15.397 [INFO][4268] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.83.130/26] handle="k8s-pod-network.c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.444301 containerd[1513]: 2025-08-19 00:18:15.397 [INFO][4268] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:18:15.444301 containerd[1513]: 2025-08-19 00:18:15.397 [INFO][4268] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.130/26] IPv6=[] ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" HandleID="k8s-pod-network.c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Workload="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" Aug 19 00:18:15.444689 containerd[1513]: 2025-08-19 00:18:15.403 [INFO][4242] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-28jqt" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0", GenerateName:"calico-apiserver-667fd66bbf-", Namespace:"calico-apiserver", SelfLink:"", UID:"89ca3d58-71bc-4f2a-83d4-594ab461a0d5", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"667fd66bbf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"", Pod:"calico-apiserver-667fd66bbf-28jqt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.83.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8cb90618027", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:15.445000 containerd[1513]: 2025-08-19 00:18:15.403 [INFO][4242] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.83.130/32] ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-28jqt" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" Aug 19 00:18:15.445000 containerd[1513]: 2025-08-19 00:18:15.403 [INFO][4242] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8cb90618027 ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" 
Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-28jqt" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" Aug 19 00:18:15.445000 containerd[1513]: 2025-08-19 00:18:15.419 [INFO][4242] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-28jqt" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" Aug 19 00:18:15.445075 containerd[1513]: 2025-08-19 00:18:15.421 [INFO][4242] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-28jqt" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0", GenerateName:"calico-apiserver-667fd66bbf-", Namespace:"calico-apiserver", SelfLink:"", UID:"89ca3d58-71bc-4f2a-83d4-594ab461a0d5", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"667fd66bbf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84", Pod:"calico-apiserver-667fd66bbf-28jqt", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.83.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8cb90618027", MAC:"52:aa:6e:cb:b4:79", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:15.445150 containerd[1513]: 2025-08-19 00:18:15.436 [INFO][4242] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-28jqt" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--28jqt-eth0" Aug 19 00:18:15.490784 containerd[1513]: time="2025-08-19T00:18:15.490331360Z" level=info msg="connecting to shim c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84" address="unix:///run/containerd/s/26bd66b3763270a9a7542a175c3d1e0c928185cbbe9076de008f71446c78e204" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:18:15.529593 systemd-networkd[1412]: cali2059d0fc54c: Link UP Aug 19 00:18:15.530842 systemd-networkd[1412]: cali2059d0fc54c: Gained carrier Aug 19 00:18:15.535667 systemd[1]: Started cri-containerd-c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84.scope - 
libcontainer container c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84. Aug 19 00:18:15.560465 containerd[1513]: 2025-08-19 00:18:15.293 [INFO][4243] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0 goldmane-768f4c5c69- calico-system c01e7ff3-6abc-4602-b953-fe846dc88f55 823 0 2025-08-19 00:17:51 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:768f4c5c69 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4426-0-0-8-661ee896d9 goldmane-768f4c5c69-tp2sl eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali2059d0fc54c [] [] }} ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Namespace="calico-system" Pod="goldmane-768f4c5c69-tp2sl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-" Aug 19 00:18:15.560465 containerd[1513]: 2025-08-19 00:18:15.298 [INFO][4243] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Namespace="calico-system" Pod="goldmane-768f4c5c69-tp2sl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" Aug 19 00:18:15.560465 containerd[1513]: 2025-08-19 00:18:15.353 [INFO][4274] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" HandleID="k8s-pod-network.7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Workload="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" Aug 19 00:18:15.561707 containerd[1513]: 2025-08-19 00:18:15.353 [INFO][4274] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" HandleID="k8s-pod-network.7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Workload="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-0-0-8-661ee896d9", "pod":"goldmane-768f4c5c69-tp2sl", "timestamp":"2025-08-19 00:18:15.353229201 +0000 UTC"}, Hostname:"ci-4426-0-0-8-661ee896d9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:18:15.561707 containerd[1513]: 2025-08-19 00:18:15.353 [INFO][4274] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:18:15.561707 containerd[1513]: 2025-08-19 00:18:15.397 [INFO][4274] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
Aug 19 00:18:15.561707 containerd[1513]: 2025-08-19 00:18:15.398 [INFO][4274] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-8-661ee896d9' Aug 19 00:18:15.561707 containerd[1513]: 2025-08-19 00:18:15.451 [INFO][4274] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.561707 containerd[1513]: 2025-08-19 00:18:15.465 [INFO][4274] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.561707 containerd[1513]: 2025-08-19 00:18:15.481 [INFO][4274] ipam/ipam.go 511: Trying affinity for 192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.561707 containerd[1513]: 2025-08-19 00:18:15.486 [INFO][4274] ipam/ipam.go 158: Attempting to load block cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.561707 containerd[1513]: 2025-08-19 00:18:15.493 [INFO][4274] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.563088 containerd[1513]: 2025-08-19 00:18:15.493 [INFO][4274] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.83.128/26 handle="k8s-pod-network.7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.563088 containerd[1513]: 2025-08-19 00:18:15.498 [INFO][4274] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02 Aug 19 00:18:15.563088 containerd[1513]: 2025-08-19 00:18:15.508 [INFO][4274] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.83.128/26 handle="k8s-pod-network.7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.563088 containerd[1513]: 2025-08-19 00:18:15.520 [INFO][4274] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.83.131/26] block=192.168.83.128/26 handle="k8s-pod-network.7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.563088 containerd[1513]: 2025-08-19 00:18:15.520 [INFO][4274] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.83.131/26] handle="k8s-pod-network.7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:15.563088 containerd[1513]: 2025-08-19 00:18:15.520 [INFO][4274] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:18:15.563088 containerd[1513]: 2025-08-19 00:18:15.520 [INFO][4274] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.131/26] IPv6=[] ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" HandleID="k8s-pod-network.7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Workload="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" Aug 19 00:18:15.563729 containerd[1513]: 2025-08-19 00:18:15.526 [INFO][4243] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Namespace="calico-system" Pod="goldmane-768f4c5c69-tp2sl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"c01e7ff3-6abc-4602-b953-fe846dc88f55", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"", Pod:"goldmane-768f4c5c69-tp2sl", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.83.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2059d0fc54c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:15.563838 containerd[1513]: 2025-08-19 00:18:15.526 [INFO][4243] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.83.131/32] ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Namespace="calico-system" Pod="goldmane-768f4c5c69-tp2sl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" Aug 19 00:18:15.563838 containerd[1513]: 2025-08-19 00:18:15.526 [INFO][4243] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2059d0fc54c ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Namespace="calico-system" Pod="goldmane-768f4c5c69-tp2sl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" Aug 19 00:18:15.563838 containerd[1513]: 2025-08-19 00:18:15.534 [INFO][4243] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Namespace="calico-system" Pod="goldmane-768f4c5c69-tp2sl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" Aug 19 00:18:15.563987 containerd[1513]: 2025-08-19 00:18:15.535 [INFO][4243] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" 
Namespace="calico-system" Pod="goldmane-768f4c5c69-tp2sl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0", GenerateName:"goldmane-768f4c5c69-", Namespace:"calico-system", SelfLink:"", UID:"c01e7ff3-6abc-4602-b953-fe846dc88f55", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 51, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"768f4c5c69", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02", Pod:"goldmane-768f4c5c69-tp2sl", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.83.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2059d0fc54c", MAC:"8e:bc:6f:d6:9b:65", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:15.564063 containerd[1513]: 2025-08-19 00:18:15.555 [INFO][4243] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" Namespace="calico-system" Pod="goldmane-768f4c5c69-tp2sl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-goldmane--768f4c5c69--tp2sl-eth0" Aug 19 00:18:15.615758 containerd[1513]: time="2025-08-19T00:18:15.615463744Z" level=info msg="connecting to shim 7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02" address="unix:///run/containerd/s/239b3d65b58d8aa91869fb3b73a9bfbd96cac8e07b23f41efaa643d2f9e077bc" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:18:15.671496 systemd[1]: Started cri-containerd-7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02.scope - libcontainer container 7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02. 
Aug 19 00:18:15.686076 containerd[1513]: time="2025-08-19T00:18:15.685789512Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-667fd66bbf-28jqt,Uid:89ca3d58-71bc-4f2a-83d4-594ab461a0d5,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84\"" Aug 19 00:18:15.728987 containerd[1513]: time="2025-08-19T00:18:15.728945305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-768f4c5c69-tp2sl,Uid:c01e7ff3-6abc-4602-b953-fe846dc88f55,Namespace:calico-system,Attempt:0,} returns sandbox id \"7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02\"" Aug 19 00:18:16.214390 containerd[1513]: time="2025-08-19T00:18:16.214337074Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-667fd66bbf-57zdj,Uid:00acaa65-0ca5-43db-a1b6-460c4583ad3c,Namespace:calico-apiserver,Attempt:0,}" Aug 19 00:18:16.386606 systemd-networkd[1412]: cali0f3b38a6071: Link UP Aug 19 00:18:16.388169 systemd-networkd[1412]: cali0f3b38a6071: Gained carrier Aug 19 00:18:16.413960 containerd[1513]: 2025-08-19 00:18:16.271 [INFO][4388] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0 calico-apiserver-667fd66bbf- calico-apiserver 00acaa65-0ca5-43db-a1b6-460c4583ad3c 821 0 2025-08-19 00:17:46 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:667fd66bbf projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4426-0-0-8-661ee896d9 calico-apiserver-667fd66bbf-57zdj eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0f3b38a6071 [] [] }} ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-57zdj" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-" Aug 19 00:18:16.413960 containerd[1513]: 2025-08-19 00:18:16.271 [INFO][4388] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-57zdj" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" Aug 19 00:18:16.413960 containerd[1513]: 2025-08-19 00:18:16.314 [INFO][4400] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" HandleID="k8s-pod-network.bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Workload="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" Aug 19 00:18:16.414566 containerd[1513]: 2025-08-19 00:18:16.314 [INFO][4400] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" HandleID="k8s-pod-network.bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Workload="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400032b420), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4426-0-0-8-661ee896d9", "pod":"calico-apiserver-667fd66bbf-57zdj", "timestamp":"2025-08-19 00:18:16.314577668 
+0000 UTC"}, Hostname:"ci-4426-0-0-8-661ee896d9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:18:16.414566 containerd[1513]: 2025-08-19 00:18:16.315 [INFO][4400] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:18:16.414566 containerd[1513]: 2025-08-19 00:18:16.315 [INFO][4400] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:18:16.414566 containerd[1513]: 2025-08-19 00:18:16.315 [INFO][4400] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-8-661ee896d9' Aug 19 00:18:16.414566 containerd[1513]: 2025-08-19 00:18:16.328 [INFO][4400] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:16.414566 containerd[1513]: 2025-08-19 00:18:16.337 [INFO][4400] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:16.414566 containerd[1513]: 2025-08-19 00:18:16.345 [INFO][4400] ipam/ipam.go 511: Trying affinity for 192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:16.414566 containerd[1513]: 2025-08-19 00:18:16.349 [INFO][4400] ipam/ipam.go 158: Attempting to load block cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:16.414566 containerd[1513]: 2025-08-19 00:18:16.355 [INFO][4400] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:16.415094 containerd[1513]: 2025-08-19 00:18:16.355 [INFO][4400] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.83.128/26 handle="k8s-pod-network.bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:16.415094 containerd[1513]: 2025-08-19 00:18:16.358 [INFO][4400] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f Aug 19 00:18:16.415094 containerd[1513]: 2025-08-19 00:18:16.365 [INFO][4400] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.83.128/26 handle="k8s-pod-network.bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:16.415094 containerd[1513]: 2025-08-19 00:18:16.374 [INFO][4400] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.83.132/26] block=192.168.83.128/26 handle="k8s-pod-network.bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:16.415094 containerd[1513]: 2025-08-19 00:18:16.374 [INFO][4400] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.83.132/26] handle="k8s-pod-network.bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:16.415094 containerd[1513]: 2025-08-19 00:18:16.374 [INFO][4400] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:18:16.415094 containerd[1513]: 2025-08-19 00:18:16.374 [INFO][4400] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.132/26] IPv6=[] ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" HandleID="k8s-pod-network.bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Workload="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" Aug 19 00:18:16.415638 containerd[1513]: 2025-08-19 00:18:16.378 [INFO][4388] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-57zdj" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0", GenerateName:"calico-apiserver-667fd66bbf-", Namespace:"calico-apiserver", SelfLink:"", UID:"00acaa65-0ca5-43db-a1b6-460c4583ad3c", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"667fd66bbf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"", Pod:"calico-apiserver-667fd66bbf-57zdj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.83.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0f3b38a6071", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:16.415717 containerd[1513]: 2025-08-19 00:18:16.378 [INFO][4388] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.83.132/32] ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-57zdj" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" Aug 19 00:18:16.415717 containerd[1513]: 2025-08-19 00:18:16.378 [INFO][4388] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0f3b38a6071 ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-57zdj" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" Aug 19 00:18:16.415717 containerd[1513]: 2025-08-19 00:18:16.387 [INFO][4388] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-57zdj" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" Aug 19 00:18:16.415793 containerd[1513]: 2025-08-19 
00:18:16.387 [INFO][4388] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-57zdj" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0", GenerateName:"calico-apiserver-667fd66bbf-", Namespace:"calico-apiserver", SelfLink:"", UID:"00acaa65-0ca5-43db-a1b6-460c4583ad3c", ResourceVersion:"821", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 46, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"667fd66bbf", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f", Pod:"calico-apiserver-667fd66bbf-57zdj", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.83.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0f3b38a6071", MAC:"e2:1a:67:56:d9:c5", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:16.415843 containerd[1513]: 2025-08-19 00:18:16.410 [INFO][4388] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" Namespace="calico-apiserver" Pod="calico-apiserver-667fd66bbf-57zdj" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--apiserver--667fd66bbf--57zdj-eth0" Aug 19 00:18:16.454887 containerd[1513]: time="2025-08-19T00:18:16.454325957Z" level=info msg="connecting to shim bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f" address="unix:///run/containerd/s/d08f4ca866412282ba9696fddb9ded9e3d68a7d11dbe05f54ad67288d99f026c" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:18:16.500489 systemd[1]: Started cri-containerd-bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f.scope - libcontainer container bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f. Aug 19 00:18:16.533223 systemd-networkd[1412]: cali8cb90618027: Gained IPv6LL Aug 19 00:18:16.573886 containerd[1513]: time="2025-08-19T00:18:16.573836293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-667fd66bbf-57zdj,Uid:00acaa65-0ca5-43db-a1b6-460c4583ad3c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f\"" Aug 19 00:18:16.854057 systemd-networkd[1412]: cali2059d0fc54c: Gained IPv6LL Aug 19 00:18:17.387258 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount550695036.mount: Deactivated successfully. 
Aug 19 00:18:17.424392 containerd[1513]: time="2025-08-19T00:18:17.424275585Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:17.429846 containerd[1513]: time="2025-08-19T00:18:17.429771716Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.2: active requests=0, bytes read=30814581" Aug 19 00:18:17.432587 containerd[1513]: time="2025-08-19T00:18:17.432512002Z" level=info msg="ImageCreate event name:\"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:17.438505 containerd[1513]: time="2025-08-19T00:18:17.438406282Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:17.440675 containerd[1513]: time="2025-08-19T00:18:17.440566863Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" with image id \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\", repo tag \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/whisker-backend@sha256:fbf7f21f5aba95930803ad7e7dea8b083220854eae72c2a7c51681c09c5614b5\", size \"30814411\" in 2.682656313s" Aug 19 00:18:17.440906 containerd[1513]: time="2025-08-19T00:18:17.440670621Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.2\" returns image reference \"sha256:8763d908c0cd23d0e87bc61ce1ba8371b86449688baf955e5eeff7f7d7e101c4\"" Aug 19 00:18:17.443524 containerd[1513]: time="2025-08-19T00:18:17.442556409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 00:18:17.451527 containerd[1513]: time="2025-08-19T00:18:17.451315852Z" level=info msg="CreateContainer within sandbox \"108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890\" for container &ContainerMetadata{Name:whisker-backend,Attempt:0,}" Aug 19 00:18:17.462803 containerd[1513]: time="2025-08-19T00:18:17.462741862Z" level=info msg="Container 3f54a367efbb2194f5ae2d8d3018a6ba8798dc4ce0111e093c7d73e2d93dc89d: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:17.482999 containerd[1513]: time="2025-08-19T00:18:17.482861757Z" level=info msg="CreateContainer within sandbox \"108fd86cade5d4aedabc54eba394985e75da855610cebb07dce569210ca7e890\" for &ContainerMetadata{Name:whisker-backend,Attempt:0,} returns container id \"3f54a367efbb2194f5ae2d8d3018a6ba8798dc4ce0111e093c7d73e2d93dc89d\"" Aug 19 00:18:17.484614 containerd[1513]: time="2025-08-19T00:18:17.484253799Z" level=info msg="StartContainer for \"3f54a367efbb2194f5ae2d8d3018a6ba8798dc4ce0111e093c7d73e2d93dc89d\"" Aug 19 00:18:17.486379 containerd[1513]: time="2025-08-19T00:18:17.486343463Z" level=info msg="connecting to shim 3f54a367efbb2194f5ae2d8d3018a6ba8798dc4ce0111e093c7d73e2d93dc89d" address="unix:///run/containerd/s/c30581a2fa832ddb2b6116d75f2ec646a5e564df6c092d3b80369f833b8df33a" protocol=ttrpc version=3 Aug 19 00:18:17.529485 systemd[1]: Started cri-containerd-3f54a367efbb2194f5ae2d8d3018a6ba8798dc4ce0111e093c7d73e2d93dc89d.scope - libcontainer container 3f54a367efbb2194f5ae2d8d3018a6ba8798dc4ce0111e093c7d73e2d93dc89d. 
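
A rough figure from the values logged above, for orientation only: the whisker-backend pull read 30,814,581 bytes in 2.682656313 s, i.e. about 30814581 / 2.682656313 ≈ 11.5 MB/s (≈ 11 MiB/s), after which the image was immediately used to create and start container 3f54a367efbb inside the existing sandbox 108fd86cade5.
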
Aug 19 00:18:17.585985 containerd[1513]: time="2025-08-19T00:18:17.585936363Z" level=info msg="StartContainer for \"3f54a367efbb2194f5ae2d8d3018a6ba8798dc4ce0111e093c7d73e2d93dc89d\" returns successfully" Aug 19 00:18:17.812626 systemd-networkd[1412]: cali0f3b38a6071: Gained IPv6LL Aug 19 00:18:18.218411 containerd[1513]: time="2025-08-19T00:18:18.218250129Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gzg9f,Uid:1e621243-8897-4395-9a28-32253cceeffb,Namespace:kube-system,Attempt:0,}" Aug 19 00:18:18.218901 containerd[1513]: time="2025-08-19T00:18:18.218810154Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75649cc89b-2q5fl,Uid:27ede6b7-98b7-419b-84aa-572c3e073e65,Namespace:calico-system,Attempt:0,}" Aug 19 00:18:18.449603 systemd-networkd[1412]: cali8f8fc99a968: Link UP Aug 19 00:18:18.451823 systemd-networkd[1412]: cali8f8fc99a968: Gained carrier Aug 19 00:18:18.496397 containerd[1513]: 2025-08-19 00:18:18.293 [INFO][4504] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0 coredns-674b8bbfcf- kube-system 1e621243-8897-4395-9a28-32253cceeffb 816 0 2025-08-19 00:17:34 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426-0-0-8-661ee896d9 coredns-674b8bbfcf-gzg9f eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali8f8fc99a968 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Namespace="kube-system" Pod="coredns-674b8bbfcf-gzg9f" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-" Aug 19 00:18:18.496397 containerd[1513]: 2025-08-19 00:18:18.294 [INFO][4504] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Namespace="kube-system" Pod="coredns-674b8bbfcf-gzg9f" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" Aug 19 00:18:18.496397 containerd[1513]: 2025-08-19 00:18:18.351 [INFO][4530] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" HandleID="k8s-pod-network.e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Workload="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" Aug 19 00:18:18.496882 containerd[1513]: 2025-08-19 00:18:18.351 [INFO][4530] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" HandleID="k8s-pod-network.e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Workload="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3710), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426-0-0-8-661ee896d9", "pod":"coredns-674b8bbfcf-gzg9f", "timestamp":"2025-08-19 00:18:18.351486991 +0000 UTC"}, Hostname:"ci-4426-0-0-8-661ee896d9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:18:18.496882 containerd[1513]: 2025-08-19 
00:18:18.352 [INFO][4530] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:18:18.496882 containerd[1513]: 2025-08-19 00:18:18.352 [INFO][4530] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:18:18.496882 containerd[1513]: 2025-08-19 00:18:18.352 [INFO][4530] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-8-661ee896d9' Aug 19 00:18:18.496882 containerd[1513]: 2025-08-19 00:18:18.367 [INFO][4530] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.496882 containerd[1513]: 2025-08-19 00:18:18.381 [INFO][4530] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.496882 containerd[1513]: 2025-08-19 00:18:18.391 [INFO][4530] ipam/ipam.go 511: Trying affinity for 192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.496882 containerd[1513]: 2025-08-19 00:18:18.395 [INFO][4530] ipam/ipam.go 158: Attempting to load block cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.496882 containerd[1513]: 2025-08-19 00:18:18.399 [INFO][4530] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.497976 containerd[1513]: 2025-08-19 00:18:18.399 [INFO][4530] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.83.128/26 handle="k8s-pod-network.e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.497976 containerd[1513]: 2025-08-19 00:18:18.402 [INFO][4530] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759 Aug 19 00:18:18.497976 containerd[1513]: 2025-08-19 00:18:18.407 [INFO][4530] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.83.128/26 handle="k8s-pod-network.e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.497976 containerd[1513]: 2025-08-19 00:18:18.420 [INFO][4530] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.83.133/26] block=192.168.83.128/26 handle="k8s-pod-network.e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.497976 containerd[1513]: 2025-08-19 00:18:18.420 [INFO][4530] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.83.133/26] handle="k8s-pod-network.e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.497976 containerd[1513]: 2025-08-19 00:18:18.420 [INFO][4530] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:18:18.497976 containerd[1513]: 2025-08-19 00:18:18.420 [INFO][4530] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.133/26] IPv6=[] ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" HandleID="k8s-pod-network.e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Workload="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" Aug 19 00:18:18.498122 containerd[1513]: 2025-08-19 00:18:18.429 [INFO][4504] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Namespace="kube-system" Pod="coredns-674b8bbfcf-gzg9f" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"1e621243-8897-4395-9a28-32253cceeffb", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"", Pod:"coredns-674b8bbfcf-gzg9f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.83.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8f8fc99a968", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:18.498122 containerd[1513]: 2025-08-19 00:18:18.431 [INFO][4504] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.83.133/32] ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Namespace="kube-system" Pod="coredns-674b8bbfcf-gzg9f" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" Aug 19 00:18:18.498122 containerd[1513]: 2025-08-19 00:18:18.431 [INFO][4504] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8f8fc99a968 ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Namespace="kube-system" Pod="coredns-674b8bbfcf-gzg9f" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" Aug 19 00:18:18.498122 containerd[1513]: 2025-08-19 00:18:18.451 [INFO][4504] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-gzg9f" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" Aug 19 00:18:18.498122 containerd[1513]: 2025-08-19 00:18:18.453 [INFO][4504] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Namespace="kube-system" Pod="coredns-674b8bbfcf-gzg9f" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"1e621243-8897-4395-9a28-32253cceeffb", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759", Pod:"coredns-674b8bbfcf-gzg9f", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.83.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali8f8fc99a968", MAC:"f2:83:dd:88:04:78", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:18.498122 containerd[1513]: 2025-08-19 00:18:18.478 [INFO][4504] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" Namespace="kube-system" Pod="coredns-674b8bbfcf-gzg9f" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--gzg9f-eth0" Aug 19 00:18:18.540104 kubelet[2769]: I0819 00:18:18.539640 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/whisker-886d9d6c4-bvq6m" podStartSLOduration=2.420272969 podStartE2EDuration="7.539616492s" podCreationTimestamp="2025-08-19 00:18:11 +0000 UTC" firstStartedPulling="2025-08-19 00:18:12.322957613 +0000 UTC m=+44.279339717" lastFinishedPulling="2025-08-19 00:18:17.442301096 +0000 UTC m=+49.398683240" observedRunningTime="2025-08-19 00:18:18.53587827 +0000 UTC m=+50.492260374" watchObservedRunningTime="2025-08-19 00:18:18.539616492 +0000 UTC m=+50.495998556" Aug 19 00:18:18.578417 containerd[1513]: time="2025-08-19T00:18:18.578342395Z" level=info msg="connecting to shim e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759" 
address="unix:///run/containerd/s/2067f4fd4c2490bdac793327c98e2bfd96d4d6c0a66d6fe41ae9290161fd4c79" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:18:18.631692 systemd-networkd[1412]: cali112ea6df917: Link UP Aug 19 00:18:18.631911 systemd-networkd[1412]: cali112ea6df917: Gained carrier Aug 19 00:18:18.665369 systemd[1]: Started cri-containerd-e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759.scope - libcontainer container e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759. Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.331 [INFO][4515] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0 calico-kube-controllers-75649cc89b- calico-system 27ede6b7-98b7-419b-84aa-572c3e073e65 819 0 2025-08-19 00:17:52 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:75649cc89b projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4426-0-0-8-661ee896d9 calico-kube-controllers-75649cc89b-2q5fl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali112ea6df917 [] [] }} ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Namespace="calico-system" Pod="calico-kube-controllers-75649cc89b-2q5fl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.331 [INFO][4515] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Namespace="calico-system" Pod="calico-kube-controllers-75649cc89b-2q5fl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.378 [INFO][4535] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" HandleID="k8s-pod-network.d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Workload="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.379 [INFO][4535] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" HandleID="k8s-pod-network.d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Workload="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b100), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-0-0-8-661ee896d9", "pod":"calico-kube-controllers-75649cc89b-2q5fl", "timestamp":"2025-08-19 00:18:18.378654438 +0000 UTC"}, Hostname:"ci-4426-0-0-8-661ee896d9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.379 [INFO][4535] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.420 [INFO][4535] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.421 [INFO][4535] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-8-661ee896d9' Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.472 [INFO][4535] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.502 [INFO][4535] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.517 [INFO][4535] ipam/ipam.go 511: Trying affinity for 192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.525 [INFO][4535] ipam/ipam.go 158: Attempting to load block cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.536 [INFO][4535] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.536 [INFO][4535] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.83.128/26 handle="k8s-pod-network.d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.557 [INFO][4535] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798 Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.583 [INFO][4535] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.83.128/26 handle="k8s-pod-network.d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.623 [INFO][4535] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.83.134/26] block=192.168.83.128/26 handle="k8s-pod-network.d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.623 [INFO][4535] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.83.134/26] handle="k8s-pod-network.d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.623 [INFO][4535] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:18:18.675256 containerd[1513]: 2025-08-19 00:18:18.623 [INFO][4535] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.134/26] IPv6=[] ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" HandleID="k8s-pod-network.d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Workload="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" Aug 19 00:18:18.676024 containerd[1513]: 2025-08-19 00:18:18.627 [INFO][4515] cni-plugin/k8s.go 418: Populated endpoint ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Namespace="calico-system" Pod="calico-kube-controllers-75649cc89b-2q5fl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0", GenerateName:"calico-kube-controllers-75649cc89b-", Namespace:"calico-system", SelfLink:"", UID:"27ede6b7-98b7-419b-84aa-572c3e073e65", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75649cc89b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"", Pod:"calico-kube-controllers-75649cc89b-2q5fl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.83.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali112ea6df917", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:18.676024 containerd[1513]: 2025-08-19 00:18:18.627 [INFO][4515] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.83.134/32] ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Namespace="calico-system" Pod="calico-kube-controllers-75649cc89b-2q5fl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" Aug 19 00:18:18.676024 containerd[1513]: 2025-08-19 00:18:18.627 [INFO][4515] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali112ea6df917 ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Namespace="calico-system" Pod="calico-kube-controllers-75649cc89b-2q5fl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" Aug 19 00:18:18.676024 containerd[1513]: 2025-08-19 00:18:18.634 [INFO][4515] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Namespace="calico-system" Pod="calico-kube-controllers-75649cc89b-2q5fl" 
WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" Aug 19 00:18:18.676024 containerd[1513]: 2025-08-19 00:18:18.634 [INFO][4515] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Namespace="calico-system" Pod="calico-kube-controllers-75649cc89b-2q5fl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0", GenerateName:"calico-kube-controllers-75649cc89b-", Namespace:"calico-system", SelfLink:"", UID:"27ede6b7-98b7-419b-84aa-572c3e073e65", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"75649cc89b", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798", Pod:"calico-kube-controllers-75649cc89b-2q5fl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.83.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali112ea6df917", MAC:"22:89:8b:6c:c1:e7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:18.676024 containerd[1513]: 2025-08-19 00:18:18.671 [INFO][4515] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" Namespace="calico-system" Pod="calico-kube-controllers-75649cc89b-2q5fl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-calico--kube--controllers--75649cc89b--2q5fl-eth0" Aug 19 00:18:18.744900 containerd[1513]: time="2025-08-19T00:18:18.744843824Z" level=info msg="connecting to shim d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798" address="unix:///run/containerd/s/80ec4c75e10158cd990b394fc3fe1bdefe1348ca1d0bcbd15de7f909bb845bbc" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:18:18.793909 containerd[1513]: time="2025-08-19T00:18:18.793857537Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-gzg9f,Uid:1e621243-8897-4395-9a28-32253cceeffb,Namespace:kube-system,Attempt:0,} returns sandbox id \"e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759\"" Aug 19 00:18:18.805663 containerd[1513]: time="2025-08-19T00:18:18.805284557Z" level=info msg="CreateContainer within sandbox \"e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 00:18:18.807413 systemd[1]: Started 
cri-containerd-d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798.scope - libcontainer container d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798. Aug 19 00:18:18.826723 containerd[1513]: time="2025-08-19T00:18:18.826206248Z" level=info msg="Container db423ecee241a89a6c032c20c6dc23d3147062763c13dd78e254e9c6b8e71c3e: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:18.840078 containerd[1513]: time="2025-08-19T00:18:18.840011805Z" level=info msg="CreateContainer within sandbox \"e10e6d670ff916df32fae1d8ce606dce8da9dfdaa553c3ee273b14ca8109c759\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"db423ecee241a89a6c032c20c6dc23d3147062763c13dd78e254e9c6b8e71c3e\"" Aug 19 00:18:18.842429 containerd[1513]: time="2025-08-19T00:18:18.842384983Z" level=info msg="StartContainer for \"db423ecee241a89a6c032c20c6dc23d3147062763c13dd78e254e9c6b8e71c3e\"" Aug 19 00:18:18.845856 containerd[1513]: time="2025-08-19T00:18:18.845801973Z" level=info msg="connecting to shim db423ecee241a89a6c032c20c6dc23d3147062763c13dd78e254e9c6b8e71c3e" address="unix:///run/containerd/s/2067f4fd4c2490bdac793327c98e2bfd96d4d6c0a66d6fe41ae9290161fd4c79" protocol=ttrpc version=3 Aug 19 00:18:18.880628 systemd[1]: Started cri-containerd-db423ecee241a89a6c032c20c6dc23d3147062763c13dd78e254e9c6b8e71c3e.scope - libcontainer container db423ecee241a89a6c032c20c6dc23d3147062763c13dd78e254e9c6b8e71c3e. Aug 19 00:18:18.936368 containerd[1513]: time="2025-08-19T00:18:18.936315477Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-75649cc89b-2q5fl,Uid:27ede6b7-98b7-419b-84aa-572c3e073e65,Namespace:calico-system,Attempt:0,} returns sandbox id \"d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798\"" Aug 19 00:18:18.954396 containerd[1513]: time="2025-08-19T00:18:18.954331844Z" level=info msg="StartContainer for \"db423ecee241a89a6c032c20c6dc23d3147062763c13dd78e254e9c6b8e71c3e\" returns successfully" Aug 19 00:18:19.214029 containerd[1513]: time="2025-08-19T00:18:19.213606653Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tc6qm,Uid:f3224d19-d7a0-4294-80f2-5149b99ec962,Namespace:calico-system,Attempt:0,}" Aug 19 00:18:19.425338 systemd-networkd[1412]: cali720af83c76b: Link UP Aug 19 00:18:19.427457 systemd-networkd[1412]: cali720af83c76b: Gained carrier Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.300 [INFO][4697] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0 csi-node-driver- calico-system f3224d19-d7a0-4294-80f2-5149b99ec962 718 0 2025-08-19 00:17:52 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:8967bcb6f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4426-0-0-8-661ee896d9 csi-node-driver-tc6qm eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali720af83c76b [] [] }} ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Namespace="calico-system" Pod="csi-node-driver-tc6qm" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.300 [INFO][4697] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s 
ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Namespace="calico-system" Pod="csi-node-driver-tc6qm" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.335 [INFO][4708] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" HandleID="k8s-pod-network.592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Workload="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.335 [INFO][4708] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" HandleID="k8s-pod-network.592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Workload="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2ff0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4426-0-0-8-661ee896d9", "pod":"csi-node-driver-tc6qm", "timestamp":"2025-08-19 00:18:19.33562707 +0000 UTC"}, Hostname:"ci-4426-0-0-8-661ee896d9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.335 [INFO][4708] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.335 [INFO][4708] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.335 [INFO][4708] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-8-661ee896d9' Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.349 [INFO][4708] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.362 [INFO][4708] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.379 [INFO][4708] ipam/ipam.go 511: Trying affinity for 192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.384 [INFO][4708] ipam/ipam.go 158: Attempting to load block cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.388 [INFO][4708] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.389 [INFO][4708] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.83.128/26 handle="k8s-pod-network.592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.391 [INFO][4708] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.400 [INFO][4708] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.83.128/26 
handle="k8s-pod-network.592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.412 [INFO][4708] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.83.135/26] block=192.168.83.128/26 handle="k8s-pod-network.592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.413 [INFO][4708] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.83.135/26] handle="k8s-pod-network.592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.413 [INFO][4708] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. Aug 19 00:18:19.448318 containerd[1513]: 2025-08-19 00:18:19.413 [INFO][4708] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.135/26] IPv6=[] ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" HandleID="k8s-pod-network.592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Workload="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" Aug 19 00:18:19.449590 containerd[1513]: 2025-08-19 00:18:19.417 [INFO][4697] cni-plugin/k8s.go 418: Populated endpoint ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Namespace="calico-system" Pod="csi-node-driver-tc6qm" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f3224d19-d7a0-4294-80f2-5149b99ec962", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"", Pod:"csi-node-driver-tc6qm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.83.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali720af83c76b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:19.449590 containerd[1513]: 2025-08-19 00:18:19.417 [INFO][4697] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.83.135/32] ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Namespace="calico-system" Pod="csi-node-driver-tc6qm" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" Aug 19 00:18:19.449590 containerd[1513]: 2025-08-19 00:18:19.417 [INFO][4697] cni-plugin/dataplane_linux.go 69: 
Setting the host side veth name to cali720af83c76b ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Namespace="calico-system" Pod="csi-node-driver-tc6qm" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" Aug 19 00:18:19.449590 containerd[1513]: 2025-08-19 00:18:19.428 [INFO][4697] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Namespace="calico-system" Pod="csi-node-driver-tc6qm" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" Aug 19 00:18:19.449590 containerd[1513]: 2025-08-19 00:18:19.429 [INFO][4697] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Namespace="calico-system" Pod="csi-node-driver-tc6qm" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"f3224d19-d7a0-4294-80f2-5149b99ec962", ResourceVersion:"718", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"8967bcb6f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a", Pod:"csi-node-driver-tc6qm", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.83.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali720af83c76b", MAC:"be:3e:c1:89:3e:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:19.449590 containerd[1513]: 2025-08-19 00:18:19.443 [INFO][4697] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" Namespace="calico-system" Pod="csi-node-driver-tc6qm" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-csi--node--driver--tc6qm-eth0" Aug 19 00:18:19.482151 containerd[1513]: time="2025-08-19T00:18:19.481025253Z" level=info msg="connecting to shim 592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a" address="unix:///run/containerd/s/a7f86613149ccdc45acb58de0fa17f4fa0cab42ba0281693205a2cdb23b0f0d1" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:18:19.519868 systemd[1]: Started cri-containerd-592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a.scope - libcontainer container 592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a. 
Aug 19 00:18:19.566790 kubelet[2769]: I0819 00:18:19.566693 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-gzg9f" podStartSLOduration=45.566666515 podStartE2EDuration="45.566666515s" podCreationTimestamp="2025-08-19 00:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:18:19.54604376 +0000 UTC m=+51.502425864" watchObservedRunningTime="2025-08-19 00:18:19.566666515 +0000 UTC m=+51.523048659" Aug 19 00:18:19.615471 containerd[1513]: time="2025-08-19T00:18:19.615425996Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tc6qm,Uid:f3224d19-d7a0-4294-80f2-5149b99ec962,Namespace:calico-system,Attempt:0,} returns sandbox id \"592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a\"" Aug 19 00:18:20.217406 containerd[1513]: time="2025-08-19T00:18:20.217056031Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zqxrl,Uid:e16b483e-f67c-4447-aac6-6ca42f039a20,Namespace:kube-system,Attempt:0,}" Aug 19 00:18:20.309897 systemd-networkd[1412]: cali8f8fc99a968: Gained IPv6LL Aug 19 00:18:20.417055 systemd-networkd[1412]: cali11748bd4a02: Link UP Aug 19 00:18:20.417280 systemd-networkd[1412]: cali11748bd4a02: Gained carrier Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.271 [INFO][4775] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0 coredns-674b8bbfcf- kube-system e16b483e-f67c-4447-aac6-6ca42f039a20 820 0 2025-08-19 00:17:34 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4426-0-0-8-661ee896d9 coredns-674b8bbfcf-zqxrl eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali11748bd4a02 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqxrl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.271 [INFO][4775] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqxrl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.312 [INFO][4787] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" HandleID="k8s-pod-network.c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Workload="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.312 [INFO][4787] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" HandleID="k8s-pod-network.c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Workload="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cb640), 
Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4426-0-0-8-661ee896d9", "pod":"coredns-674b8bbfcf-zqxrl", "timestamp":"2025-08-19 00:18:20.312010572 +0000 UTC"}, Hostname:"ci-4426-0-0-8-661ee896d9", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.312 [INFO][4787] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.312 [INFO][4787] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.312 [INFO][4787] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4426-0-0-8-661ee896d9' Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.360 [INFO][4787] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.367 [INFO][4787] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.375 [INFO][4787] ipam/ipam.go 511: Trying affinity for 192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.378 [INFO][4787] ipam/ipam.go 158: Attempting to load block cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.383 [INFO][4787] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.83.128/26 host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.383 [INFO][4787] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.83.128/26 handle="k8s-pod-network.c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.387 [INFO][4787] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56 Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.394 [INFO][4787] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.83.128/26 handle="k8s-pod-network.c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.407 [INFO][4787] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.83.136/26] block=192.168.83.128/26 handle="k8s-pod-network.c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.407 [INFO][4787] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.83.136/26] handle="k8s-pod-network.c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" host="ci-4426-0-0-8-661ee896d9" Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.407 [INFO][4787] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
Aug 19 00:18:20.437345 containerd[1513]: 2025-08-19 00:18:20.408 [INFO][4787] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.83.136/26] IPv6=[] ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" HandleID="k8s-pod-network.c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Workload="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" Aug 19 00:18:20.438317 containerd[1513]: 2025-08-19 00:18:20.412 [INFO][4775] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqxrl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e16b483e-f67c-4447-aac6-6ca42f039a20", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"", Pod:"coredns-674b8bbfcf-zqxrl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.83.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali11748bd4a02", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:20.438317 containerd[1513]: 2025-08-19 00:18:20.412 [INFO][4775] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.83.136/32] ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqxrl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" Aug 19 00:18:20.438317 containerd[1513]: 2025-08-19 00:18:20.412 [INFO][4775] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali11748bd4a02 ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqxrl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" Aug 19 00:18:20.438317 containerd[1513]: 2025-08-19 00:18:20.415 [INFO][4775] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-zqxrl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" Aug 19 00:18:20.438317 containerd[1513]: 2025-08-19 00:18:20.415 [INFO][4775] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqxrl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"e16b483e-f67c-4447-aac6-6ca42f039a20", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.August, 19, 0, 17, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4426-0-0-8-661ee896d9", ContainerID:"c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56", Pod:"coredns-674b8bbfcf-zqxrl", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.83.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali11748bd4a02", MAC:"06:42:41:6d:ae:28", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Aug 19 00:18:20.438317 containerd[1513]: 2025-08-19 00:18:20.432 [INFO][4775] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" Namespace="kube-system" Pod="coredns-674b8bbfcf-zqxrl" WorkloadEndpoint="ci--4426--0--0--8--661ee896d9-k8s-coredns--674b8bbfcf--zqxrl-eth0" Aug 19 00:18:20.472773 containerd[1513]: time="2025-08-19T00:18:20.472517299Z" level=info msg="connecting to shim c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56" address="unix:///run/containerd/s/3f9d13dd8d649fb4ba65cdd289d3e144106ba70fecb888e02f2d09a547443a5c" namespace=k8s.io protocol=ttrpc version=3 Aug 19 00:18:20.512572 systemd[1]: Started cri-containerd-c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56.scope - libcontainer container c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56. 
Aug 19 00:18:20.564852 systemd-networkd[1412]: cali112ea6df917: Gained IPv6LL Aug 19 00:18:20.572484 containerd[1513]: time="2025-08-19T00:18:20.572406839Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-zqxrl,Uid:e16b483e-f67c-4447-aac6-6ca42f039a20,Namespace:kube-system,Attempt:0,} returns sandbox id \"c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56\"" Aug 19 00:18:20.581727 containerd[1513]: time="2025-08-19T00:18:20.581531415Z" level=info msg="CreateContainer within sandbox \"c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Aug 19 00:18:20.604250 containerd[1513]: time="2025-08-19T00:18:20.602070309Z" level=info msg="Container 9eaffff87e77da9a90d4a6336b69f4512be82f8223d0eaf0611873d08f65fa5e: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:20.609860 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount721787115.mount: Deactivated successfully. Aug 19 00:18:20.617731 containerd[1513]: time="2025-08-19T00:18:20.617683084Z" level=info msg="CreateContainer within sandbox \"c3198c433cf06630236bcbb839fbf53c69c2ca4cb6b98ccd39dc5f71f9111c56\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9eaffff87e77da9a90d4a6336b69f4512be82f8223d0eaf0611873d08f65fa5e\"" Aug 19 00:18:20.619836 containerd[1513]: time="2025-08-19T00:18:20.619586957Z" level=info msg="StartContainer for \"9eaffff87e77da9a90d4a6336b69f4512be82f8223d0eaf0611873d08f65fa5e\"" Aug 19 00:18:20.621583 containerd[1513]: time="2025-08-19T00:18:20.621535389Z" level=info msg="connecting to shim 9eaffff87e77da9a90d4a6336b69f4512be82f8223d0eaf0611873d08f65fa5e" address="unix:///run/containerd/s/3f9d13dd8d649fb4ba65cdd289d3e144106ba70fecb888e02f2d09a547443a5c" protocol=ttrpc version=3 Aug 19 00:18:20.652114 systemd[1]: Started cri-containerd-9eaffff87e77da9a90d4a6336b69f4512be82f8223d0eaf0611873d08f65fa5e.scope - libcontainer container 9eaffff87e77da9a90d4a6336b69f4512be82f8223d0eaf0611873d08f65fa5e. 
Aug 19 00:18:20.702746 containerd[1513]: time="2025-08-19T00:18:20.702632392Z" level=info msg="StartContainer for \"9eaffff87e77da9a90d4a6336b69f4512be82f8223d0eaf0611873d08f65fa5e\" returns successfully" Aug 19 00:18:20.820629 systemd-networkd[1412]: cali720af83c76b: Gained IPv6LL Aug 19 00:18:21.595624 kubelet[2769]: I0819 00:18:21.595546 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-zqxrl" podStartSLOduration=47.595524464 podStartE2EDuration="47.595524464s" podCreationTimestamp="2025-08-19 00:17:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-08-19 00:18:21.565277305 +0000 UTC m=+53.521659449" watchObservedRunningTime="2025-08-19 00:18:21.595524464 +0000 UTC m=+53.551906568" Aug 19 00:18:22.100350 systemd-networkd[1412]: cali11748bd4a02: Gained IPv6LL Aug 19 00:18:22.424219 containerd[1513]: time="2025-08-19T00:18:22.424031740Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:22.426191 containerd[1513]: time="2025-08-19T00:18:22.425964815Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=44517149" Aug 19 00:18:22.427694 containerd[1513]: time="2025-08-19T00:18:22.427625017Z" level=info msg="ImageCreate event name:\"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:22.431724 containerd[1513]: time="2025-08-19T00:18:22.431510687Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:22.439163 containerd[1513]: time="2025-08-19T00:18:22.438630802Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 4.996001315s" Aug 19 00:18:22.439163 containerd[1513]: time="2025-08-19T00:18:22.438685281Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 19 00:18:22.446300 containerd[1513]: time="2025-08-19T00:18:22.446075070Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\"" Aug 19 00:18:22.451853 containerd[1513]: time="2025-08-19T00:18:22.451811858Z" level=info msg="CreateContainer within sandbox \"c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 00:18:22.491432 containerd[1513]: time="2025-08-19T00:18:22.491374104Z" level=info msg="Container 80a57294a649a80ce842abede4e6746326bbc2b871f5069907bba4a5e67e0af7: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:22.507789 containerd[1513]: time="2025-08-19T00:18:22.507701127Z" level=info msg="CreateContainer within sandbox \"c1a60290f414f60de9f790066256412e091173939cbc7a1bc4e6777220788e84\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id 
\"80a57294a649a80ce842abede4e6746326bbc2b871f5069907bba4a5e67e0af7\"" Aug 19 00:18:22.509930 containerd[1513]: time="2025-08-19T00:18:22.508571907Z" level=info msg="StartContainer for \"80a57294a649a80ce842abede4e6746326bbc2b871f5069907bba4a5e67e0af7\"" Aug 19 00:18:22.512168 containerd[1513]: time="2025-08-19T00:18:22.511505199Z" level=info msg="connecting to shim 80a57294a649a80ce842abede4e6746326bbc2b871f5069907bba4a5e67e0af7" address="unix:///run/containerd/s/26bd66b3763270a9a7542a175c3d1e0c928185cbbe9076de008f71446c78e204" protocol=ttrpc version=3 Aug 19 00:18:22.540433 systemd[1]: Started cri-containerd-80a57294a649a80ce842abede4e6746326bbc2b871f5069907bba4a5e67e0af7.scope - libcontainer container 80a57294a649a80ce842abede4e6746326bbc2b871f5069907bba4a5e67e0af7. Aug 19 00:18:22.599882 containerd[1513]: time="2025-08-19T00:18:22.599800919Z" level=info msg="StartContainer for \"80a57294a649a80ce842abede4e6746326bbc2b871f5069907bba4a5e67e0af7\" returns successfully" Aug 19 00:18:23.585061 kubelet[2769]: I0819 00:18:23.583883 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-667fd66bbf-28jqt" podStartSLOduration=30.829959763 podStartE2EDuration="37.58385641s" podCreationTimestamp="2025-08-19 00:17:46 +0000 UTC" firstStartedPulling="2025-08-19 00:18:15.689464366 +0000 UTC m=+47.645846470" lastFinishedPulling="2025-08-19 00:18:22.443361013 +0000 UTC m=+54.399743117" observedRunningTime="2025-08-19 00:18:23.581455384 +0000 UTC m=+55.537837488" watchObservedRunningTime="2025-08-19 00:18:23.58385641 +0000 UTC m=+55.540238514" Aug 19 00:18:25.312671 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1584121389.mount: Deactivated successfully. Aug 19 00:18:25.576333 kubelet[2769]: I0819 00:18:25.574392 2769 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:18:26.108294 containerd[1513]: time="2025-08-19T00:18:26.108233803Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:26.110833 containerd[1513]: time="2025-08-19T00:18:26.110743232Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.2: active requests=0, bytes read=61838790" Aug 19 00:18:26.112012 containerd[1513]: time="2025-08-19T00:18:26.111969887Z" level=info msg="ImageCreate event name:\"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:26.116027 containerd[1513]: time="2025-08-19T00:18:26.115978805Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:26.117714 containerd[1513]: time="2025-08-19T00:18:26.117495614Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" with image id \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\", repo tag \"ghcr.io/flatcar/calico/goldmane:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/goldmane@sha256:a2b761fd93d824431ad93e59e8e670cdf00b478f4b532145297e1e67f2768305\", size \"61838636\" in 3.671366225s" Aug 19 00:18:26.117714 containerd[1513]: time="2025-08-19T00:18:26.117538853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.2\" returns image reference \"sha256:1389d38feb576cfff09a57a2c028a53e51a72c658f295166960f770eaf07985f\"" Aug 19 
00:18:26.118589 containerd[1513]: time="2025-08-19T00:18:26.118560033Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\"" Aug 19 00:18:26.123897 containerd[1513]: time="2025-08-19T00:18:26.123850445Z" level=info msg="CreateContainer within sandbox \"7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02\" for container &ContainerMetadata{Name:goldmane,Attempt:0,}" Aug 19 00:18:26.136162 containerd[1513]: time="2025-08-19T00:18:26.134836502Z" level=info msg="Container 4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:26.163659 containerd[1513]: time="2025-08-19T00:18:26.163601397Z" level=info msg="CreateContainer within sandbox \"7dd50b942b88ee1bb90bde42cf9e9ca1d767ec466092be8a09019a0c392eca02\" for &ContainerMetadata{Name:goldmane,Attempt:0,} returns container id \"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\"" Aug 19 00:18:26.167063 containerd[1513]: time="2025-08-19T00:18:26.167003848Z" level=info msg="StartContainer for \"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\"" Aug 19 00:18:26.170198 containerd[1513]: time="2025-08-19T00:18:26.170096505Z" level=info msg="connecting to shim 4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3" address="unix:///run/containerd/s/239b3d65b58d8aa91869fb3b73a9bfbd96cac8e07b23f41efaa643d2f9e077bc" protocol=ttrpc version=3 Aug 19 00:18:26.200441 systemd[1]: Started cri-containerd-4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3.scope - libcontainer container 4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3. Aug 19 00:18:26.303903 containerd[1513]: time="2025-08-19T00:18:26.303857387Z" level=info msg="StartContainer for \"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\" returns successfully" Aug 19 00:18:26.522896 containerd[1513]: time="2025-08-19T00:18:26.522672379Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:26.525686 containerd[1513]: time="2025-08-19T00:18:26.524430743Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.2: active requests=0, bytes read=77" Aug 19 00:18:26.530428 containerd[1513]: time="2025-08-19T00:18:26.530364343Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" with image id \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ec6b10660962e7caad70c47755049fad68f9fc2f7064e8bc7cb862583e02cc2b\", size \"45886406\" in 411.751911ms" Aug 19 00:18:26.530428 containerd[1513]: time="2025-08-19T00:18:26.530421982Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.2\" returns image reference \"sha256:3371ea1b18040228ef58c964e49b96f4291def748753dfbc0aef87a55f906b8f\"" Aug 19 00:18:26.534028 containerd[1513]: time="2025-08-19T00:18:26.532515739Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\"" Aug 19 00:18:26.538993 containerd[1513]: time="2025-08-19T00:18:26.538942728Z" level=info msg="CreateContainer within sandbox \"bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" Aug 19 00:18:26.554875 containerd[1513]: time="2025-08-19T00:18:26.554815126Z" level=info msg="Container 
a6692f6ad19c153172485885bb42b124d911017d87d95322bebfb684fe7bcd11: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:26.571930 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount673418150.mount: Deactivated successfully. Aug 19 00:18:26.587109 containerd[1513]: time="2025-08-19T00:18:26.586837795Z" level=info msg="CreateContainer within sandbox \"bda025a2ce0954b76db45aed3e320ee956d68e947c6d8132184d22e63fb87a6f\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"a6692f6ad19c153172485885bb42b124d911017d87d95322bebfb684fe7bcd11\"" Aug 19 00:18:26.592153 containerd[1513]: time="2025-08-19T00:18:26.590878033Z" level=info msg="StartContainer for \"a6692f6ad19c153172485885bb42b124d911017d87d95322bebfb684fe7bcd11\"" Aug 19 00:18:26.593910 containerd[1513]: time="2025-08-19T00:18:26.593848093Z" level=info msg="connecting to shim a6692f6ad19c153172485885bb42b124d911017d87d95322bebfb684fe7bcd11" address="unix:///run/containerd/s/d08f4ca866412282ba9696fddb9ded9e3d68a7d11dbe05f54ad67288d99f026c" protocol=ttrpc version=3 Aug 19 00:18:26.616091 kubelet[2769]: I0819 00:18:26.612952 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/goldmane-768f4c5c69-tp2sl" podStartSLOduration=25.227170948 podStartE2EDuration="35.612935745s" podCreationTimestamp="2025-08-19 00:17:51 +0000 UTC" firstStartedPulling="2025-08-19 00:18:15.732667118 +0000 UTC m=+47.689049222" lastFinishedPulling="2025-08-19 00:18:26.118431915 +0000 UTC m=+58.074814019" observedRunningTime="2025-08-19 00:18:26.612477594 +0000 UTC m=+58.568859778" watchObservedRunningTime="2025-08-19 00:18:26.612935745 +0000 UTC m=+58.569317849" Aug 19 00:18:26.641590 systemd[1]: Started cri-containerd-a6692f6ad19c153172485885bb42b124d911017d87d95322bebfb684fe7bcd11.scope - libcontainer container a6692f6ad19c153172485885bb42b124d911017d87d95322bebfb684fe7bcd11. 
Aug 19 00:18:26.780942 containerd[1513]: time="2025-08-19T00:18:26.780885411Z" level=info msg="StartContainer for \"a6692f6ad19c153172485885bb42b124d911017d87d95322bebfb684fe7bcd11\" returns successfully" Aug 19 00:18:26.868245 containerd[1513]: time="2025-08-19T00:18:26.868191317Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\" id:\"fcebe3723fb1a3254e31c780f48c85a361c55aa5e8261751fd07ab3a256d0f46\" pid:5010 exit_status:1 exited_at:{seconds:1755562706 nanos:866523070}" Aug 19 00:18:27.751792 containerd[1513]: time="2025-08-19T00:18:27.751730120Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\" id:\"68458067eb49f1b443ea323f1842c0eb8109ac221ce22df33b221426cb34d322\" pid:5058 exit_status:1 exited_at:{seconds:1755562707 nanos:749601082}" Aug 19 00:18:27.866875 kubelet[2769]: I0819 00:18:27.866793 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-667fd66bbf-57zdj" podStartSLOduration=31.910779737 podStartE2EDuration="41.866775535s" podCreationTimestamp="2025-08-19 00:17:46 +0000 UTC" firstStartedPulling="2025-08-19 00:18:16.57609027 +0000 UTC m=+48.532472374" lastFinishedPulling="2025-08-19 00:18:26.532085988 +0000 UTC m=+58.488468172" observedRunningTime="2025-08-19 00:18:27.619237448 +0000 UTC m=+59.575619512" watchObservedRunningTime="2025-08-19 00:18:27.866775535 +0000 UTC m=+59.823157639" Aug 19 00:18:28.597038 kubelet[2769]: I0819 00:18:28.594589 2769 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:18:29.809448 containerd[1513]: time="2025-08-19T00:18:29.809391185Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:29.810565 containerd[1513]: time="2025-08-19T00:18:29.810480405Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.2: active requests=0, bytes read=48128336" Aug 19 00:18:29.812045 containerd[1513]: time="2025-08-19T00:18:29.811928018Z" level=info msg="ImageCreate event name:\"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:29.815013 containerd[1513]: time="2025-08-19T00:18:29.814936082Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:29.815821 containerd[1513]: time="2025-08-19T00:18:29.815616790Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" with image id \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:5d3ecdec3cbbe8f7009077102e35e8a2141161b59c548cf3f97829177677cbce\", size \"49497545\" in 3.283054092s" Aug 19 00:18:29.815821 containerd[1513]: time="2025-08-19T00:18:29.815656589Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.2\" returns image reference \"sha256:ba9e7793995ca67a9b78aa06adda4e89cbd435b1e88ab1032ca665140517fa7a\"" Aug 19 00:18:29.817219 containerd[1513]: time="2025-08-19T00:18:29.817182241Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\"" Aug 19 
00:18:29.844236 containerd[1513]: time="2025-08-19T00:18:29.843681072Z" level=info msg="CreateContainer within sandbox \"d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" Aug 19 00:18:29.864413 containerd[1513]: time="2025-08-19T00:18:29.864362370Z" level=info msg="Container ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:29.870794 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2378920348.mount: Deactivated successfully. Aug 19 00:18:29.875322 containerd[1513]: time="2025-08-19T00:18:29.875176010Z" level=info msg="CreateContainer within sandbox \"d087be815d2907dc97801720309f5c8cd2ec2f106a18d5f808195e6d59179798\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\"" Aug 19 00:18:29.876154 containerd[1513]: time="2025-08-19T00:18:29.875968436Z" level=info msg="StartContainer for \"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\"" Aug 19 00:18:29.879741 containerd[1513]: time="2025-08-19T00:18:29.879650808Z" level=info msg="connecting to shim ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6" address="unix:///run/containerd/s/80ec4c75e10158cd990b394fc3fe1bdefe1348ca1d0bcbd15de7f909bb845bbc" protocol=ttrpc version=3 Aug 19 00:18:29.908537 systemd[1]: Started cri-containerd-ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6.scope - libcontainer container ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6. Aug 19 00:18:29.968474 containerd[1513]: time="2025-08-19T00:18:29.968423729Z" level=info msg="StartContainer for \"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\" returns successfully" Aug 19 00:18:30.635442 kubelet[2769]: I0819 00:18:30.634051 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-75649cc89b-2q5fl" podStartSLOduration=27.757952081 podStartE2EDuration="38.634032169s" podCreationTimestamp="2025-08-19 00:17:52 +0000 UTC" firstStartedPulling="2025-08-19 00:18:18.940924756 +0000 UTC m=+50.897306860" lastFinishedPulling="2025-08-19 00:18:29.817004844 +0000 UTC m=+61.773386948" observedRunningTime="2025-08-19 00:18:30.631926166 +0000 UTC m=+62.588308270" watchObservedRunningTime="2025-08-19 00:18:30.634032169 +0000 UTC m=+62.590414273" Aug 19 00:18:30.663172 containerd[1513]: time="2025-08-19T00:18:30.663042450Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\" id:\"8cac2c86f7a140ce353a5506b45880da2f967a27843a363cf697eace44eb7d5b\" pid:5139 exited_at:{seconds:1755562710 nanos:658418853}" Aug 19 00:18:31.535622 containerd[1513]: time="2025-08-19T00:18:31.535329516Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:31.537908 containerd[1513]: time="2025-08-19T00:18:31.537831472Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.2: active requests=0, bytes read=8225702" Aug 19 00:18:31.540318 containerd[1513]: time="2025-08-19T00:18:31.540230711Z" level=info msg="ImageCreate event name:\"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:31.544371 containerd[1513]: 
time="2025-08-19T00:18:31.544326320Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:31.546048 containerd[1513]: time="2025-08-19T00:18:31.546004971Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.2\" with image id \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:e570128aa8067a2f06b96d3cc98afa2e0a4b9790b435ee36ca051c8e72aeb8d0\", size \"9594943\" in 1.728779531s" Aug 19 00:18:31.546219 containerd[1513]: time="2025-08-19T00:18:31.546197887Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.2\" returns image reference \"sha256:14ecfabbdbebd1f5a36708f8b11a95a43baddd6a935d7d78c89a9c333849fcd2\"" Aug 19 00:18:31.552850 containerd[1513]: time="2025-08-19T00:18:31.552784573Z" level=info msg="CreateContainer within sandbox \"592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" Aug 19 00:18:31.568167 containerd[1513]: time="2025-08-19T00:18:31.567432480Z" level=info msg="Container 0d2aedc9d31c2639e8aefd3591f77db922c0c898e473a347c12b9c25c310fd25: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:31.594517 containerd[1513]: time="2025-08-19T00:18:31.594433212Z" level=info msg="CreateContainer within sandbox \"592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"0d2aedc9d31c2639e8aefd3591f77db922c0c898e473a347c12b9c25c310fd25\"" Aug 19 00:18:31.596445 containerd[1513]: time="2025-08-19T00:18:31.596097623Z" level=info msg="StartContainer for \"0d2aedc9d31c2639e8aefd3591f77db922c0c898e473a347c12b9c25c310fd25\"" Aug 19 00:18:31.602976 containerd[1513]: time="2025-08-19T00:18:31.602872346Z" level=info msg="connecting to shim 0d2aedc9d31c2639e8aefd3591f77db922c0c898e473a347c12b9c25c310fd25" address="unix:///run/containerd/s/a7f86613149ccdc45acb58de0fa17f4fa0cab42ba0281693205a2cdb23b0f0d1" protocol=ttrpc version=3 Aug 19 00:18:31.640623 systemd[1]: Started cri-containerd-0d2aedc9d31c2639e8aefd3591f77db922c0c898e473a347c12b9c25c310fd25.scope - libcontainer container 0d2aedc9d31c2639e8aefd3591f77db922c0c898e473a347c12b9c25c310fd25. 
Aug 19 00:18:31.693830 containerd[1513]: time="2025-08-19T00:18:31.693657694Z" level=info msg="StartContainer for \"0d2aedc9d31c2639e8aefd3591f77db922c0c898e473a347c12b9c25c310fd25\" returns successfully" Aug 19 00:18:31.697500 containerd[1513]: time="2025-08-19T00:18:31.697429109Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\"" Aug 19 00:18:33.946618 containerd[1513]: time="2025-08-19T00:18:33.946552208Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:33.949636 containerd[1513]: time="2025-08-19T00:18:33.949573839Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2: active requests=0, bytes read=13754366" Aug 19 00:18:33.950595 containerd[1513]: time="2025-08-19T00:18:33.950543863Z" level=info msg="ImageCreate event name:\"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:33.956233 containerd[1513]: time="2025-08-19T00:18:33.956112133Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Aug 19 00:18:33.958581 containerd[1513]: time="2025-08-19T00:18:33.958528574Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" with image id \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:8fec2de12dfa51bae89d941938a07af2598eb8bfcab55d0dded1d9c193d7b99f\", size \"15123559\" in 2.261056826s" Aug 19 00:18:33.958581 containerd[1513]: time="2025-08-19T00:18:33.958577293Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.2\" returns image reference \"sha256:664ed31fb4687b0de23d6e6e116bc87b236790d7355871d3237c54452e02e27c\"" Aug 19 00:18:33.964485 containerd[1513]: time="2025-08-19T00:18:33.964395319Z" level=info msg="CreateContainer within sandbox \"592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" Aug 19 00:18:33.982280 containerd[1513]: time="2025-08-19T00:18:33.979714950Z" level=info msg="Container cec88afa7887329adc084242a1529eafeb6ec45d5080bdafbcdb8e99442e7ee3: CDI devices from CRI Config.CDIDevices: []" Aug 19 00:18:33.997307 containerd[1513]: time="2025-08-19T00:18:33.997259705Z" level=info msg="CreateContainer within sandbox \"592d784f9cebce3ae6702ee19c8ed59159112b28ca528da459715a7b3cf7059a\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"cec88afa7887329adc084242a1529eafeb6ec45d5080bdafbcdb8e99442e7ee3\"" Aug 19 00:18:33.998858 containerd[1513]: time="2025-08-19T00:18:33.998826519Z" level=info msg="StartContainer for \"cec88afa7887329adc084242a1529eafeb6ec45d5080bdafbcdb8e99442e7ee3\"" Aug 19 00:18:34.002701 containerd[1513]: time="2025-08-19T00:18:34.002665538Z" level=info msg="connecting to shim cec88afa7887329adc084242a1529eafeb6ec45d5080bdafbcdb8e99442e7ee3" address="unix:///run/containerd/s/a7f86613149ccdc45acb58de0fa17f4fa0cab42ba0281693205a2cdb23b0f0d1" protocol=ttrpc version=3 Aug 19 00:18:34.047539 systemd[1]: Started 
cri-containerd-cec88afa7887329adc084242a1529eafeb6ec45d5080bdafbcdb8e99442e7ee3.scope - libcontainer container cec88afa7887329adc084242a1529eafeb6ec45d5080bdafbcdb8e99442e7ee3. Aug 19 00:18:34.107557 containerd[1513]: time="2025-08-19T00:18:34.107506889Z" level=info msg="StartContainer for \"cec88afa7887329adc084242a1529eafeb6ec45d5080bdafbcdb8e99442e7ee3\" returns successfully" Aug 19 00:18:34.324827 kubelet[2769]: I0819 00:18:34.324675 2769 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 Aug 19 00:18:34.331766 kubelet[2769]: I0819 00:18:34.331533 2769 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock Aug 19 00:18:34.674719 kubelet[2769]: I0819 00:18:34.674480 2769 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-tc6qm" podStartSLOduration=28.332514714 podStartE2EDuration="42.674457773s" podCreationTimestamp="2025-08-19 00:17:52 +0000 UTC" firstStartedPulling="2025-08-19 00:18:19.617807375 +0000 UTC m=+51.574189479" lastFinishedPulling="2025-08-19 00:18:33.959750474 +0000 UTC m=+65.916132538" observedRunningTime="2025-08-19 00:18:34.672814399 +0000 UTC m=+66.629196503" watchObservedRunningTime="2025-08-19 00:18:34.674457773 +0000 UTC m=+66.630839917" Aug 19 00:18:35.278574 update_engine[1492]: I20250819 00:18:35.278486 1492 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Aug 19 00:18:35.278574 update_engine[1492]: I20250819 00:18:35.278549 1492 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Aug 19 00:18:35.279080 update_engine[1492]: I20250819 00:18:35.278820 1492 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Aug 19 00:18:35.279942 update_engine[1492]: I20250819 00:18:35.279796 1492 omaha_request_params.cc:62] Current group set to alpha Aug 19 00:18:35.281310 update_engine[1492]: I20250819 00:18:35.281054 1492 update_attempter.cc:499] Already updated boot flags. Skipping. Aug 19 00:18:35.281310 update_engine[1492]: I20250819 00:18:35.281090 1492 update_attempter.cc:643] Scheduling an action processor start. 
Aug 19 00:18:35.281310 update_engine[1492]: I20250819 00:18:35.281111 1492 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Aug 19 00:18:35.287171 update_engine[1492]: I20250819 00:18:35.286007 1492 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Aug 19 00:18:35.287171 update_engine[1492]: I20250819 00:18:35.286239 1492 omaha_request_action.cc:271] Posting an Omaha request to disabled Aug 19 00:18:35.287171 update_engine[1492]: I20250819 00:18:35.286262 1492 omaha_request_action.cc:272] Request: Aug 19 00:18:35.287171 update_engine[1492]: Aug 19 00:18:35.287171 update_engine[1492]: Aug 19 00:18:35.287171 update_engine[1492]: Aug 19 00:18:35.287171 update_engine[1492]: Aug 19 00:18:35.287171 update_engine[1492]: Aug 19 00:18:35.287171 update_engine[1492]: Aug 19 00:18:35.287171 update_engine[1492]: Aug 19 00:18:35.287171 update_engine[1492]: Aug 19 00:18:35.287171 update_engine[1492]: I20250819 00:18:35.286274 1492 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 19 00:18:35.295790 update_engine[1492]: I20250819 00:18:35.292709 1492 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 19 00:18:35.295790 update_engine[1492]: I20250819 00:18:35.293079 1492 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Aug 19 00:18:35.295790 update_engine[1492]: E20250819 00:18:35.293588 1492 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 19 00:18:35.295790 update_engine[1492]: I20250819 00:18:35.293654 1492 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Aug 19 00:18:35.296236 locksmithd[1528]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Aug 19 00:18:40.118178 kubelet[2769]: I0819 00:18:40.115492 2769 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Aug 19 00:18:43.558087 containerd[1513]: time="2025-08-19T00:18:43.558017556Z" level=info msg="TaskExit event in podsandbox handler container_id:\"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\" id:\"3b04551b290704d4944dd6af1a48ec001986eac8e51197d37d07262e32139e4c\" pid:5244 exited_at:{seconds:1755562723 nanos:557628761}" Aug 19 00:18:45.285852 update_engine[1492]: I20250819 00:18:45.285735 1492 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 19 00:18:45.286329 update_engine[1492]: I20250819 00:18:45.286186 1492 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 19 00:18:45.287035 update_engine[1492]: I20250819 00:18:45.286695 1492 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Aug 19 00:18:45.287317 update_engine[1492]: E20250819 00:18:45.287240 1492 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 19 00:18:45.287477 update_engine[1492]: I20250819 00:18:45.287420 1492 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Aug 19 00:18:50.370585 containerd[1513]: time="2025-08-19T00:18:50.370084239Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\" id:\"c320809b4194f8ada59b74bb76bebf599326cbc3472895f35bfb00ffec6de1dd\" pid:5274 exited_at:{seconds:1755562730 nanos:369023928}" Aug 19 00:18:55.286399 update_engine[1492]: I20250819 00:18:55.286319 1492 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 19 00:18:55.286768 update_engine[1492]: I20250819 00:18:55.286613 1492 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 19 00:18:55.286966 update_engine[1492]: I20250819 00:18:55.286927 1492 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Aug 19 00:18:55.287454 update_engine[1492]: E20250819 00:18:55.287407 1492 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 19 00:18:55.287509 update_engine[1492]: I20250819 00:18:55.287497 1492 libcurl_http_fetcher.cc:283] No HTTP response, retry 3 Aug 19 00:18:57.783987 containerd[1513]: time="2025-08-19T00:18:57.783834984Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\" id:\"d03981394c01ceaee92be1ad4c8bbcec016492d0bbf16da2c17c6adc9cfb43d9\" pid:5302 exited_at:{seconds:1755562737 nanos:783426867}" Aug 19 00:19:00.687863 containerd[1513]: time="2025-08-19T00:19:00.687795517Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\" id:\"c1ffe2dc319ece377fbdbe56de0dfeb1ca30c9a5df9c430abd93496477a24318\" pid:5328 exited_at:{seconds:1755562740 nanos:687500119}" Aug 19 00:19:01.394274 containerd[1513]: time="2025-08-19T00:19:01.394037608Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\" id:\"926781019b144ac1c06cb01b04f01560363201e271020d1b1c6c5bb6947400c9\" pid:5350 exited_at:{seconds:1755562741 nanos:393645410}" Aug 19 00:19:05.287258 update_engine[1492]: I20250819 00:19:05.286262 1492 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 19 00:19:05.287258 update_engine[1492]: I20250819 00:19:05.286597 1492 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 19 00:19:05.287258 update_engine[1492]: I20250819 00:19:05.286947 1492 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Aug 19 00:19:05.288160 update_engine[1492]: E20250819 00:19:05.288087 1492 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 19 00:19:05.288307 update_engine[1492]: I20250819 00:19:05.288275 1492 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Aug 19 00:19:05.288307 update_engine[1492]: I20250819 00:19:05.288300 1492 omaha_request_action.cc:617] Omaha request response: Aug 19 00:19:05.288424 update_engine[1492]: E20250819 00:19:05.288401 1492 omaha_request_action.cc:636] Omaha request network transfer failed. 
Aug 19 00:19:05.288498 update_engine[1492]: I20250819 00:19:05.288429 1492 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing. Aug 19 00:19:05.288498 update_engine[1492]: I20250819 00:19:05.288436 1492 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Aug 19 00:19:05.288498 update_engine[1492]: I20250819 00:19:05.288441 1492 update_attempter.cc:306] Processing Done. Aug 19 00:19:05.288498 update_engine[1492]: E20250819 00:19:05.288460 1492 update_attempter.cc:619] Update failed. Aug 19 00:19:05.288498 update_engine[1492]: I20250819 00:19:05.288465 1492 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse Aug 19 00:19:05.288498 update_engine[1492]: I20250819 00:19:05.288470 1492 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse) Aug 19 00:19:05.288498 update_engine[1492]: I20250819 00:19:05.288476 1492 payload_state.cc:103] Ignoring failures until we get a valid Omaha response. Aug 19 00:19:05.288701 update_engine[1492]: I20250819 00:19:05.288599 1492 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Aug 19 00:19:05.288701 update_engine[1492]: I20250819 00:19:05.288632 1492 omaha_request_action.cc:271] Posting an Omaha request to disabled Aug 19 00:19:05.288701 update_engine[1492]: I20250819 00:19:05.288640 1492 omaha_request_action.cc:272] Request: Aug 19 00:19:05.288701 update_engine[1492]: Aug 19 00:19:05.288701 update_engine[1492]: Aug 19 00:19:05.288701 update_engine[1492]: Aug 19 00:19:05.288701 update_engine[1492]: Aug 19 00:19:05.288701 update_engine[1492]: Aug 19 00:19:05.288701 update_engine[1492]: Aug 19 00:19:05.288701 update_engine[1492]: I20250819 00:19:05.288646 1492 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Aug 19 00:19:05.288971 update_engine[1492]: I20250819 00:19:05.288857 1492 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Aug 19 00:19:05.289485 update_engine[1492]: I20250819 00:19:05.289125 1492 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Aug 19 00:19:05.289607 update_engine[1492]: E20250819 00:19:05.289558 1492 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled Aug 19 00:19:05.289637 update_engine[1492]: I20250819 00:19:05.289626 1492 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded Aug 19 00:19:05.289675 update_engine[1492]: I20250819 00:19:05.289635 1492 omaha_request_action.cc:617] Omaha request response: Aug 19 00:19:05.289675 update_engine[1492]: I20250819 00:19:05.289642 1492 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Aug 19 00:19:05.289675 update_engine[1492]: I20250819 00:19:05.289647 1492 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction Aug 19 00:19:05.289675 update_engine[1492]: I20250819 00:19:05.289652 1492 update_attempter.cc:306] Processing Done. Aug 19 00:19:05.289675 update_engine[1492]: I20250819 00:19:05.289659 1492 update_attempter.cc:310] Error event sent. 
Aug 19 00:19:05.289675 update_engine[1492]: I20250819 00:19:05.289667 1492 update_check_scheduler.cc:74] Next update check in 49m57s Aug 19 00:19:05.290521 locksmithd[1528]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0 Aug 19 00:19:05.290521 locksmithd[1528]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0 Aug 19 00:19:13.563590 containerd[1513]: time="2025-08-19T00:19:13.563543328Z" level=info msg="TaskExit event in podsandbox handler container_id:\"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\" id:\"a875cd95e799961c5a6220b896b233283e747c7a2731ec9ed9bd98699234df26\" pid:5375 exited_at:{seconds:1755562753 nanos:562497772}" Aug 19 00:19:27.752686 containerd[1513]: time="2025-08-19T00:19:27.752360311Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\" id:\"5e2dc8a3c4e1fd510a3276f96738d8dc420bf00fe592fbf7180fc619a3cb1af9\" pid:5401 exited_at:{seconds:1755562767 nanos:752003832}" Aug 19 00:19:30.670315 containerd[1513]: time="2025-08-19T00:19:30.670087136Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\" id:\"51b8e61e229160033513b970e0585fa22c4cb3e9e1f3518a902707e9a04c5836\" pid:5426 exited_at:{seconds:1755562770 nanos:667650582}" Aug 19 00:19:43.560749 containerd[1513]: time="2025-08-19T00:19:43.560697819Z" level=info msg="TaskExit event in podsandbox handler container_id:\"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\" id:\"aa6d28bf50be3a319ca2702763d8250085d1f47c5f48513f76f87f50ccd7e35f\" pid:5461 exited_at:{seconds:1755562783 nanos:560073780}" Aug 19 00:19:50.355890 containerd[1513]: time="2025-08-19T00:19:50.355812209Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\" id:\"34e8aa4a6d7b69ee85b35e5d0fba44088b8a122a6bafcbb602040b77dde7dcb6\" pid:5507 exited_at:{seconds:1755562790 nanos:355165050}" Aug 19 00:19:57.681448 containerd[1513]: time="2025-08-19T00:19:57.681385943Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\" id:\"abeb741a5082f94a7b0a72bc9a63c0d7cc60e7068fd5a96be05e7573940d1f05\" pid:5529 exited_at:{seconds:1755562797 nanos:680772261}" Aug 19 00:20:00.656405 containerd[1513]: time="2025-08-19T00:20:00.656346221Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\" id:\"030dcd7ccf3f6defafd3e891541530ecb0b318da7197fc14e0728eef8d4a23de\" pid:5553 exited_at:{seconds:1755562800 nanos:655899740}" Aug 19 00:20:01.158980 containerd[1513]: time="2025-08-19T00:20:01.158919208Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\" id:\"6b347a7a85346888065223285c916d4ff3a48e938fead72463f51169c7623141\" pid:5574 exited_at:{seconds:1755562801 nanos:158548527}" Aug 19 00:20:09.509530 systemd[1]: Started sshd@8-91.99.87.156:22-139.178.89.65:42072.service - OpenSSH per-connection server daemon (139.178.89.65:42072). 
Aug 19 00:20:10.530070 sshd[5590]: Accepted publickey for core from 139.178.89.65 port 42072 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:10.532893 sshd-session[5590]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:10.538592 systemd-logind[1491]: New session 8 of user core.
Aug 19 00:20:10.547679 systemd[1]: Started session-8.scope - Session 8 of User core.
Aug 19 00:20:11.319831 sshd[5593]: Connection closed by 139.178.89.65 port 42072
Aug 19 00:20:11.320292 sshd-session[5590]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:11.326843 systemd[1]: sshd@8-91.99.87.156:22-139.178.89.65:42072.service: Deactivated successfully.
Aug 19 00:20:11.331003 systemd[1]: session-8.scope: Deactivated successfully.
Aug 19 00:20:11.333097 systemd-logind[1491]: Session 8 logged out. Waiting for processes to exit.
Aug 19 00:20:11.336074 systemd-logind[1491]: Removed session 8.
Aug 19 00:20:13.598917 containerd[1513]: time="2025-08-19T00:20:13.598869749Z" level=info msg="TaskExit event in podsandbox handler container_id:\"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\" id:\"6303cec09d240ed095d8051518b4210fe8a851f627f8e722a6314929ead52c35\" pid:5618 exited_at:{seconds:1755562813 nanos:596885064}"
Aug 19 00:20:16.493491 systemd[1]: Started sshd@9-91.99.87.156:22-139.178.89.65:42078.service - OpenSSH per-connection server daemon (139.178.89.65:42078).
Aug 19 00:20:17.508217 sshd[5630]: Accepted publickey for core from 139.178.89.65 port 42078 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:17.510200 sshd-session[5630]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:17.519108 systemd-logind[1491]: New session 9 of user core.
Aug 19 00:20:17.525944 systemd[1]: Started session-9.scope - Session 9 of User core.
Aug 19 00:20:18.322991 sshd[5633]: Connection closed by 139.178.89.65 port 42078
Aug 19 00:20:18.323402 sshd-session[5630]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:18.333786 systemd[1]: sshd@9-91.99.87.156:22-139.178.89.65:42078.service: Deactivated successfully.
Aug 19 00:20:18.336819 systemd[1]: session-9.scope: Deactivated successfully.
Aug 19 00:20:18.340820 systemd-logind[1491]: Session 9 logged out. Waiting for processes to exit.
Aug 19 00:20:18.342585 systemd-logind[1491]: Removed session 9.
Aug 19 00:20:23.498294 systemd[1]: Started sshd@10-91.99.87.156:22-139.178.89.65:33790.service - OpenSSH per-connection server daemon (139.178.89.65:33790).
Aug 19 00:20:24.510198 sshd[5647]: Accepted publickey for core from 139.178.89.65 port 33790 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:24.512022 sshd-session[5647]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:24.520294 systemd-logind[1491]: New session 10 of user core.
Aug 19 00:20:24.526455 systemd[1]: Started session-10.scope - Session 10 of User core.
Aug 19 00:20:25.282623 sshd[5650]: Connection closed by 139.178.89.65 port 33790
Aug 19 00:20:25.284049 sshd-session[5647]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:25.289815 systemd[1]: sshd@10-91.99.87.156:22-139.178.89.65:33790.service: Deactivated successfully.
Aug 19 00:20:25.294713 systemd[1]: session-10.scope: Deactivated successfully.
Aug 19 00:20:25.296209 systemd-logind[1491]: Session 10 logged out. Waiting for processes to exit.
Aug 19 00:20:25.299599 systemd-logind[1491]: Removed session 10.
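Editorial note (not part of the log): each SSH connection above follows the same lifecycle: a socket-activated per-connection sshd unit starts, the public key is accepted, pam_unix opens the session, systemd-logind creates a numbered session and its session-N.scope, and on disconnect the unit and scope are deactivated and the session removed. The sketch below pairs logind's "New session N" and "Removed session N" entries to compute how long each session lasted; it is illustrative only, and the journal timestamp format is assumed from this excerpt.

```python
# Minimal sketch (not part of the log): pair systemd-logind "New session" /
# "Removed session" entries from the excerpt above and report session lengths.
# The timestamp format ("Aug 19 00:20:10.530070") is assumed from this log.
import re
from datetime import datetime

LINE_RE = re.compile(
    r"^(\w{3} +\d+ [\d:.]+) systemd-logind\[\d+\]: (New session (\d+)|Removed session (\d+))"
)

def session_durations(journal_text: str, year: int = 2025) -> dict[int, float]:
    """Return seconds between 'New session N' and 'Removed session N' per session id."""
    opened: dict[int, datetime] = {}
    durations: dict[int, float] = {}
    for line in journal_text.splitlines():
        m = LINE_RE.match(line)
        if not m:
            continue
        ts = datetime.strptime(f"{year} {m.group(1)}", "%Y %b %d %H:%M:%S.%f")
        if m.group(3):                      # "New session N"
            opened[int(m.group(3))] = ts
        elif m.group(4):                    # "Removed session N"
            sid = int(m.group(4))
            if sid in opened:
                durations[sid] = (ts - opened[sid]).total_seconds()
    return durations

# For session 8 above: 00:20:10.538592 -> 00:20:11.336074, i.e. roughly 0.8 seconds.
```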
Aug 19 00:20:25.476597 systemd[1]: Started sshd@11-91.99.87.156:22-139.178.89.65:33800.service - OpenSSH per-connection server daemon (139.178.89.65:33800).
Aug 19 00:20:26.535602 sshd[5663]: Accepted publickey for core from 139.178.89.65 port 33800 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:26.537784 sshd-session[5663]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:26.544255 systemd-logind[1491]: New session 11 of user core.
Aug 19 00:20:26.552795 systemd[1]: Started session-11.scope - Session 11 of User core.
Aug 19 00:20:27.386156 sshd[5666]: Connection closed by 139.178.89.65 port 33800
Aug 19 00:20:27.386637 sshd-session[5663]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:27.394879 systemd[1]: sshd@11-91.99.87.156:22-139.178.89.65:33800.service: Deactivated successfully.
Aug 19 00:20:27.398710 systemd[1]: session-11.scope: Deactivated successfully.
Aug 19 00:20:27.400248 systemd-logind[1491]: Session 11 logged out. Waiting for processes to exit.
Aug 19 00:20:27.402777 systemd-logind[1491]: Removed session 11.
Aug 19 00:20:27.551164 systemd[1]: Started sshd@12-91.99.87.156:22-139.178.89.65:33806.service - OpenSSH per-connection server daemon (139.178.89.65:33806).
Aug 19 00:20:27.698754 containerd[1513]: time="2025-08-19T00:20:27.698543271Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4dbb8e1de5ab82508ebfd03ab85f6180351d1457ac37e4fe54e64528626664a3\" id:\"c31171025e04469f84d4e90f95bda05997ca2cd417e499049a07dafce5dbeca2\" pid:5693 exited_at:{seconds:1755562827 nanos:698086790}"
Aug 19 00:20:28.554438 sshd[5675]: Accepted publickey for core from 139.178.89.65 port 33806 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:28.557439 sshd-session[5675]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:28.562705 systemd-logind[1491]: New session 12 of user core.
Aug 19 00:20:28.578482 systemd[1]: Started session-12.scope - Session 12 of User core.
Aug 19 00:20:29.389230 sshd[5707]: Connection closed by 139.178.89.65 port 33806
Aug 19 00:20:29.387955 sshd-session[5675]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:29.394444 systemd[1]: sshd@12-91.99.87.156:22-139.178.89.65:33806.service: Deactivated successfully.
Aug 19 00:20:29.394500 systemd-logind[1491]: Session 12 logged out. Waiting for processes to exit.
Aug 19 00:20:29.398874 systemd[1]: session-12.scope: Deactivated successfully.
Aug 19 00:20:29.402945 systemd-logind[1491]: Removed session 12.
Aug 19 00:20:30.649433 containerd[1513]: time="2025-08-19T00:20:30.649357376Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\" id:\"dbbfd16fad678eb76d7d47e0f2906381f255ba534304240cc2ada69a0a09f50c\" pid:5732 exited_at:{seconds:1755562830 nanos:648705015}"
Aug 19 00:20:34.571887 systemd[1]: Started sshd@13-91.99.87.156:22-139.178.89.65:57618.service - OpenSSH per-connection server daemon (139.178.89.65:57618).
Aug 19 00:20:35.633260 sshd[5742]: Accepted publickey for core from 139.178.89.65 port 57618 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:35.635231 sshd-session[5742]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:35.643860 systemd-logind[1491]: New session 13 of user core.
Aug 19 00:20:35.651586 systemd[1]: Started session-13.scope - Session 13 of User core.
Aug 19 00:20:36.441936 sshd[5747]: Connection closed by 139.178.89.65 port 57618
Aug 19 00:20:36.442885 sshd-session[5742]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:36.450178 systemd[1]: sshd@13-91.99.87.156:22-139.178.89.65:57618.service: Deactivated successfully.
Aug 19 00:20:36.452779 systemd[1]: session-13.scope: Deactivated successfully.
Aug 19 00:20:36.454614 systemd-logind[1491]: Session 13 logged out. Waiting for processes to exit.
Aug 19 00:20:36.456304 systemd-logind[1491]: Removed session 13.
Aug 19 00:20:36.614561 systemd[1]: Started sshd@14-91.99.87.156:22-139.178.89.65:57634.service - OpenSSH per-connection server daemon (139.178.89.65:57634).
Aug 19 00:20:37.617959 sshd[5759]: Accepted publickey for core from 139.178.89.65 port 57634 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:37.620420 sshd-session[5759]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:37.626292 systemd-logind[1491]: New session 14 of user core.
Aug 19 00:20:37.632402 systemd[1]: Started session-14.scope - Session 14 of User core.
Aug 19 00:20:38.553506 sshd[5762]: Connection closed by 139.178.89.65 port 57634
Aug 19 00:20:38.554484 sshd-session[5759]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:38.560971 systemd[1]: sshd@14-91.99.87.156:22-139.178.89.65:57634.service: Deactivated successfully.
Aug 19 00:20:38.564281 systemd[1]: session-14.scope: Deactivated successfully.
Aug 19 00:20:38.568520 systemd-logind[1491]: Session 14 logged out. Waiting for processes to exit.
Aug 19 00:20:38.570625 systemd-logind[1491]: Removed session 14.
Aug 19 00:20:38.728278 systemd[1]: Started sshd@15-91.99.87.156:22-139.178.89.65:57644.service - OpenSSH per-connection server daemon (139.178.89.65:57644).
Aug 19 00:20:39.729185 sshd[5773]: Accepted publickey for core from 139.178.89.65 port 57644 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:39.732492 sshd-session[5773]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:39.738639 systemd-logind[1491]: New session 15 of user core.
Aug 19 00:20:39.747709 systemd[1]: Started session-15.scope - Session 15 of User core.
Aug 19 00:20:41.135750 sshd[5776]: Connection closed by 139.178.89.65 port 57644
Aug 19 00:20:41.135540 sshd-session[5773]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:41.144093 systemd[1]: sshd@15-91.99.87.156:22-139.178.89.65:57644.service: Deactivated successfully.
Aug 19 00:20:41.150655 systemd[1]: session-15.scope: Deactivated successfully.
Aug 19 00:20:41.155221 systemd-logind[1491]: Session 15 logged out. Waiting for processes to exit.
Aug 19 00:20:41.160327 systemd-logind[1491]: Removed session 15.
Aug 19 00:20:41.311587 systemd[1]: Started sshd@16-91.99.87.156:22-139.178.89.65:49426.service - OpenSSH per-connection server daemon (139.178.89.65:49426).
Aug 19 00:20:42.340720 sshd[5796]: Accepted publickey for core from 139.178.89.65 port 49426 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:42.343497 sshd-session[5796]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:42.350585 systemd-logind[1491]: New session 16 of user core.
Aug 19 00:20:42.357677 systemd[1]: Started session-16.scope - Session 16 of User core.
Aug 19 00:20:43.251678 sshd[5799]: Connection closed by 139.178.89.65 port 49426
Aug 19 00:20:43.252256 sshd-session[5796]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:43.257196 systemd[1]: sshd@16-91.99.87.156:22-139.178.89.65:49426.service: Deactivated successfully.
Aug 19 00:20:43.260260 systemd[1]: session-16.scope: Deactivated successfully.
Aug 19 00:20:43.261709 systemd-logind[1491]: Session 16 logged out. Waiting for processes to exit.
Aug 19 00:20:43.264884 systemd-logind[1491]: Removed session 16.
Aug 19 00:20:43.438429 systemd[1]: Started sshd@17-91.99.87.156:22-139.178.89.65:49438.service - OpenSSH per-connection server daemon (139.178.89.65:49438).
Aug 19 00:20:43.567974 containerd[1513]: time="2025-08-19T00:20:43.567821023Z" level=info msg="TaskExit event in podsandbox handler container_id:\"62da1006f11dc77598b997ffb1b2e41c9cdc25751502111960f505cb8c63dfa4\" id:\"a9668bb2ce429f2ade4a25303a0d0c997988422cdf9ebf642375ed526ee90cad\" pid:5823 exited_at:{seconds:1755562843 nanos:565883499}"
Aug 19 00:20:44.453231 sshd[5808]: Accepted publickey for core from 139.178.89.65 port 49438 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:44.454293 sshd-session[5808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:44.462791 systemd-logind[1491]: New session 17 of user core.
Aug 19 00:20:44.471391 systemd[1]: Started session-17.scope - Session 17 of User core.
Aug 19 00:20:45.256155 sshd[5835]: Connection closed by 139.178.89.65 port 49438
Aug 19 00:20:45.256987 sshd-session[5808]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:45.266966 systemd[1]: sshd@17-91.99.87.156:22-139.178.89.65:49438.service: Deactivated successfully.
Aug 19 00:20:45.272093 systemd[1]: session-17.scope: Deactivated successfully.
Aug 19 00:20:45.275734 systemd-logind[1491]: Session 17 logged out. Waiting for processes to exit.
Aug 19 00:20:45.279729 systemd-logind[1491]: Removed session 17.
Aug 19 00:20:50.356725 containerd[1513]: time="2025-08-19T00:20:50.356573687Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ed47c1116b8e0392a4ae42297e9cf349b0965b1c21e94ba183387e9dd2cd42b6\" id:\"55225ea798b3a4d1819db94109f5031bdfc4f8762c52a95a480efbd00fb4d4d9\" pid:5862 exited_at:{seconds:1755562850 nanos:356010366}"
Aug 19 00:20:50.456722 systemd[1]: Started sshd@18-91.99.87.156:22-139.178.89.65:34830.service - OpenSSH per-connection server daemon (139.178.89.65:34830).
Aug 19 00:20:51.561684 sshd[5873]: Accepted publickey for core from 139.178.89.65 port 34830 ssh2: RSA SHA256:S/7Q4Z3mW24mSStjTkpIs5+u/ATtNepl8P4sEFUX7v4
Aug 19 00:20:51.564191 sshd-session[5873]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Aug 19 00:20:51.571798 systemd-logind[1491]: New session 18 of user core.
Aug 19 00:20:51.576409 systemd[1]: Started session-18.scope - Session 18 of User core.
Aug 19 00:20:52.397939 sshd[5876]: Connection closed by 139.178.89.65 port 34830
Aug 19 00:20:52.397736 sshd-session[5873]: pam_unix(sshd:session): session closed for user core
Aug 19 00:20:52.404045 systemd[1]: sshd@18-91.99.87.156:22-139.178.89.65:34830.service: Deactivated successfully.
Aug 19 00:20:52.409894 systemd[1]: session-18.scope: Deactivated successfully.
Aug 19 00:20:52.412116 systemd-logind[1491]: Session 18 logged out. Waiting for processes to exit.
Aug 19 00:20:52.416573 systemd-logind[1491]: Removed session 18.
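Editorial note (not part of the log): the recurring containerd "TaskExit event in podsandbox handler" messages throughout this section reference the same few container IDs at roughly regular intervals, with a fresh exec id and pid each time. That pattern is consistent with short-lived execs running periodically inside long-lived pod sandboxes (for example health-check execs), though the log itself does not say what the execs do. The sketch below groups these events by container_id; the field layout is read off the excerpt, and the parsing is ad-hoc text matching, not containerd's event API.

```python
# Minimal sketch (not part of the log): group the containerd "TaskExit event in
# podsandbox handler" messages above by container_id and list their exit times.
import re
from collections import defaultdict
from datetime import datetime, timezone

TASKEXIT_RE = re.compile(
    r'container_id:\\?"(?P<cid>[0-9a-f]+)\\?".*?exited_at:\{seconds:(?P<secs>\d+)'
)

def group_task_exits(journal_text: str) -> dict[str, list[str]]:
    """Map each container_id to the UTC times at which an exec task exited."""
    exits: dict[str, list[str]] = defaultdict(list)
    for m in TASKEXIT_RE.finditer(journal_text):
        ts = datetime.fromtimestamp(int(m.group("secs")), tz=timezone.utc)
        exits[m.group("cid")].append(ts.strftime("%H:%M:%SZ"))
    return dict(exits)

# For the excerpt above, container 62da1006... shows exits at about 00:19:13,
# 00:19:43, 00:20:13 and 00:20:43 UTC, i.e. one short-lived exec roughly every 30 s.
```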