May 16 16:41:39.831940 kernel: Linux version 6.12.20-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Fri May 16 14:52:24 -00 2025 May 16 16:41:39.831962 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=e3be1f8a550c199f4f838f30cb661b44d98bde818b7f263cba125cc457a9c137 May 16 16:41:39.831972 kernel: BIOS-provided physical RAM map: May 16 16:41:39.831978 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable May 16 16:41:39.831984 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved May 16 16:41:39.831991 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved May 16 16:41:39.831998 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000009cfdbfff] usable May 16 16:41:39.832007 kernel: BIOS-e820: [mem 0x000000009cfdc000-0x000000009cffffff] reserved May 16 16:41:39.832013 kernel: BIOS-e820: [mem 0x00000000b0000000-0x00000000bfffffff] reserved May 16 16:41:39.832020 kernel: BIOS-e820: [mem 0x00000000fed1c000-0x00000000fed1ffff] reserved May 16 16:41:39.832028 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved May 16 16:41:39.832037 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved May 16 16:41:39.832045 kernel: BIOS-e820: [mem 0x000000fd00000000-0x000000ffffffffff] reserved May 16 16:41:39.832053 kernel: NX (Execute Disable) protection: active May 16 16:41:39.832064 kernel: APIC: Static calls initialized May 16 16:41:39.832071 kernel: SMBIOS 2.8 present. 
May 16 16:41:39.832078 kernel: DMI: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.16.2-debian-1.16.2-1 04/01/2014 May 16 16:41:39.832085 kernel: DMI: Memory slots populated: 1/1 May 16 16:41:39.832092 kernel: Hypervisor detected: KVM May 16 16:41:39.832099 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00 May 16 16:41:39.832106 kernel: kvm-clock: using sched offset of 3256126866 cycles May 16 16:41:39.832113 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns May 16 16:41:39.832120 kernel: tsc: Detected 2794.746 MHz processor May 16 16:41:39.832127 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved May 16 16:41:39.832137 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable May 16 16:41:39.832144 kernel: last_pfn = 0x9cfdc max_arch_pfn = 0x400000000 May 16 16:41:39.832152 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs May 16 16:41:39.832169 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT May 16 16:41:39.832176 kernel: Using GB pages for direct mapping May 16 16:41:39.832191 kernel: ACPI: Early table checksum verification disabled May 16 16:41:39.832206 kernel: ACPI: RSDP 0x00000000000F59D0 000014 (v00 BOCHS ) May 16 16:41:39.832220 kernel: ACPI: RSDT 0x000000009CFE2408 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 16 16:41:39.832230 kernel: ACPI: FACP 0x000000009CFE21E8 0000F4 (v03 BOCHS BXPC 00000001 BXPC 00000001) May 16 16:41:39.832237 kernel: ACPI: DSDT 0x000000009CFE0040 0021A8 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 16 16:41:39.832244 kernel: ACPI: FACS 0x000000009CFE0000 000040 May 16 16:41:39.832251 kernel: ACPI: APIC 0x000000009CFE22DC 000090 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 16 16:41:39.832258 kernel: ACPI: HPET 0x000000009CFE236C 000038 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 16 16:41:39.832265 kernel: ACPI: MCFG 0x000000009CFE23A4 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) May 16 
16:41:39.832277 kernel: ACPI: WAET 0x000000009CFE23E0 000028 (v01 BOCHS BXPC 00000001 BXPC 00000001) May 16 16:41:39.832284 kernel: ACPI: Reserving FACP table memory at [mem 0x9cfe21e8-0x9cfe22db] May 16 16:41:39.832297 kernel: ACPI: Reserving DSDT table memory at [mem 0x9cfe0040-0x9cfe21e7] May 16 16:41:39.832304 kernel: ACPI: Reserving FACS table memory at [mem 0x9cfe0000-0x9cfe003f] May 16 16:41:39.832311 kernel: ACPI: Reserving APIC table memory at [mem 0x9cfe22dc-0x9cfe236b] May 16 16:41:39.832319 kernel: ACPI: Reserving HPET table memory at [mem 0x9cfe236c-0x9cfe23a3] May 16 16:41:39.832326 kernel: ACPI: Reserving MCFG table memory at [mem 0x9cfe23a4-0x9cfe23df] May 16 16:41:39.832333 kernel: ACPI: Reserving WAET table memory at [mem 0x9cfe23e0-0x9cfe2407] May 16 16:41:39.832342 kernel: No NUMA configuration found May 16 16:41:39.832350 kernel: Faking a node at [mem 0x0000000000000000-0x000000009cfdbfff] May 16 16:41:39.832357 kernel: NODE_DATA(0) allocated [mem 0x9cfd4dc0-0x9cfdbfff] May 16 16:41:39.832364 kernel: Zone ranges: May 16 16:41:39.832372 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff] May 16 16:41:39.832379 kernel: DMA32 [mem 0x0000000001000000-0x000000009cfdbfff] May 16 16:41:39.832386 kernel: Normal empty May 16 16:41:39.832393 kernel: Device empty May 16 16:41:39.832401 kernel: Movable zone start for each node May 16 16:41:39.832408 kernel: Early memory node ranges May 16 16:41:39.832417 kernel: node 0: [mem 0x0000000000001000-0x000000000009efff] May 16 16:41:39.832425 kernel: node 0: [mem 0x0000000000100000-0x000000009cfdbfff] May 16 16:41:39.832432 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000009cfdbfff] May 16 16:41:39.832439 kernel: On node 0, zone DMA: 1 pages in unavailable ranges May 16 16:41:39.832446 kernel: On node 0, zone DMA: 97 pages in unavailable ranges May 16 16:41:39.832454 kernel: On node 0, zone DMA32: 12324 pages in unavailable ranges May 16 16:41:39.832461 kernel: ACPI: PM-Timer IO Port: 0x608 May 
16 16:41:39.832468 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) May 16 16:41:39.832476 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 May 16 16:41:39.832485 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) May 16 16:41:39.832493 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) May 16 16:41:39.832500 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) May 16 16:41:39.832507 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) May 16 16:41:39.832515 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) May 16 16:41:39.832522 kernel: ACPI: Using ACPI (MADT) for SMP configuration information May 16 16:41:39.832529 kernel: ACPI: HPET id: 0x8086a201 base: 0xfed00000 May 16 16:41:39.832537 kernel: TSC deadline timer available May 16 16:41:39.832544 kernel: CPU topo: Max. logical packages: 1 May 16 16:41:39.832553 kernel: CPU topo: Max. logical dies: 1 May 16 16:41:39.832561 kernel: CPU topo: Max. dies per package: 1 May 16 16:41:39.832589 kernel: CPU topo: Max. threads per core: 1 May 16 16:41:39.832596 kernel: CPU topo: Num. cores per package: 4 May 16 16:41:39.832603 kernel: CPU topo: Num. 
threads per package: 4 May 16 16:41:39.832610 kernel: CPU topo: Allowing 4 present CPUs plus 0 hotplug CPUs May 16 16:41:39.832618 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write() May 16 16:41:39.832625 kernel: kvm-guest: KVM setup pv remote TLB flush May 16 16:41:39.832632 kernel: kvm-guest: setup PV sched yield May 16 16:41:39.832639 kernel: [mem 0xc0000000-0xfed1bfff] available for PCI devices May 16 16:41:39.832650 kernel: Booting paravirtualized kernel on KVM May 16 16:41:39.832657 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns May 16 16:41:39.832665 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:4 nr_cpu_ids:4 nr_node_ids:1 May 16 16:41:39.832672 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u524288 May 16 16:41:39.832679 kernel: pcpu-alloc: s207832 r8192 d29736 u524288 alloc=1*2097152 May 16 16:41:39.832686 kernel: pcpu-alloc: [0] 0 1 2 3 May 16 16:41:39.832694 kernel: kvm-guest: PV spinlocks enabled May 16 16:41:39.832701 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear) May 16 16:41:39.832710 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=e3be1f8a550c199f4f838f30cb661b44d98bde818b7f263cba125cc457a9c137 May 16 16:41:39.832720 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space. 
May 16 16:41:39.832727 kernel: random: crng init done May 16 16:41:39.832734 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) May 16 16:41:39.832742 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) May 16 16:41:39.832749 kernel: Fallback order for Node 0: 0 May 16 16:41:39.832757 kernel: Built 1 zonelists, mobility grouping on. Total pages: 642938 May 16 16:41:39.832764 kernel: Policy zone: DMA32 May 16 16:41:39.832771 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off May 16 16:41:39.832781 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1 May 16 16:41:39.832788 kernel: ftrace: allocating 40065 entries in 157 pages May 16 16:41:39.832795 kernel: ftrace: allocated 157 pages with 5 groups May 16 16:41:39.832803 kernel: Dynamic Preempt: voluntary May 16 16:41:39.832810 kernel: rcu: Preemptible hierarchical RCU implementation. May 16 16:41:39.832818 kernel: rcu: RCU event tracing is enabled. May 16 16:41:39.832825 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4. May 16 16:41:39.832833 kernel: Trampoline variant of Tasks RCU enabled. May 16 16:41:39.832840 kernel: Rude variant of Tasks RCU enabled. May 16 16:41:39.832850 kernel: Tracing variant of Tasks RCU enabled. May 16 16:41:39.832857 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. May 16 16:41:39.832864 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4 May 16 16:41:39.832872 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. May 16 16:41:39.832879 kernel: RCU Tasks Rude: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. May 16 16:41:39.832887 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4. 
May 16 16:41:39.832894 kernel: NR_IRQS: 33024, nr_irqs: 456, preallocated irqs: 16 May 16 16:41:39.832901 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. May 16 16:41:39.832918 kernel: Console: colour VGA+ 80x25 May 16 16:41:39.832926 kernel: printk: legacy console [ttyS0] enabled May 16 16:41:39.832933 kernel: ACPI: Core revision 20240827 May 16 16:41:39.832941 kernel: clocksource: hpet: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 19112604467 ns May 16 16:41:39.832950 kernel: APIC: Switch to symmetric I/O mode setup May 16 16:41:39.832958 kernel: x2apic enabled May 16 16:41:39.832966 kernel: APIC: Switched APIC routing to: physical x2apic May 16 16:41:39.832973 kernel: kvm-guest: APIC: send_IPI_mask() replaced with kvm_send_ipi_mask() May 16 16:41:39.832982 kernel: kvm-guest: APIC: send_IPI_mask_allbutself() replaced with kvm_send_ipi_mask_allbutself() May 16 16:41:39.832991 kernel: kvm-guest: setup PV IPIs May 16 16:41:39.832999 kernel: ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 May 16 16:41:39.833006 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x2848ddd4e75, max_idle_ns: 440795346320 ns May 16 16:41:39.833014 kernel: Calibrating delay loop (skipped) preset value.. 
5589.49 BogoMIPS (lpj=2794746) May 16 16:41:39.833022 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated May 16 16:41:39.833030 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127 May 16 16:41:39.833037 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0 May 16 16:41:39.833045 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization May 16 16:41:39.833053 kernel: Spectre V2 : Mitigation: Retpolines May 16 16:41:39.833062 kernel: Spectre V2 : Spectre v2 / SpectreRSB mitigation: Filling RSB on context switch May 16 16:41:39.833070 kernel: Spectre V2 : Spectre v2 / SpectreRSB : Filling RSB on VMEXIT May 16 16:41:39.833086 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls May 16 16:41:39.833094 kernel: RETBleed: Mitigation: untrained return thunk May 16 16:41:39.833110 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier May 16 16:41:39.833118 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl May 16 16:41:39.833125 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied! May 16 16:41:39.833134 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options. May 16 16:41:39.833144 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode May 16 16:41:39.833152 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers' May 16 16:41:39.833163 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers' May 16 16:41:39.833171 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers' May 16 16:41:39.833179 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256 May 16 16:41:39.833186 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format. 
May 16 16:41:39.833194 kernel: Freeing SMP alternatives memory: 32K May 16 16:41:39.833201 kernel: pid_max: default: 32768 minimum: 301 May 16 16:41:39.833209 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima May 16 16:41:39.833225 kernel: landlock: Up and running. May 16 16:41:39.833233 kernel: SELinux: Initializing. May 16 16:41:39.833241 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 16 16:41:39.833249 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) May 16 16:41:39.833257 kernel: smpboot: CPU0: AMD EPYC 7402P 24-Core Processor (family: 0x17, model: 0x31, stepping: 0x0) May 16 16:41:39.833264 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver. May 16 16:41:39.833272 kernel: ... version: 0 May 16 16:41:39.833280 kernel: ... bit width: 48 May 16 16:41:39.833287 kernel: ... generic registers: 6 May 16 16:41:39.833297 kernel: ... value mask: 0000ffffffffffff May 16 16:41:39.833305 kernel: ... max period: 00007fffffffffff May 16 16:41:39.833312 kernel: ... fixed-purpose events: 0 May 16 16:41:39.833320 kernel: ... event mask: 000000000000003f May 16 16:41:39.833327 kernel: signal: max sigframe size: 1776 May 16 16:41:39.833335 kernel: rcu: Hierarchical SRCU implementation. May 16 16:41:39.833343 kernel: rcu: Max phase no-delay instances is 400. May 16 16:41:39.833351 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level May 16 16:41:39.833358 kernel: smp: Bringing up secondary CPUs ... May 16 16:41:39.833368 kernel: smpboot: x86: Booting SMP configuration: May 16 16:41:39.833376 kernel: .... 
node #0, CPUs: #1 #2 #3 May 16 16:41:39.833383 kernel: smp: Brought up 1 node, 4 CPUs May 16 16:41:39.833391 kernel: smpboot: Total of 4 processors activated (22357.96 BogoMIPS) May 16 16:41:39.833399 kernel: Memory: 2428912K/2571752K available (14336K kernel code, 2438K rwdata, 9944K rodata, 54416K init, 2544K bss, 136904K reserved, 0K cma-reserved) May 16 16:41:39.833407 kernel: devtmpfs: initialized May 16 16:41:39.833414 kernel: x86/mm: Memory block size: 128MB May 16 16:41:39.833424 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns May 16 16:41:39.833432 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear) May 16 16:41:39.833444 kernel: pinctrl core: initialized pinctrl subsystem May 16 16:41:39.833452 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family May 16 16:41:39.833460 kernel: audit: initializing netlink subsys (disabled) May 16 16:41:39.833468 kernel: audit: type=2000 audit(1747413696.658:1): state=initialized audit_enabled=0 res=1 May 16 16:41:39.833475 kernel: thermal_sys: Registered thermal governor 'step_wise' May 16 16:41:39.833483 kernel: thermal_sys: Registered thermal governor 'user_space' May 16 16:41:39.833490 kernel: cpuidle: using governor menu May 16 16:41:39.833498 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 May 16 16:41:39.833506 kernel: dca service started, version 1.12.1 May 16 16:41:39.833515 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] (base 0xb0000000) for domain 0000 [bus 00-ff] May 16 16:41:39.833523 kernel: PCI: ECAM [mem 0xb0000000-0xbfffffff] reserved as E820 entry May 16 16:41:39.833531 kernel: PCI: Using configuration type 1 for base access May 16 16:41:39.833538 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible. 
May 16 16:41:39.833546 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages May 16 16:41:39.833554 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page May 16 16:41:39.833561 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages May 16 16:41:39.833582 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page May 16 16:41:39.833589 kernel: ACPI: Added _OSI(Module Device) May 16 16:41:39.833599 kernel: ACPI: Added _OSI(Processor Device) May 16 16:41:39.833607 kernel: ACPI: Added _OSI(3.0 _SCP Extensions) May 16 16:41:39.833614 kernel: ACPI: Added _OSI(Processor Aggregator Device) May 16 16:41:39.833622 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded May 16 16:41:39.833629 kernel: ACPI: Interpreter enabled May 16 16:41:39.833637 kernel: ACPI: PM: (supports S0 S3 S5) May 16 16:41:39.833644 kernel: ACPI: Using IOAPIC for interrupt routing May 16 16:41:39.833652 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug May 16 16:41:39.833660 kernel: PCI: Using E820 reservations for host bridge windows May 16 16:41:39.833669 kernel: ACPI: Enabled 2 GPEs in block 00 to 3F May 16 16:41:39.833677 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) May 16 16:41:39.833842 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] May 16 16:41:39.833960 kernel: acpi PNP0A08:00: _OSC: platform does not support [PCIeHotplug LTR] May 16 16:41:39.834077 kernel: acpi PNP0A08:00: _OSC: OS now controls [PME AER PCIeCapability] May 16 16:41:39.834087 kernel: PCI host bridge to bus 0000:00 May 16 16:41:39.834219 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7 window] May 16 16:41:39.834341 kernel: pci_bus 0000:00: root bus resource [io 0x0d00-0xffff window] May 16 16:41:39.834450 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window] May 16 16:41:39.834556 kernel: pci_bus 0000:00: root bus 
resource [mem 0x9d000000-0xafffffff window] May 16 16:41:39.834697 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window] May 16 16:41:39.834803 kernel: pci_bus 0000:00: root bus resource [mem 0x100000000-0x8ffffffff window] May 16 16:41:39.834906 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] May 16 16:41:39.835050 kernel: pci 0000:00:00.0: [8086:29c0] type 00 class 0x060000 conventional PCI endpoint May 16 16:41:39.835180 kernel: pci 0000:00:01.0: [1234:1111] type 00 class 0x030000 conventional PCI endpoint May 16 16:41:39.835304 kernel: pci 0000:00:01.0: BAR 0 [mem 0xfd000000-0xfdffffff pref] May 16 16:41:39.835417 kernel: pci 0000:00:01.0: BAR 2 [mem 0xfebd0000-0xfebd0fff] May 16 16:41:39.835529 kernel: pci 0000:00:01.0: ROM [mem 0xfebc0000-0xfebcffff pref] May 16 16:41:39.835664 kernel: pci 0000:00:01.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff] May 16 16:41:39.835789 kernel: pci 0000:00:02.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint May 16 16:41:39.835908 kernel: pci 0000:00:02.0: BAR 0 [io 0xc0c0-0xc0df] May 16 16:41:39.836022 kernel: pci 0000:00:02.0: BAR 1 [mem 0xfebd1000-0xfebd1fff] May 16 16:41:39.836135 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfe000000-0xfe003fff 64bit pref] May 16 16:41:39.836265 kernel: pci 0000:00:03.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint May 16 16:41:39.836381 kernel: pci 0000:00:03.0: BAR 0 [io 0xc000-0xc07f] May 16 16:41:39.836494 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfebd2000-0xfebd2fff] May 16 16:41:39.836632 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe004000-0xfe007fff 64bit pref] May 16 16:41:39.836760 kernel: pci 0000:00:04.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint May 16 16:41:39.836874 kernel: pci 0000:00:04.0: BAR 0 [io 0xc0e0-0xc0ff] May 16 16:41:39.836987 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfebd3000-0xfebd3fff] May 16 16:41:39.837099 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe008000-0xfe00bfff 64bit pref] 
May 16 16:41:39.837219 kernel: pci 0000:00:04.0: ROM [mem 0xfeb80000-0xfebbffff pref] May 16 16:41:39.837343 kernel: pci 0000:00:1f.0: [8086:2918] type 00 class 0x060100 conventional PCI endpoint May 16 16:41:39.837456 kernel: pci 0000:00:1f.0: quirk: [io 0x0600-0x067f] claimed by ICH6 ACPI/GPIO/TCO May 16 16:41:39.837597 kernel: pci 0000:00:1f.2: [8086:2922] type 00 class 0x010601 conventional PCI endpoint May 16 16:41:39.837713 kernel: pci 0000:00:1f.2: BAR 4 [io 0xc100-0xc11f] May 16 16:41:39.837826 kernel: pci 0000:00:1f.2: BAR 5 [mem 0xfebd4000-0xfebd4fff] May 16 16:41:39.837947 kernel: pci 0000:00:1f.3: [8086:2930] type 00 class 0x0c0500 conventional PCI endpoint May 16 16:41:39.838061 kernel: pci 0000:00:1f.3: BAR 4 [io 0x0700-0x073f] May 16 16:41:39.838072 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10 May 16 16:41:39.838080 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10 May 16 16:41:39.838091 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11 May 16 16:41:39.838098 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11 May 16 16:41:39.838106 kernel: ACPI: PCI: Interrupt link LNKE configured for IRQ 10 May 16 16:41:39.838114 kernel: ACPI: PCI: Interrupt link LNKF configured for IRQ 10 May 16 16:41:39.838121 kernel: ACPI: PCI: Interrupt link LNKG configured for IRQ 11 May 16 16:41:39.838129 kernel: ACPI: PCI: Interrupt link LNKH configured for IRQ 11 May 16 16:41:39.838136 kernel: ACPI: PCI: Interrupt link GSIA configured for IRQ 16 May 16 16:41:39.838144 kernel: ACPI: PCI: Interrupt link GSIB configured for IRQ 17 May 16 16:41:39.838151 kernel: ACPI: PCI: Interrupt link GSIC configured for IRQ 18 May 16 16:41:39.838161 kernel: ACPI: PCI: Interrupt link GSID configured for IRQ 19 May 16 16:41:39.838168 kernel: ACPI: PCI: Interrupt link GSIE configured for IRQ 20 May 16 16:41:39.838175 kernel: ACPI: PCI: Interrupt link GSIF configured for IRQ 21 May 16 16:41:39.838183 kernel: ACPI: PCI: Interrupt link GSIG 
configured for IRQ 22 May 16 16:41:39.838190 kernel: ACPI: PCI: Interrupt link GSIH configured for IRQ 23 May 16 16:41:39.838198 kernel: iommu: Default domain type: Translated May 16 16:41:39.838205 kernel: iommu: DMA domain TLB invalidation policy: lazy mode May 16 16:41:39.838220 kernel: PCI: Using ACPI for IRQ routing May 16 16:41:39.838228 kernel: PCI: pci_cache_line_size set to 64 bytes May 16 16:41:39.838237 kernel: e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] May 16 16:41:39.838245 kernel: e820: reserve RAM buffer [mem 0x9cfdc000-0x9fffffff] May 16 16:41:39.838362 kernel: pci 0000:00:01.0: vgaarb: setting as boot VGA device May 16 16:41:39.838474 kernel: pci 0000:00:01.0: vgaarb: bridge control possible May 16 16:41:39.838614 kernel: pci 0000:00:01.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none May 16 16:41:39.838626 kernel: vgaarb: loaded May 16 16:41:39.838633 kernel: hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 May 16 16:41:39.838641 kernel: hpet0: 3 comparators, 64-bit 100.000000 MHz counter May 16 16:41:39.838652 kernel: clocksource: Switched to clocksource kvm-clock May 16 16:41:39.838660 kernel: VFS: Disk quotas dquot_6.6.0 May 16 16:41:39.838667 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) May 16 16:41:39.838675 kernel: pnp: PnP ACPI init May 16 16:41:39.838804 kernel: system 00:05: [mem 0xb0000000-0xbfffffff window] has been reserved May 16 16:41:39.838815 kernel: pnp: PnP ACPI: found 6 devices May 16 16:41:39.838823 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns May 16 16:41:39.838831 kernel: NET: Registered PF_INET protocol family May 16 16:41:39.838841 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) May 16 16:41:39.838849 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) May 16 16:41:39.838856 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) May 16 
16:41:39.838864 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) May 16 16:41:39.838872 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) May 16 16:41:39.838879 kernel: TCP: Hash tables configured (established 32768 bind 32768) May 16 16:41:39.838887 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) May 16 16:41:39.838895 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) May 16 16:41:39.838902 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family May 16 16:41:39.838912 kernel: NET: Registered PF_XDP protocol family May 16 16:41:39.839017 kernel: pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7 window] May 16 16:41:39.839121 kernel: pci_bus 0000:00: resource 5 [io 0x0d00-0xffff window] May 16 16:41:39.839234 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window] May 16 16:41:39.839340 kernel: pci_bus 0000:00: resource 7 [mem 0x9d000000-0xafffffff window] May 16 16:41:39.839445 kernel: pci_bus 0000:00: resource 8 [mem 0xc0000000-0xfebfffff window] May 16 16:41:39.839549 kernel: pci_bus 0000:00: resource 9 [mem 0x100000000-0x8ffffffff window] May 16 16:41:39.839558 kernel: PCI: CLS 0 bytes, default 64 May 16 16:41:39.839588 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x2848ddd4e75, max_idle_ns: 440795346320 ns May 16 16:41:39.839611 kernel: Initialise system trusted keyrings May 16 16:41:39.839627 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 May 16 16:41:39.839634 kernel: Key type asymmetric registered May 16 16:41:39.839642 kernel: Asymmetric key parser 'x509' registered May 16 16:41:39.839650 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250) May 16 16:41:39.839657 kernel: io scheduler mq-deadline registered May 16 16:41:39.839665 kernel: io scheduler kyber registered May 16 16:41:39.839672 kernel: io scheduler bfq registered May 16 16:41:39.839683 kernel: ioatdma: Intel(R) QuickData 
Technology Driver 5.00 May 16 16:41:39.839691 kernel: ACPI: \_SB_.GSIG: Enabled at IRQ 22 May 16 16:41:39.839698 kernel: ACPI: \_SB_.GSIH: Enabled at IRQ 23 May 16 16:41:39.839706 kernel: ACPI: \_SB_.GSIE: Enabled at IRQ 20 May 16 16:41:39.839713 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled May 16 16:41:39.839721 kernel: 00:03: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A May 16 16:41:39.839729 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 May 16 16:41:39.839737 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1 May 16 16:41:39.839744 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12 May 16 16:41:39.839872 kernel: rtc_cmos 00:04: RTC can wake from S4 May 16 16:41:39.839883 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 May 16 16:41:39.839989 kernel: rtc_cmos 00:04: registered as rtc0 May 16 16:41:39.840097 kernel: rtc_cmos 00:04: setting system clock to 2025-05-16T16:41:39 UTC (1747413699) May 16 16:41:39.840204 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram, hpet irqs May 16 16:41:39.840222 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled May 16 16:41:39.840230 kernel: NET: Registered PF_INET6 protocol family May 16 16:41:39.840238 kernel: Segment Routing with IPv6 May 16 16:41:39.840249 kernel: In-situ OAM (IOAM) with IPv6 May 16 16:41:39.840257 kernel: NET: Registered PF_PACKET protocol family May 16 16:41:39.840264 kernel: Key type dns_resolver registered May 16 16:41:39.840272 kernel: IPI shorthand broadcast: enabled May 16 16:41:39.840280 kernel: sched_clock: Marking stable (2712002025, 112202459)->(2839349015, -15144531) May 16 16:41:39.840287 kernel: registered taskstats version 1 May 16 16:41:39.840295 kernel: Loading compiled-in X.509 certificates May 16 16:41:39.840302 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.20-flatcar: 310304ddc2cf6c43796c9bf79d11c0543afdf71f' May 
16 16:41:39.840310 kernel: Demotion targets for Node 0: null May 16 16:41:39.840319 kernel: Key type .fscrypt registered May 16 16:41:39.840327 kernel: Key type fscrypt-provisioning registered May 16 16:41:39.840334 kernel: ima: No TPM chip found, activating TPM-bypass! May 16 16:41:39.840342 kernel: ima: Allocated hash algorithm: sha1 May 16 16:41:39.840349 kernel: ima: No architecture policies found May 16 16:41:39.840357 kernel: clk: Disabling unused clocks May 16 16:41:39.840364 kernel: Warning: unable to open an initial console. May 16 16:41:39.840372 kernel: Freeing unused kernel image (initmem) memory: 54416K May 16 16:41:39.840380 kernel: Write protecting the kernel read-only data: 24576k May 16 16:41:39.840389 kernel: Freeing unused kernel image (rodata/data gap) memory: 296K May 16 16:41:39.840397 kernel: Run /init as init process May 16 16:41:39.840404 kernel: with arguments: May 16 16:41:39.840412 kernel: /init May 16 16:41:39.840419 kernel: with environment: May 16 16:41:39.840426 kernel: HOME=/ May 16 16:41:39.840434 kernel: TERM=linux May 16 16:41:39.840441 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a May 16 16:41:39.840450 systemd[1]: Successfully made /usr/ read-only. May 16 16:41:39.840470 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) May 16 16:41:39.840481 systemd[1]: Detected virtualization kvm. May 16 16:41:39.840489 systemd[1]: Detected architecture x86-64. May 16 16:41:39.840497 systemd[1]: Running in initrd. May 16 16:41:39.840505 systemd[1]: No hostname configured, using default hostname. May 16 16:41:39.840515 systemd[1]: Hostname set to . May 16 16:41:39.840523 systemd[1]: Initializing machine ID from VM UUID. 
May 16 16:41:39.840532 systemd[1]: Queued start job for default target initrd.target. May 16 16:41:39.840540 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. May 16 16:41:39.840549 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. May 16 16:41:39.840558 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... May 16 16:41:39.840581 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... May 16 16:41:39.840589 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... May 16 16:41:39.840601 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... May 16 16:41:39.840610 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132... May 16 16:41:39.840619 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr... May 16 16:41:39.840627 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). May 16 16:41:39.840635 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. May 16 16:41:39.840644 systemd[1]: Reached target paths.target - Path Units. May 16 16:41:39.840652 systemd[1]: Reached target slices.target - Slice Units. May 16 16:41:39.840662 systemd[1]: Reached target swap.target - Swaps. May 16 16:41:39.840671 systemd[1]: Reached target timers.target - Timer Units. May 16 16:41:39.840679 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. May 16 16:41:39.840687 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. May 16 16:41:39.840696 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). 
May 16 16:41:39.840704 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. May 16 16:41:39.840713 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. May 16 16:41:39.840721 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. May 16 16:41:39.840734 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. May 16 16:41:39.840742 systemd[1]: Reached target sockets.target - Socket Units. May 16 16:41:39.840750 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... May 16 16:41:39.840759 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... May 16 16:41:39.840769 systemd[1]: Finished network-cleanup.service - Network Cleanup. May 16 16:41:39.840778 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). May 16 16:41:39.840789 systemd[1]: Starting systemd-fsck-usr.service... May 16 16:41:39.840797 systemd[1]: Starting systemd-journald.service - Journal Service... May 16 16:41:39.840806 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... May 16 16:41:39.840814 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 16 16:41:39.840823 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. May 16 16:41:39.840834 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. May 16 16:41:39.840842 systemd[1]: Finished systemd-fsck-usr.service. May 16 16:41:39.840851 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... May 16 16:41:39.840876 systemd-journald[220]: Collecting audit messages is disabled. May 16 16:41:39.840914 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. 
May 16 16:41:39.840923 systemd-journald[220]: Journal started May 16 16:41:39.840946 systemd-journald[220]: Runtime Journal (/run/log/journal/08837bc546d744308d5fb5b7814b8924) is 6M, max 48.6M, 42.5M free. May 16 16:41:39.831064 systemd-modules-load[221]: Inserted module 'overlay' May 16 16:41:39.870977 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. May 16 16:41:39.870997 kernel: Bridge firewalling registered May 16 16:41:39.857629 systemd-modules-load[221]: Inserted module 'br_netfilter' May 16 16:41:39.873697 systemd[1]: Started systemd-journald.service - Journal Service. May 16 16:41:39.874111 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 16 16:41:39.876493 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 16 16:41:39.882074 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... May 16 16:41:39.885355 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 16 16:41:39.891279 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 16 16:41:39.891987 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... May 16 16:41:39.900996 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 16 16:41:39.902033 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 16 16:41:39.906779 systemd-tmpfiles[246]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. May 16 16:41:39.909004 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 16 16:41:39.911941 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 16 16:41:39.915370 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... 
May 16 16:41:39.917795 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 16 16:41:39.944934 dracut-cmdline[262]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyS0,115200 flatcar.first_boot=detected verity.usrhash=e3be1f8a550c199f4f838f30cb661b44d98bde818b7f263cba125cc457a9c137 May 16 16:41:39.961924 systemd-resolved[264]: Positive Trust Anchors: May 16 16:41:39.961938 systemd-resolved[264]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 16 16:41:39.961969 systemd-resolved[264]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 16 16:41:39.964362 systemd-resolved[264]: Defaulting to hostname 'linux'. May 16 16:41:39.965419 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 16 16:41:39.971662 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 16 16:41:40.056598 kernel: SCSI subsystem initialized May 16 16:41:40.065597 kernel: Loading iSCSI transport class v2.0-870. 
May 16 16:41:40.076598 kernel: iscsi: registered transport (tcp) May 16 16:41:40.097599 kernel: iscsi: registered transport (qla4xxx) May 16 16:41:40.097621 kernel: QLogic iSCSI HBA Driver May 16 16:41:40.117089 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 16 16:41:40.147074 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 16 16:41:40.149392 systemd[1]: Reached target network-pre.target - Preparation for Network. May 16 16:41:40.205212 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 16 16:41:40.207823 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... May 16 16:41:40.274605 kernel: raid6: avx2x4 gen() 30139 MB/s May 16 16:41:40.291597 kernel: raid6: avx2x2 gen() 30676 MB/s May 16 16:41:40.308686 kernel: raid6: avx2x1 gen() 25806 MB/s May 16 16:41:40.308707 kernel: raid6: using algorithm avx2x2 gen() 30676 MB/s May 16 16:41:40.326714 kernel: raid6: .... xor() 19793 MB/s, rmw enabled May 16 16:41:40.326745 kernel: raid6: using avx2x2 recovery algorithm May 16 16:41:40.346598 kernel: xor: automatically using best checksumming function avx May 16 16:41:40.510628 kernel: Btrfs loaded, zoned=no, fsverity=no May 16 16:41:40.519817 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 16 16:41:40.522609 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 16 16:41:40.561890 systemd-udevd[473]: Using default interface naming scheme 'v255'. May 16 16:41:40.567177 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 16 16:41:40.570753 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 16 16:41:40.595708 dracut-pre-trigger[482]: rd.md=0: removing MD RAID activation May 16 16:41:40.624849 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. 
May 16 16:41:40.628498 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 16 16:41:40.710960 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 16 16:41:40.714657 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... May 16 16:41:40.757596 kernel: cryptd: max_cpu_qlen set to 1000 May 16 16:41:40.761611 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues May 16 16:41:40.785260 kernel: virtio_blk virtio1: [vda] 19775488 512-byte logical blocks (10.1 GB/9.43 GiB) May 16 16:41:40.785407 kernel: libata version 3.00 loaded. May 16 16:41:40.785426 kernel: input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 May 16 16:41:40.785436 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. May 16 16:41:40.785447 kernel: GPT:9289727 != 19775487 May 16 16:41:40.785457 kernel: GPT:Alternate GPT header not at the end of the disk. May 16 16:41:40.785467 kernel: GPT:9289727 != 19775487 May 16 16:41:40.785476 kernel: GPT: Use GNU Parted to correct GPT errors. 
May 16 16:41:40.785486 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 16 16:41:40.785496 kernel: ahci 0000:00:1f.2: version 3.0 May 16 16:41:40.822919 kernel: ACPI: \_SB_.GSIA: Enabled at IRQ 16 May 16 16:41:40.822943 kernel: AES CTR mode by8 optimization enabled May 16 16:41:40.822954 kernel: ahci 0000:00:1f.2: AHCI vers 0001.0000, 32 command slots, 1.5 Gbps, SATA mode May 16 16:41:40.823104 kernel: ahci 0000:00:1f.2: 6/6 ports implemented (port mask 0x3f) May 16 16:41:40.823247 kernel: ahci 0000:00:1f.2: flags: 64bit ncq only May 16 16:41:40.823385 kernel: scsi host0: ahci May 16 16:41:40.823529 kernel: scsi host1: ahci May 16 16:41:40.823696 kernel: scsi host2: ahci May 16 16:41:40.823845 kernel: scsi host3: ahci May 16 16:41:40.823977 kernel: scsi host4: ahci May 16 16:41:40.824108 kernel: scsi host5: ahci May 16 16:41:40.824277 kernel: ata1: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4100 irq 34 lpm-pol 0 May 16 16:41:40.824292 kernel: ata2: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4180 irq 34 lpm-pol 0 May 16 16:41:40.824306 kernel: ata3: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4200 irq 34 lpm-pol 0 May 16 16:41:40.824320 kernel: ata4: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4280 irq 34 lpm-pol 0 May 16 16:41:40.824333 kernel: ata5: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4300 irq 34 lpm-pol 0 May 16 16:41:40.824344 kernel: ata6: SATA max UDMA/133 abar m4096@0xfebd4000 port 0xfebd4380 irq 34 lpm-pol 0 May 16 16:41:40.773732 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 16 16:41:40.773850 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 16 16:41:40.776278 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 16 16:41:40.779746 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 16 16:41:40.829523 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. 
May 16 16:41:40.863940 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 16 16:41:40.886508 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. May 16 16:41:40.895801 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. May 16 16:41:40.898363 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132. May 16 16:41:40.910770 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 16 16:41:40.914350 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 16 16:41:40.945949 disk-uuid[632]: Primary Header is updated. May 16 16:41:40.945949 disk-uuid[632]: Secondary Entries is updated. May 16 16:41:40.945949 disk-uuid[632]: Secondary Header is updated. May 16 16:41:40.949575 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 16 16:41:40.953586 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 16 16:41:41.135553 kernel: ata5: SATA link down (SStatus 0 SControl 300) May 16 16:41:41.135639 kernel: ata3: SATA link up 1.5 Gbps (SStatus 113 SControl 300) May 16 16:41:41.135651 kernel: ata2: SATA link down (SStatus 0 SControl 300) May 16 16:41:41.135674 kernel: ata1: SATA link down (SStatus 0 SControl 300) May 16 16:41:41.136595 kernel: ata4: SATA link down (SStatus 0 SControl 300) May 16 16:41:41.137595 kernel: ata6: SATA link down (SStatus 0 SControl 300) May 16 16:41:41.138601 kernel: ata3.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 May 16 16:41:41.138616 kernel: ata3.00: applying bridge limits May 16 16:41:41.139588 kernel: ata3.00: configured for UDMA/100 May 16 16:41:41.141594 kernel: scsi 2:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 May 16 16:41:41.188177 kernel: sr 2:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray May 16 16:41:41.208268 kernel: cdrom: 
Uniform CD-ROM driver Revision: 3.20 May 16 16:41:41.208287 kernel: sr 2:0:0:0: Attached scsi CD-ROM sr0 May 16 16:41:41.552753 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 16 16:41:41.554677 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. May 16 16:41:41.556716 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 16 16:41:41.558110 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 16 16:41:41.560346 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 16 16:41:41.597888 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 16 16:41:41.955598 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 May 16 16:41:41.956368 disk-uuid[634]: The operation has completed successfully. May 16 16:41:41.985602 systemd[1]: disk-uuid.service: Deactivated successfully. May 16 16:41:41.985724 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 16 16:41:42.018202 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 16 16:41:42.032717 sh[662]: Success May 16 16:41:42.049892 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 16 16:41:42.049946 kernel: device-mapper: uevent: version 1.0.3 May 16 16:41:42.050978 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 16 16:41:42.059590 kernel: device-mapper: verity: sha256 using shash "sha256-ni" May 16 16:41:42.088754 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 16 16:41:42.092148 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 16 16:41:42.116454 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
May 16 16:41:42.123379 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 16 16:41:42.123436 kernel: BTRFS: device fsid 85b2a34c-237f-4a0a-87d0-0a783de0f256 devid 1 transid 39 /dev/mapper/usr (253:0) scanned by mount (674) May 16 16:41:42.125615 kernel: BTRFS info (device dm-0): first mount of filesystem 85b2a34c-237f-4a0a-87d0-0a783de0f256 May 16 16:41:42.125637 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 16 16:41:42.125648 kernel: BTRFS info (device dm-0): using free-space-tree May 16 16:41:42.130406 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 16 16:41:42.130977 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 16 16:41:42.133266 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 16 16:41:42.134025 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 16 16:41:42.135823 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 16 16:41:42.163592 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (707) May 16 16:41:42.163638 kernel: BTRFS info (device vda6): first mount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d May 16 16:41:42.164590 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 16 16:41:42.165786 kernel: BTRFS info (device vda6): using free-space-tree May 16 16:41:42.171587 kernel: BTRFS info (device vda6): last unmount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d May 16 16:41:42.172885 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 16 16:41:42.173846 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
May 16 16:41:42.251688 ignition[746]: Ignition 2.21.0 May 16 16:41:42.252043 ignition[746]: Stage: fetch-offline May 16 16:41:42.252096 ignition[746]: no configs at "/usr/lib/ignition/base.d" May 16 16:41:42.252111 ignition[746]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 16 16:41:42.252212 ignition[746]: parsed url from cmdline: "" May 16 16:41:42.252216 ignition[746]: no config URL provided May 16 16:41:42.252221 ignition[746]: reading system config file "/usr/lib/ignition/user.ign" May 16 16:41:42.252229 ignition[746]: no config at "/usr/lib/ignition/user.ign" May 16 16:41:42.252254 ignition[746]: op(1): [started] loading QEMU firmware config module May 16 16:41:42.252259 ignition[746]: op(1): executing: "modprobe" "qemu_fw_cfg" May 16 16:41:42.261382 ignition[746]: op(1): [finished] loading QEMU firmware config module May 16 16:41:42.269532 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 16 16:41:42.274595 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 16 16:41:42.304233 ignition[746]: parsing config with SHA512: a368439afef424cbe455701d46494dc241f8cd3d3bab49eb07f1dfb725060d93c9fe21947d4d99e54b8407a800630be1ac204ee637d5b6b7d84e87bbac8b503d May 16 16:41:42.307636 unknown[746]: fetched base config from "system" May 16 16:41:42.307646 unknown[746]: fetched user config from "qemu" May 16 16:41:42.307933 ignition[746]: fetch-offline: fetch-offline passed May 16 16:41:42.307978 ignition[746]: Ignition finished successfully May 16 16:41:42.310999 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 16 16:41:42.322615 systemd-networkd[852]: lo: Link UP May 16 16:41:42.322625 systemd-networkd[852]: lo: Gained carrier May 16 16:41:42.324123 systemd-networkd[852]: Enumeration completed May 16 16:41:42.324216 systemd[1]: Started systemd-networkd.service - Network Configuration. 
May 16 16:41:42.324468 systemd-networkd[852]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 16 16:41:42.324472 systemd-networkd[852]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 16 16:41:42.325198 systemd-networkd[852]: eth0: Link UP May 16 16:41:42.325202 systemd-networkd[852]: eth0: Gained carrier May 16 16:41:42.325209 systemd-networkd[852]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 16 16:41:42.325796 systemd[1]: Reached target network.target - Network. May 16 16:41:42.327421 systemd[1]: ignition-fetch.service - Ignition (fetch) was skipped because of an unmet condition check (ConditionPathExists=!/run/ignition.json). May 16 16:41:42.328204 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 16 16:41:42.354608 systemd-networkd[852]: eth0: DHCPv4 address 10.0.0.80/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 16 16:41:42.366254 ignition[856]: Ignition 2.21.0 May 16 16:41:42.366270 ignition[856]: Stage: kargs May 16 16:41:42.366458 ignition[856]: no configs at "/usr/lib/ignition/base.d" May 16 16:41:42.366475 ignition[856]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 16 16:41:42.368926 ignition[856]: kargs: kargs passed May 16 16:41:42.368994 ignition[856]: Ignition finished successfully May 16 16:41:42.375602 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 16 16:41:42.377837 systemd[1]: Starting ignition-disks.service - Ignition (disks)... 
May 16 16:41:42.415973 ignition[865]: Ignition 2.21.0 May 16 16:41:42.415985 ignition[865]: Stage: disks May 16 16:41:42.416297 ignition[865]: no configs at "/usr/lib/ignition/base.d" May 16 16:41:42.416309 ignition[865]: no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 16 16:41:42.418071 ignition[865]: disks: disks passed May 16 16:41:42.418116 ignition[865]: Ignition finished successfully May 16 16:41:42.422780 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 16 16:41:42.424972 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 16 16:41:42.425052 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 16 16:41:42.427312 systemd[1]: Reached target local-fs.target - Local File Systems. May 16 16:41:42.429813 systemd[1]: Reached target sysinit.target - System Initialization. May 16 16:41:42.431895 systemd[1]: Reached target basic.target - Basic System. May 16 16:41:42.434850 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... May 16 16:41:42.463936 systemd-fsck[875]: ROOT: clean, 15/553520 files, 52789/553472 blocks May 16 16:41:42.471968 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 16 16:41:42.475265 systemd[1]: Mounting sysroot.mount - /sysroot... May 16 16:41:42.578586 kernel: EXT4-fs (vda9): mounted filesystem 07293137-138a-42a3-a962-d767034e11a7 r/w with ordered data mode. Quota mode: none. May 16 16:41:42.579288 systemd[1]: Mounted sysroot.mount - /sysroot. May 16 16:41:42.580887 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 16 16:41:42.583469 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 16 16:41:42.586168 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 16 16:41:42.586603 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. 
May 16 16:41:42.586659 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 16 16:41:42.586686 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 16 16:41:42.606665 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 16 16:41:42.610320 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 16 16:41:42.614717 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (883) May 16 16:41:42.614748 kernel: BTRFS info (device vda6): first mount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d May 16 16:41:42.614764 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 16 16:41:42.614779 kernel: BTRFS info (device vda6): using free-space-tree May 16 16:41:42.618692 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. May 16 16:41:42.651803 initrd-setup-root[907]: cut: /sysroot/etc/passwd: No such file or directory May 16 16:41:42.657071 initrd-setup-root[914]: cut: /sysroot/etc/group: No such file or directory May 16 16:41:42.661915 initrd-setup-root[921]: cut: /sysroot/etc/shadow: No such file or directory May 16 16:41:42.666219 initrd-setup-root[928]: cut: /sysroot/etc/gshadow: No such file or directory May 16 16:41:42.752205 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 16 16:41:42.754274 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 16 16:41:42.755941 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 16 16:41:42.771593 kernel: BTRFS info (device vda6): last unmount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d May 16 16:41:42.783671 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. 
May 16 16:41:42.796945 ignition[998]: INFO : Ignition 2.21.0 May 16 16:41:42.796945 ignition[998]: INFO : Stage: mount May 16 16:41:42.798596 ignition[998]: INFO : no configs at "/usr/lib/ignition/base.d" May 16 16:41:42.798596 ignition[998]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 16 16:41:42.802903 ignition[998]: INFO : mount: mount passed May 16 16:41:42.803745 ignition[998]: INFO : Ignition finished successfully May 16 16:41:42.807261 systemd[1]: Finished ignition-mount.service - Ignition (mount). May 16 16:41:42.809315 systemd[1]: Starting ignition-files.service - Ignition (files)... May 16 16:41:43.122588 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 16 16:41:43.124255 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 16 16:41:43.158254 kernel: BTRFS: device label OEM devid 1 transid 14 /dev/vda6 (254:6) scanned by mount (1010) May 16 16:41:43.158282 kernel: BTRFS info (device vda6): first mount of filesystem 97ba3731-2b30-4c65-8762-24a0a058313d May 16 16:41:43.158293 kernel: BTRFS info (device vda6): using crc32c (crc32c-intel) checksum algorithm May 16 16:41:43.159772 kernel: BTRFS info (device vda6): using free-space-tree May 16 16:41:43.162954 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 16 16:41:43.197320 ignition[1027]: INFO : Ignition 2.21.0 May 16 16:41:43.197320 ignition[1027]: INFO : Stage: files May 16 16:41:43.198974 ignition[1027]: INFO : no configs at "/usr/lib/ignition/base.d" May 16 16:41:43.198974 ignition[1027]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu" May 16 16:41:43.202692 ignition[1027]: DEBUG : files: compiled without relabeling support, skipping May 16 16:41:43.204427 ignition[1027]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 16 16:41:43.204427 ignition[1027]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 16 16:41:43.207705 ignition[1027]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 16 16:41:43.209230 ignition[1027]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 16 16:41:43.209230 ignition[1027]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 16 16:41:43.208311 unknown[1027]: wrote ssh authorized keys file for user: core May 16 16:41:43.213244 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" May 16 16:41:43.213244 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-amd64.tar.gz: attempt #1 May 16 16:41:43.263678 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 16 16:41:43.649795 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-amd64.tar.gz" May 16 16:41:43.652125 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 16 16:41:43.652125 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
May 16 16:41:43.652125 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 16 16:41:43.652125 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 16 16:41:43.652125 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 16 16:41:43.652125 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 16 16:41:43.652125 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 16 16:41:43.652125 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 16 16:41:43.718822 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 16 16:41:43.730321 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 16 16:41:43.730321 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 16 16:41:43.767719 systemd-networkd[852]: eth0: Gained IPv6LL May 16 16:41:43.787488 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 16 16:41:43.790052 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 16 16:41:43.790052 ignition[1027]: INFO : files: 
createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-x86-64.raw: attempt #1 May 16 16:41:44.480347 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 16 16:41:44.889184 ignition[1027]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-x86-64.raw" May 16 16:41:44.889184 ignition[1027]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 16 16:41:44.893154 ignition[1027]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 16 16:41:44.896913 ignition[1027]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 16 16:41:44.896913 ignition[1027]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 16 16:41:44.896913 ignition[1027]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" May 16 16:41:44.901894 ignition[1027]: INFO : files: op(d): op(e): [started] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 16 16:41:44.901894 ignition[1027]: INFO : files: op(d): op(e): [finished] writing unit "coreos-metadata.service" at "/sysroot/etc/systemd/system/coreos-metadata.service" May 16 16:41:44.901894 ignition[1027]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" May 16 16:41:44.901894 ignition[1027]: INFO : files: op(f): [started] setting preset to disabled for "coreos-metadata.service" May 16 16:41:44.917233 ignition[1027]: INFO : files: op(f): op(10): [started] removing enablement symlink(s) for "coreos-metadata.service" May 16 16:41:44.921004 ignition[1027]: INFO : files: op(f): op(10): [finished] removing enablement symlink(s) for "coreos-metadata.service" May 16 16:41:44.922671 
ignition[1027]: INFO : files: op(f): [finished] setting preset to disabled for "coreos-metadata.service" May 16 16:41:44.922671 ignition[1027]: INFO : files: op(11): [started] setting preset to enabled for "prepare-helm.service" May 16 16:41:44.922671 ignition[1027]: INFO : files: op(11): [finished] setting preset to enabled for "prepare-helm.service" May 16 16:41:44.922671 ignition[1027]: INFO : files: createResultFile: createFiles: op(12): [started] writing file "/sysroot/etc/.ignition-result.json" May 16 16:41:44.922671 ignition[1027]: INFO : files: createResultFile: createFiles: op(12): [finished] writing file "/sysroot/etc/.ignition-result.json" May 16 16:41:44.922671 ignition[1027]: INFO : files: files passed May 16 16:41:44.922671 ignition[1027]: INFO : Ignition finished successfully May 16 16:41:44.924397 systemd[1]: Finished ignition-files.service - Ignition (files). May 16 16:41:44.929137 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... May 16 16:41:44.934248 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... May 16 16:41:44.950096 systemd[1]: ignition-quench.service: Deactivated successfully. May 16 16:41:44.950223 systemd[1]: Finished ignition-quench.service - Ignition (record completion). May 16 16:41:44.954524 initrd-setup-root-after-ignition[1056]: grep: /sysroot/oem/oem-release: No such file or directory May 16 16:41:44.957669 initrd-setup-root-after-ignition[1058]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 16 16:41:44.957669 initrd-setup-root-after-ignition[1058]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory May 16 16:41:44.960806 initrd-setup-root-after-ignition[1062]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory May 16 16:41:44.963748 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. 
May 16 16:41:44.965177 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 16 16:41:44.968492 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 16 16:41:45.013763 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 16 16:41:45.013892 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 16 16:41:45.014982 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 16 16:41:45.017163 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 16 16:41:45.019112 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 16 16:41:45.022476 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 16 16:41:45.063587 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 16 16:41:45.066334 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 16 16:41:45.092150 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 16 16:41:45.092297 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 16 16:41:45.094501 systemd[1]: Stopped target timers.target - Timer Units.
May 16 16:41:45.096699 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 16 16:41:45.096812 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 16 16:41:45.101456 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 16 16:41:45.101599 systemd[1]: Stopped target basic.target - Basic System.
May 16 16:41:45.103504 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 16 16:41:45.103998 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 16 16:41:45.108235 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 16 16:41:45.109350 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 16 16:41:45.111430 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 16 16:41:45.113502 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 16 16:41:45.115428 systemd[1]: Stopped target sysinit.target - System Initialization.
May 16 16:41:45.117720 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 16 16:41:45.119624 systemd[1]: Stopped target swap.target - Swaps.
May 16 16:41:45.121451 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 16 16:41:45.121558 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 16 16:41:45.125780 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 16 16:41:45.125913 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 16 16:41:45.127901 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 16 16:41:45.130993 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 16 16:41:45.131096 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 16 16:41:45.131206 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 16 16:41:45.136380 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 16 16:41:45.136493 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 16 16:41:45.137585 systemd[1]: Stopped target paths.target - Path Units.
May 16 16:41:45.140515 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 16 16:41:45.145615 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 16 16:41:45.145753 systemd[1]: Stopped target slices.target - Slice Units.
May 16 16:41:45.148333 systemd[1]: Stopped target sockets.target - Socket Units.
May 16 16:41:45.150014 systemd[1]: iscsid.socket: Deactivated successfully.
May 16 16:41:45.150099 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 16 16:41:45.151766 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 16 16:41:45.151846 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 16 16:41:45.153493 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 16 16:41:45.153620 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 16 16:41:45.155307 systemd[1]: ignition-files.service: Deactivated successfully.
May 16 16:41:45.155409 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 16 16:41:45.160206 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 16 16:41:45.161355 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 16 16:41:45.161464 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 16 16:41:45.163886 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 16 16:41:45.165127 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 16 16:41:45.165239 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 16 16:41:45.172869 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 16 16:41:45.173937 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 16 16:41:45.179939 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 16 16:41:45.180047 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 16 16:41:45.191093 ignition[1083]: INFO : Ignition 2.21.0
May 16 16:41:45.191093 ignition[1083]: INFO : Stage: umount
May 16 16:41:45.192947 ignition[1083]: INFO : no configs at "/usr/lib/ignition/base.d"
May 16 16:41:45.192947 ignition[1083]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/qemu"
May 16 16:41:45.195325 ignition[1083]: INFO : umount: umount passed
May 16 16:41:45.195325 ignition[1083]: INFO : Ignition finished successfully
May 16 16:41:45.196213 systemd[1]: ignition-mount.service: Deactivated successfully.
May 16 16:41:45.196351 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 16 16:41:45.198360 systemd[1]: Stopped target network.target - Network.
May 16 16:41:45.199975 systemd[1]: ignition-disks.service: Deactivated successfully.
May 16 16:41:45.200028 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 16 16:41:45.201893 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 16 16:41:45.201937 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 16 16:41:45.203860 systemd[1]: ignition-setup.service: Deactivated successfully.
May 16 16:41:45.203911 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 16 16:41:45.204814 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 16 16:41:45.204856 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 16 16:41:45.207266 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 16 16:41:45.210791 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 16 16:41:45.215255 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 16 16:41:45.219469 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 16 16:41:45.219604 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 16 16:41:45.223463 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 16 16:41:45.223730 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 16 16:41:45.223842 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 16 16:41:45.227680 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 16 16:41:45.228318 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 16 16:41:45.229119 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 16 16:41:45.229159 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 16 16:41:45.230166 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 16 16:41:45.233296 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 16 16:41:45.233344 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 16 16:41:45.233860 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 16 16:41:45.233901 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 16 16:41:45.239152 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 16 16:41:45.239195 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 16 16:41:45.240205 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 16 16:41:45.240248 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 16 16:41:45.244552 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 16 16:41:45.247316 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 16 16:41:45.247375 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 16 16:41:45.258630 systemd[1]: network-cleanup.service: Deactivated successfully.
May 16 16:41:45.258773 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 16 16:41:45.264422 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 16 16:41:45.264619 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 16 16:41:45.265717 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 16 16:41:45.265762 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 16 16:41:45.268930 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 16 16:41:45.268966 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 16 16:41:45.269985 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 16 16:41:45.270032 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 16 16:41:45.273437 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 16 16:41:45.273485 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 16 16:41:45.274267 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 16 16:41:45.274313 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 16 16:41:45.275512 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 16 16:41:45.279677 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 16 16:41:45.279737 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 16 16:41:45.283682 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 16 16:41:45.283731 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 16 16:41:45.287334 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 16 16:41:45.287396 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 16 16:41:45.295989 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
May 16 16:41:45.296052 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 16 16:41:45.296108 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 16 16:41:45.309531 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 16 16:41:45.309664 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 16 16:41:45.364940 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 16 16:41:45.365085 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 16 16:41:45.368669 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 16 16:41:45.368740 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 16 16:41:45.368795 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 16 16:41:45.375067 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 16 16:41:45.412892 systemd[1]: Switching root.
May 16 16:41:45.456932 systemd-journald[220]: Journal stopped
May 16 16:41:46.611211 systemd-journald[220]: Received SIGTERM from PID 1 (systemd).
May 16 16:41:46.611274 kernel: SELinux: policy capability network_peer_controls=1
May 16 16:41:46.611295 kernel: SELinux: policy capability open_perms=1
May 16 16:41:46.611312 kernel: SELinux: policy capability extended_socket_class=1
May 16 16:41:46.611332 kernel: SELinux: policy capability always_check_network=0
May 16 16:41:46.611346 kernel: SELinux: policy capability cgroup_seclabel=1
May 16 16:41:46.611360 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 16 16:41:46.611379 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 16 16:41:46.611392 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 16 16:41:46.611411 kernel: SELinux: policy capability userspace_initial_context=0
May 16 16:41:46.611423 kernel: audit: type=1403 audit(1747413705.856:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 16 16:41:46.611435 systemd[1]: Successfully loaded SELinux policy in 52.670ms.
May 16 16:41:46.611474 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 12.221ms.
May 16 16:41:46.611495 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 16 16:41:46.611508 systemd[1]: Detected virtualization kvm.
May 16 16:41:46.611519 systemd[1]: Detected architecture x86-64.
May 16 16:41:46.611534 systemd[1]: Detected first boot.
May 16 16:41:46.611546 systemd[1]: Initializing machine ID from VM UUID.
May 16 16:41:46.611577 zram_generator::config[1130]: No configuration found.
May 16 16:41:46.611591 kernel: Guest personality initialized and is inactive
May 16 16:41:46.611602 kernel: VMCI host device registered (name=vmci, major=10, minor=125)
May 16 16:41:46.611613 kernel: Initialized host personality
May 16 16:41:46.611624 kernel: NET: Registered PF_VSOCK protocol family
May 16 16:41:46.611637 systemd[1]: Populated /etc with preset unit settings.
May 16 16:41:46.611652 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 16 16:41:46.611664 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 16 16:41:46.611676 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 16 16:41:46.611687 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 16 16:41:46.611700 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 16 16:41:46.611712 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 16 16:41:46.611724 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 16 16:41:46.611735 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 16 16:41:46.611747 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 16 16:41:46.611762 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 16 16:41:46.611774 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 16 16:41:46.611785 systemd[1]: Created slice user.slice - User and Session Slice.
May 16 16:41:46.611797 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 16 16:41:46.611809 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 16 16:41:46.611821 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 16 16:41:46.611833 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 16 16:41:46.611845 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 16 16:41:46.611859 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 16 16:41:46.611871 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 16 16:41:46.611883 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 16 16:41:46.611896 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 16 16:41:46.611908 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 16 16:41:46.611925 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 16 16:41:46.611937 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 16 16:41:46.611949 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 16 16:41:46.611962 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 16 16:41:46.611974 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 16 16:41:46.611986 systemd[1]: Reached target slices.target - Slice Units.
May 16 16:41:46.611998 systemd[1]: Reached target swap.target - Swaps.
May 16 16:41:46.612010 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 16 16:41:46.612021 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 16 16:41:46.612033 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 16 16:41:46.612045 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 16 16:41:46.612056 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 16 16:41:46.612068 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 16 16:41:46.612090 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 16 16:41:46.612103 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 16 16:41:46.612118 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 16 16:41:46.612133 systemd[1]: Mounting media.mount - External Media Directory...
May 16 16:41:46.612148 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:41:46.612163 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 16 16:41:46.612180 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 16 16:41:46.612195 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 16 16:41:46.612210 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 16 16:41:46.612222 systemd[1]: Reached target machines.target - Containers.
May 16 16:41:46.612233 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 16 16:41:46.612246 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 16 16:41:46.612257 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 16 16:41:46.612269 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 16 16:41:46.612280 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 16 16:41:46.612292 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 16 16:41:46.612304 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 16 16:41:46.612317 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 16 16:41:46.612329 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 16 16:41:46.612341 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 16 16:41:46.612352 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 16 16:41:46.612364 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 16 16:41:46.612376 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 16 16:41:46.612387 systemd[1]: Stopped systemd-fsck-usr.service.
May 16 16:41:46.612400 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 16 16:41:46.612413 systemd[1]: Starting systemd-journald.service - Journal Service...
May 16 16:41:46.612425 kernel: fuse: init (API version 7.41)
May 16 16:41:46.612437 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 16 16:41:46.612448 kernel: loop: module loaded
May 16 16:41:46.612460 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 16 16:41:46.612472 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 16 16:41:46.612484 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 16 16:41:46.612496 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 16 16:41:46.612510 systemd[1]: verity-setup.service: Deactivated successfully.
May 16 16:41:46.612522 systemd[1]: Stopped verity-setup.service.
May 16 16:41:46.612534 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:41:46.612546 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 16 16:41:46.612557 kernel: ACPI: bus type drm_connector registered
May 16 16:41:46.612672 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 16 16:41:46.612687 systemd[1]: Mounted media.mount - External Media Directory.
May 16 16:41:46.612699 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 16 16:41:46.612713 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 16 16:41:46.612725 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 16 16:41:46.612760 systemd-journald[1200]: Collecting audit messages is disabled.
May 16 16:41:46.612786 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 16 16:41:46.612799 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 16 16:41:46.612811 systemd-journald[1200]: Journal started
May 16 16:41:46.612836 systemd-journald[1200]: Runtime Journal (/run/log/journal/08837bc546d744308d5fb5b7814b8924) is 6M, max 48.6M, 42.5M free.
May 16 16:41:46.367759 systemd[1]: Queued start job for default target multi-user.target.
May 16 16:41:46.387347 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
May 16 16:41:46.387797 systemd[1]: systemd-journald.service: Deactivated successfully.
May 16 16:41:46.617039 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 16 16:41:46.618604 systemd[1]: Started systemd-journald.service - Journal Service.
May 16 16:41:46.620028 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 16 16:41:46.621691 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 16 16:41:46.621914 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 16 16:41:46.623356 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 16 16:41:46.623579 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 16 16:41:46.625098 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 16 16:41:46.625375 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 16 16:41:46.626893 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 16 16:41:46.627110 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 16 16:41:46.628525 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 16 16:41:46.628805 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 16 16:41:46.630242 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 16 16:41:46.631677 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 16 16:41:46.633226 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 16 16:41:46.634799 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 16 16:41:46.649446 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 16 16:41:46.652150 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 16 16:41:46.654498 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 16 16:41:46.655765 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 16 16:41:46.655796 systemd[1]: Reached target local-fs.target - Local File Systems.
May 16 16:41:46.658048 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 16 16:41:46.661693 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 16 16:41:46.662819 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 16 16:41:46.666050 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 16 16:41:46.668301 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 16 16:41:46.669696 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 16 16:41:46.671386 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 16 16:41:46.672645 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 16 16:41:46.674377 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 16 16:41:46.678666 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 16 16:41:46.681742 systemd-journald[1200]: Time spent on flushing to /var/log/journal/08837bc546d744308d5fb5b7814b8924 is 18.779ms for 976 entries.
May 16 16:41:46.681742 systemd-journald[1200]: System Journal (/var/log/journal/08837bc546d744308d5fb5b7814b8924) is 8M, max 195.6M, 187.6M free.
May 16 16:41:46.721754 systemd-journald[1200]: Received client request to flush runtime journal.
May 16 16:41:46.721860 kernel: loop0: detected capacity change from 0 to 146240
May 16 16:41:46.681386 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 16 16:41:46.685626 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 16 16:41:46.687226 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 16 16:41:46.689053 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 16 16:41:46.697624 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 16 16:41:46.699537 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 16 16:41:46.704361 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 16 16:41:46.716043 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 16 16:41:46.724686 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 16 16:41:46.735606 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 16 16:41:46.738810 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 16 16:41:46.743484 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 16 16:41:46.746706 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 16 16:41:46.759610 kernel: loop1: detected capacity change from 0 to 113872
May 16 16:41:46.777151 systemd-tmpfiles[1266]: ACLs are not supported, ignoring.
May 16 16:41:46.777169 systemd-tmpfiles[1266]: ACLs are not supported, ignoring.
May 16 16:41:46.782586 kernel: loop2: detected capacity change from 0 to 229808
May 16 16:41:46.782682 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 16 16:41:46.813601 kernel: loop3: detected capacity change from 0 to 146240
May 16 16:41:46.826597 kernel: loop4: detected capacity change from 0 to 113872
May 16 16:41:46.835586 kernel: loop5: detected capacity change from 0 to 229808
May 16 16:41:46.842452 (sd-merge)[1271]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes'.
May 16 16:41:46.842998 (sd-merge)[1271]: Merged extensions into '/usr'.
May 16 16:41:46.847677 systemd[1]: Reload requested from client PID 1249 ('systemd-sysext') (unit systemd-sysext.service)...
May 16 16:41:46.847693 systemd[1]: Reloading...
May 16 16:41:46.904659 zram_generator::config[1299]: No configuration found.
May 16 16:41:46.979296 ldconfig[1244]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 16 16:41:47.002108 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 16 16:41:47.081545 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 16 16:41:47.082081 systemd[1]: Reloading finished in 233 ms.
May 16 16:41:47.113907 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 16 16:41:47.115582 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 16 16:41:47.128999 systemd[1]: Starting ensure-sysext.service...
May 16 16:41:47.131178 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 16 16:41:47.140609 systemd[1]: Reload requested from client PID 1335 ('systemctl') (unit ensure-sysext.service)...
May 16 16:41:47.140625 systemd[1]: Reloading...
May 16 16:41:47.152452 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 16 16:41:47.152492 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 16 16:41:47.153160 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 16 16:41:47.153465 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 16 16:41:47.154408 systemd-tmpfiles[1336]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 16 16:41:47.154749 systemd-tmpfiles[1336]: ACLs are not supported, ignoring.
May 16 16:41:47.154881 systemd-tmpfiles[1336]: ACLs are not supported, ignoring.
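[Editor's note] The `Duplicate line for path ..., ignoring` records above reflect systemd-tmpfiles' first-wins rule: the first tmpfiles.d entry for a given path is kept and later entries for the same path are skipped with a warning. A toy sketch of that check — my own illustration, not systemd's implementation; tmpfiles.d lines are `type path mode uid gid age argument`:

```python
def duplicate_paths(config_lines):
    """Report paths that appear more than once across tmpfiles.d lines;
    the first entry for a path wins, later ones are flagged as duplicates."""
    seen, dups = set(), []
    for line in config_lines:
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        fields = line.split()
        if len(fields) < 2:
            continue  # malformed line: no path field
        path = fields[1]
        if path in seen:
            dups.append(path)  # would trigger a "Duplicate line" warning
        else:
            seen.add(path)
    return dups
```

Run against two entries for `/var/lib/nfs/sm`, it flags the second, matching the `nfs-utils.conf:6` warning in the log.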
May 16 16:41:47.159029 systemd-tmpfiles[1336]: Detected autofs mount point /boot during canonicalization of boot.
May 16 16:41:47.159163 systemd-tmpfiles[1336]: Skipping /boot
May 16 16:41:47.171226 systemd-tmpfiles[1336]: Detected autofs mount point /boot during canonicalization of boot.
May 16 16:41:47.171308 systemd-tmpfiles[1336]: Skipping /boot
May 16 16:41:47.192584 zram_generator::config[1366]: No configuration found.
May 16 16:41:47.280183 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 16 16:41:47.358755 systemd[1]: Reloading finished in 217 ms.
May 16 16:41:47.380747 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 16 16:41:47.402037 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 16 16:41:47.410703 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 16 16:41:47.413223 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 16 16:41:47.423260 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 16 16:41:47.426543 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 16 16:41:47.430729 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 16 16:41:47.433393 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 16 16:41:47.438357 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:41:47.438524 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 16 16:41:47.439959 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 16 16:41:47.442891 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 16 16:41:47.445925 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 16 16:41:47.447062 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 16 16:41:47.447167 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 16 16:41:47.452144 systemd[1]: Starting systemd-userdbd.service - User Database Manager... May 16 16:41:47.453331 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 16:41:47.454482 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 16 16:41:47.459170 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 16 16:41:47.461391 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. May 16 16:41:47.464301 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 16 16:41:47.464809 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 16 16:41:47.466646 systemd[1]: modprobe@loop.service: Deactivated successfully. May 16 16:41:47.467010 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 16 16:41:47.477028 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 16:41:47.477321 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 16 16:41:47.479664 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... 
May 16 16:41:47.481880 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 16 16:41:47.489152 systemd-udevd[1407]: Using default interface naming scheme 'v255'. May 16 16:41:47.490496 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 16 16:41:47.491723 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 16 16:41:47.491823 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 16 16:41:47.493516 systemd[1]: Starting systemd-update-done.service - Update is Completed... May 16 16:41:47.494739 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 16:41:47.496121 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. May 16 16:41:47.498887 augenrules[1438]: No rules May 16 16:41:47.498857 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. May 16 16:41:47.501029 systemd[1]: audit-rules.service: Deactivated successfully. May 16 16:41:47.501279 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 16 16:41:47.503106 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. May 16 16:41:47.503314 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. May 16 16:41:47.505046 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. May 16 16:41:47.505358 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. May 16 16:41:47.506939 systemd[1]: Started systemd-userdbd.service - User Database Manager. May 16 16:41:47.508462 systemd[1]: modprobe@loop.service: Deactivated successfully. 
May 16 16:41:47.508671 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. May 16 16:41:47.512270 systemd[1]: Finished systemd-update-done.service - Update is Completed. May 16 16:41:47.526213 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 16 16:41:47.528264 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen). May 16 16:41:47.530769 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 16 16:41:47.531915 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. May 16 16:41:47.536789 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... May 16 16:41:47.548034 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... May 16 16:41:47.551750 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... May 16 16:41:47.553875 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... May 16 16:41:47.555745 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. May 16 16:41:47.555856 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). May 16 16:41:47.565835 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 16 16:41:47.567656 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
May 16 16:41:47.567759 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 16 16:41:47.569275 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 16 16:41:47.569492 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 16 16:41:47.572316 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 16 16:41:47.573185 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 16 16:41:47.575020 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 16 16:41:47.575267 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 16 16:41:47.588751 systemd[1]: Finished ensure-sysext.service.
May 16 16:41:47.595520 augenrules[1464]: /sbin/augenrules: No change
May 16 16:41:47.601894 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 16 16:41:47.604644 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization...
May 16 16:41:47.606366 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 16 16:41:47.606600 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 16 16:41:47.611242 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 16 16:41:47.611905 augenrules[1511]: No rules
May 16 16:41:47.612962 systemd[1]: audit-rules.service: Deactivated successfully.
May 16 16:41:47.613253 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 16 16:41:47.639201 systemd-resolved[1405]: Positive Trust Anchors:
May 16 16:41:47.639513 systemd-resolved[1405]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 16 16:41:47.639603 systemd-resolved[1405]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 16 16:41:47.642552 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 16 16:41:47.646271 systemd-resolved[1405]: Defaulting to hostname 'linux'.
May 16 16:41:47.648721 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 16 16:41:47.650009 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 16 16:41:47.703854 systemd-networkd[1488]: lo: Link UP
May 16 16:41:47.704143 systemd-networkd[1488]: lo: Gained carrier
May 16 16:41:47.706118 systemd-networkd[1488]: Enumeration completed
May 16 16:41:47.706584 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 16 16:41:47.707003 systemd-networkd[1488]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 16 16:41:47.707007 systemd-networkd[1488]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 16 16:41:47.707890 systemd[1]: Reached target network.target - Network.
May 16 16:41:47.708510 systemd-networkd[1488]: eth0: Link UP
May 16 16:41:47.708677 systemd-networkd[1488]: eth0: Gained carrier
May 16 16:41:47.708691 systemd-networkd[1488]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 16 16:41:47.709599 kernel: mousedev: PS/2 mouse device common for all mice May 16 16:41:47.712829 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... May 16 16:41:47.715398 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... May 16 16:41:47.719587 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. May 16 16:41:47.721647 systemd-networkd[1488]: eth0: DHCPv4 address 10.0.0.80/16, gateway 10.0.0.1 acquired from 10.0.0.1 May 16 16:41:47.724522 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... May 16 16:41:47.731201 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. May 16 16:41:47.732561 systemd-timesyncd[1510]: Contacted time server 10.0.0.1:123 (10.0.0.1). May 16 16:41:47.732889 systemd-timesyncd[1510]: Initial clock synchronization to Fri 2025-05-16 16:41:47.616878 UTC. May 16 16:41:47.732890 systemd[1]: Reached target sysinit.target - System Initialization. May 16 16:41:47.734127 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. May 16 16:41:47.736868 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input4 May 16 16:41:47.736931 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. May 16 16:41:47.738317 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer. May 16 16:41:47.739671 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. May 16 16:41:47.740957 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). May 16 16:41:47.740988 systemd[1]: Reached target paths.target - Path Units. 
May 16 16:41:47.742493 systemd[1]: Reached target time-set.target - System Time Set. May 16 16:41:47.742596 kernel: ACPI: button: Power Button [PWRF] May 16 16:41:47.743800 systemd[1]: Started logrotate.timer - Daily rotation of log files. May 16 16:41:47.744992 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. May 16 16:41:47.747657 systemd[1]: Reached target timers.target - Timer Units. May 16 16:41:47.749543 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. May 16 16:41:47.752348 systemd[1]: Starting docker.socket - Docker Socket for the API... May 16 16:41:47.755962 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). May 16 16:41:47.757866 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). May 16 16:41:47.759210 systemd[1]: Reached target ssh-access.target - SSH Access Available. May 16 16:41:47.763126 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. May 16 16:41:47.765108 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. May 16 16:41:47.767455 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. May 16 16:41:47.769036 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. May 16 16:41:47.770502 systemd[1]: Listening on docker.socket - Docker Socket for the API. May 16 16:41:47.779600 kernel: i801_smbus 0000:00:1f.3: SMBus using PCI interrupt May 16 16:41:47.781899 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD May 16 16:41:47.783826 systemd[1]: Reached target sockets.target - Socket Units. May 16 16:41:47.785678 systemd[1]: Reached target basic.target - Basic System. May 16 16:41:47.786682 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. 
May 16 16:41:47.786714 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. May 16 16:41:47.788706 systemd[1]: Starting containerd.service - containerd container runtime... May 16 16:41:47.792684 systemd[1]: Starting dbus.service - D-Bus System Message Bus... May 16 16:41:47.795735 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... May 16 16:41:47.798872 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... May 16 16:41:47.805149 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... May 16 16:41:47.806485 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). May 16 16:41:47.810871 jq[1554]: false May 16 16:41:47.811762 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh... May 16 16:41:47.816654 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... May 16 16:41:47.821191 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... May 16 16:41:47.828808 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... May 16 16:41:47.831313 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... 
May 16 16:41:47.836402 extend-filesystems[1555]: Found loop3 May 16 16:41:47.837429 extend-filesystems[1555]: Found loop4 May 16 16:41:47.837429 extend-filesystems[1555]: Found loop5 May 16 16:41:47.837429 extend-filesystems[1555]: Found sr0 May 16 16:41:47.837429 extend-filesystems[1555]: Found vda May 16 16:41:47.837429 extend-filesystems[1555]: Found vda1 May 16 16:41:47.837429 extend-filesystems[1555]: Found vda2 May 16 16:41:47.837429 extend-filesystems[1555]: Found vda3 May 16 16:41:47.837429 extend-filesystems[1555]: Found usr May 16 16:41:47.837429 extend-filesystems[1555]: Found vda4 May 16 16:41:47.837429 extend-filesystems[1555]: Found vda6 May 16 16:41:47.837429 extend-filesystems[1555]: Found vda7 May 16 16:41:47.837429 extend-filesystems[1555]: Found vda9 May 16 16:41:47.837429 extend-filesystems[1555]: Checking size of /dev/vda9 May 16 16:41:47.838010 systemd[1]: Starting systemd-logind.service - User Login Management... May 16 16:41:47.853243 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Refreshing passwd entry cache May 16 16:41:47.837482 oslogin_cache_refresh[1556]: Refreshing passwd entry cache May 16 16:41:47.839905 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). May 16 16:41:47.840345 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. May 16 16:41:47.846735 systemd[1]: Starting update-engine.service - Update Engine... May 16 16:41:47.850680 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... May 16 16:41:47.862749 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. May 16 16:41:47.864692 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. 
May 16 16:41:47.864934 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. May 16 16:41:47.867148 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. May 16 16:41:47.867383 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. May 16 16:41:47.868893 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Failure getting users, quitting May 16 16:41:47.868886 oslogin_cache_refresh[1556]: Failure getting users, quitting May 16 16:41:47.868960 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 16 16:41:47.868908 oslogin_cache_refresh[1556]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak. May 16 16:41:47.872843 jq[1568]: true May 16 16:41:47.874223 systemd[1]: motdgen.service: Deactivated successfully. May 16 16:41:47.879721 update_engine[1564]: I20250516 16:41:47.879657 1564 main.cc:92] Flatcar Update Engine starting May 16 16:41:47.880238 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. May 16 16:41:47.880688 (ntainerd)[1577]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR May 16 16:41:48.016028 jq[1582]: true May 16 16:41:48.017582 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 16 16:41:48.037348 tar[1576]: linux-amd64/LICENSE May 16 16:41:48.038610 tar[1576]: linux-amd64/helm May 16 16:41:48.052283 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Refreshing group entry cache May 16 16:41:48.051836 oslogin_cache_refresh[1556]: Refreshing group entry cache May 16 16:41:48.052897 dbus-daemon[1549]: [system] SELinux support is enabled May 16 16:41:48.053094 systemd[1]: Started dbus.service - D-Bus System Message Bus. 
May 16 16:41:48.056574 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). May 16 16:41:48.056603 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. May 16 16:41:48.058132 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). May 16 16:41:48.058147 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. May 16 16:41:48.060372 extend-filesystems[1555]: Resized partition /dev/vda9 May 16 16:41:48.063765 extend-filesystems[1595]: resize2fs 1.47.2 (1-Jan-2025) May 16 16:41:48.064906 update_engine[1564]: I20250516 16:41:48.064507 1564 update_check_scheduler.cc:74] Next update check in 9m18s May 16 16:41:48.072182 kernel: EXT4-fs (vda9): resizing filesystem from 553472 to 1864699 blocks May 16 16:41:48.096649 kernel: EXT4-fs (vda9): resized filesystem to 1864699 May 16 16:41:48.121639 kernel: kvm_amd: TSC scaling supported May 16 16:41:48.121718 kernel: kvm_amd: Nested Virtualization enabled May 16 16:41:48.121745 kernel: kvm_amd: Nested Paging enabled May 16 16:41:48.121770 kernel: kvm_amd: LBR virtualization supported May 16 16:41:48.121792 kernel: kvm_amd: Virtual VMLOAD VMSAVE supported May 16 16:41:48.121817 kernel: kvm_amd: Virtual GIF supported May 16 16:41:48.098979 oslogin_cache_refresh[1556]: Failure getting groups, quitting May 16 16:41:48.121948 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Failure getting groups, quitting May 16 16:41:48.121948 google_oslogin_nss_cache[1556]: oslogin_cache_refresh[1556]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak. 
May 16 16:41:48.099012 oslogin_cache_refresh[1556]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 16 16:41:48.124332 systemd-logind[1562]: Watching system buttons on /dev/input/event2 (Power Button)
May 16 16:41:48.126253 extend-filesystems[1595]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required
May 16 16:41:48.126253 extend-filesystems[1595]: old_desc_blocks = 1, new_desc_blocks = 1
May 16 16:41:48.126253 extend-filesystems[1595]: The filesystem on /dev/vda9 is now 1864699 (4k) blocks long.
May 16 16:41:48.137170 extend-filesystems[1555]: Resized filesystem in /dev/vda9
May 16 16:41:48.129755 systemd-logind[1562]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 16 16:41:48.130721 systemd-logind[1562]: New seat seat0.
May 16 16:41:48.133063 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 16 16:41:48.134677 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 16 16:41:48.146323 bash[1611]: Updated "/home/core/.ssh/authorized_keys"
May 16 16:41:48.162591 kernel: EDAC MC: Ver: 3.0.0
May 16 16:41:48.197467 sshd_keygen[1572]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 16 16:41:48.238275 containerd[1577]: time="2025-05-16T16:41:48Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 16 16:41:48.241221 containerd[1577]: time="2025-05-16T16:41:48.241010260Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
May 16 16:41:48.251305 containerd[1577]: time="2025-05-16T16:41:48.251257321Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="9.42µs"
May 16 16:41:48.251305 containerd[1577]: time="2025-05-16T16:41:48.251301263Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 16 16:41:48.251380 containerd[1577]: time="2025-05-16T16:41:48.251318405Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 16 16:41:48.251556 containerd[1577]: time="2025-05-16T16:41:48.251526173Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 16 16:41:48.251596 containerd[1577]: time="2025-05-16T16:41:48.251547817Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 16 16:41:48.251596 containerd[1577]: time="2025-05-16T16:41:48.251594118Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 16 16:41:48.251795 containerd[1577]: time="2025-05-16T16:41:48.251655657Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 16 16:41:48.251795 containerd[1577]: time="2025-05-16T16:41:48.251671850Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 16 16:41:48.251982 containerd[1577]: time="2025-05-16T16:41:48.251957380Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 16 16:41:48.251982 containerd[1577]: time="2025-05-16T16:41:48.251977958Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 16 16:41:48.252026 containerd[1577]: time="2025-05-16T16:41:48.251989037Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 16 16:41:48.252026 containerd[1577]: time="2025-05-16T16:41:48.251998270Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
May 16 16:41:48.252192 containerd[1577]: time="2025-05-16T16:41:48.252080267Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 16 16:41:48.252329 containerd[1577]: time="2025-05-16T16:41:48.252305464Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 16 16:41:48.252370 containerd[1577]: time="2025-05-16T16:41:48.252340369Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 16 16:41:48.252370 containerd[1577]: time="2025-05-16T16:41:48.252350431Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 16 16:41:48.252418 containerd[1577]: time="2025-05-16T16:41:48.252390857Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 16 16:41:48.252771 containerd[1577]: time="2025-05-16T16:41:48.252711184Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 16 16:41:48.252825 containerd[1577]: time="2025-05-16T16:41:48.252801604Z" level=info msg="metadata content store policy set" policy=shared
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257739792Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257775715Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257794437Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257806504Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257818551Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257828050Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257846653Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257858434Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257868792Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257878577Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257887593Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 16 16:41:48.257974 containerd[1577]: time="2025-05-16T16:41:48.257905722Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258012603Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258035215Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258049001Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258064967Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258075079Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258087343Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258098254Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258107339Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258118981Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258129052Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258139105Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 16 16:41:48.258195 containerd[1577]: time="2025-05-16T16:41:48.258191953Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 16 16:41:48.258417 containerd[1577]: time="2025-05-16T16:41:48.258213114Z" level=info msg="Start snapshots syncer"
May 16 16:41:48.258417 containerd[1577]: time="2025-05-16T16:41:48.258230888Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 16 16:41:48.258757 containerd[1577]: time="2025-05-16T16:41:48.258433106Z" level=info msg="starting cri plugin" config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}"
May 16 16:41:48.258757 containerd[1577]: time="2025-05-16T16:41:48.258486922Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1
May 16 16:41:48.259412 containerd[1577]: time="2025-05-16T16:41:48.259194860Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1
May 16 16:41:48.259412 containerd[1577]: time="2025-05-16T16:41:48.259342976Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1
May 16 16:41:48.259412 containerd[1577]: time="2025-05-16T16:41:48.259373252Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1
May 16 16:41:48.259412 containerd[1577]: time="2025-05-16T16:41:48.259385881Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1
May 16 16:41:48.259412 containerd[1577]: time="2025-05-16T16:41:48.259400574Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1
May 16 16:41:48.259412 containerd[1577]: time="2025-05-16T16:41:48.259411989Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259422199Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259431936Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259451072Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259461282Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259471355Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259501125Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259512491Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259520598Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259529189Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1
May 16 16:41:48.259534 containerd[1577]: time="2025-05-16T16:41:48.259537563Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1
May 16 16:41:48.259742 containerd[1577]: time="2025-05-16T16:41:48.259554122Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1
May 16 16:41:48.259742 containerd[1577]: time="2025-05-16T16:41:48.259585522Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1
May 16 16:41:48.259742 containerd[1577]: time="2025-05-16T16:41:48.259601005Z" level=info msg="runtime interface created"
May 16 16:41:48.259742 containerd[1577]: time="2025-05-16T16:41:48.259606565Z" level=info msg="created NRI interface"
May 16 16:41:48.259742 containerd[1577]: time="2025-05-16T16:41:48.259627765Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1
May 16 16:41:48.259742 containerd[1577]:
time="2025-05-16T16:41:48.259638391Z" level=info msg="Connect containerd service" May 16 16:41:48.259742 containerd[1577]: time="2025-05-16T16:41:48.259668181Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 16 16:41:48.261578 containerd[1577]: time="2025-05-16T16:41:48.261531538Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 16 16:41:48.303973 systemd[1]: Started systemd-logind.service - User Login Management. May 16 16:41:48.305662 systemd[1]: google-oslogin-cache.service: Deactivated successfully. May 16 16:41:48.305930 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh. May 16 16:41:48.307403 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. May 16 16:41:48.309051 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 16 16:41:48.310619 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. May 16 16:41:48.314603 dbus-daemon[1549]: [system] Successfully activated service 'org.freedesktop.systemd1' May 16 16:41:48.330348 systemd[1]: Started update-engine.service - Update Engine. May 16 16:41:48.335818 systemd[1]: Starting issuegen.service - Generate /run/issue... May 16 16:41:48.337215 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met. May 16 16:41:48.339747 systemd[1]: Started locksmithd.service - Cluster reboot manager. May 16 16:41:48.353448 systemd[1]: issuegen.service: Deactivated successfully. 
May 16 16:41:48.355523 containerd[1577]: time="2025-05-16T16:41:48.355481972Z" level=info msg="Start subscribing containerd event" May 16 16:41:48.355634 containerd[1577]: time="2025-05-16T16:41:48.355557540Z" level=info msg="Start recovering state" May 16 16:41:48.355822 containerd[1577]: time="2025-05-16T16:41:48.355791496Z" level=info msg="Start event monitor" May 16 16:41:48.355933 containerd[1577]: time="2025-05-16T16:41:48.355908794Z" level=info msg="Start cni network conf syncer for default" May 16 16:41:48.355933 containerd[1577]: time="2025-05-16T16:41:48.355921809Z" level=info msg="Start streaming server" May 16 16:41:48.355933 containerd[1577]: time="2025-05-16T16:41:48.355930952Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 16 16:41:48.355992 containerd[1577]: time="2025-05-16T16:41:48.355938743Z" level=info msg="runtime interface starting up..." May 16 16:41:48.356112 containerd[1577]: time="2025-05-16T16:41:48.355944411Z" level=info msg="starting plugins..." May 16 16:41:48.356133 containerd[1577]: time="2025-05-16T16:41:48.356112523Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 16 16:41:48.356260 containerd[1577]: time="2025-05-16T16:41:48.356235657Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 16 16:41:48.356289 containerd[1577]: time="2025-05-16T16:41:48.356283183Z" level=info msg=serving... address=/run/containerd/containerd.sock May 16 16:41:48.356413 containerd[1577]: time="2025-05-16T16:41:48.356397233Z" level=info msg="containerd successfully booted in 0.118582s" May 16 16:41:48.356773 systemd[1]: Finished issuegen.service - Generate /run/issue. May 16 16:41:48.358249 systemd[1]: Started containerd.service - containerd container runtime. May 16 16:41:48.369796 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... 
May 16 16:41:48.376974 locksmithd[1652]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" May 16 16:41:48.391196 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. May 16 16:41:48.394371 systemd[1]: Started getty@tty1.service - Getty on tty1. May 16 16:41:48.396787 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0. May 16 16:41:48.398348 systemd[1]: Reached target getty.target - Login Prompts. May 16 16:41:48.546282 tar[1576]: linux-amd64/README.md May 16 16:41:48.572090 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 16 16:41:49.591735 systemd-networkd[1488]: eth0: Gained IPv6LL May 16 16:41:49.594803 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 16 16:41:49.596610 systemd[1]: Reached target network-online.target - Network is Online. May 16 16:41:49.599598 systemd[1]: Starting coreos-metadata.service - QEMU metadata agent... May 16 16:41:49.602067 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:41:49.611136 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 16 16:41:49.627926 systemd[1]: coreos-metadata.service: Deactivated successfully. May 16 16:41:49.628248 systemd[1]: Finished coreos-metadata.service - QEMU metadata agent. May 16 16:41:49.629920 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. May 16 16:41:49.632628 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 16 16:41:50.294645 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 16 16:41:50.296357 systemd[1]: Reached target multi-user.target - Multi-User System. May 16 16:41:50.297721 systemd[1]: Startup finished in 2.787s (kernel) + 6.220s (initrd) + 4.492s (userspace) = 13.500s. 
May 16 16:41:50.298761 (kubelet)[1691]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 16 16:41:50.711186 kubelet[1691]: E0516 16:41:50.711070 1691 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 16 16:41:50.715021 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 16 16:41:50.715232 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 16 16:41:50.715604 systemd[1]: kubelet.service: Consumed 977ms CPU time, 265.9M memory peak. May 16 16:41:52.531740 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. May 16 16:41:52.533005 systemd[1]: Started sshd@0-10.0.0.80:22-10.0.0.1:57694.service - OpenSSH per-connection server daemon (10.0.0.1:57694). May 16 16:41:52.605221 sshd[1704]: Accepted publickey for core from 10.0.0.1 port 57694 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:41:52.607262 sshd-session[1704]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:41:52.613950 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 16 16:41:52.615054 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 16 16:41:52.621275 systemd-logind[1562]: New session 1 of user core. May 16 16:41:52.638575 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 16 16:41:52.642270 systemd[1]: Starting user@500.service - User Manager for UID 500... 
May 16 16:41:52.658863 (systemd)[1708]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 16 16:41:52.661338 systemd-logind[1562]: New session c1 of user core. May 16 16:41:52.796833 systemd[1708]: Queued start job for default target default.target. May 16 16:41:52.810739 systemd[1708]: Created slice app.slice - User Application Slice. May 16 16:41:52.810765 systemd[1708]: Reached target paths.target - Paths. May 16 16:41:52.810802 systemd[1708]: Reached target timers.target - Timers. May 16 16:41:52.812268 systemd[1708]: Starting dbus.socket - D-Bus User Message Bus Socket... May 16 16:41:52.823635 systemd[1708]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 16 16:41:52.823759 systemd[1708]: Reached target sockets.target - Sockets. May 16 16:41:52.823798 systemd[1708]: Reached target basic.target - Basic System. May 16 16:41:52.823836 systemd[1708]: Reached target default.target - Main User Target. May 16 16:41:52.823865 systemd[1708]: Startup finished in 156ms. May 16 16:41:52.824208 systemd[1]: Started user@500.service - User Manager for UID 500. May 16 16:41:52.833696 systemd[1]: Started session-1.scope - Session 1 of User core. May 16 16:41:52.896437 systemd[1]: Started sshd@1-10.0.0.80:22-10.0.0.1:57696.service - OpenSSH per-connection server daemon (10.0.0.1:57696). May 16 16:41:52.947805 sshd[1719]: Accepted publickey for core from 10.0.0.1 port 57696 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:41:52.949084 sshd-session[1719]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:41:52.953177 systemd-logind[1562]: New session 2 of user core. May 16 16:41:52.967689 systemd[1]: Started session-2.scope - Session 2 of User core. 
May 16 16:41:53.020986 sshd[1721]: Connection closed by 10.0.0.1 port 57696 May 16 16:41:53.021335 sshd-session[1719]: pam_unix(sshd:session): session closed for user core May 16 16:41:53.033948 systemd[1]: sshd@1-10.0.0.80:22-10.0.0.1:57696.service: Deactivated successfully. May 16 16:41:53.035501 systemd[1]: session-2.scope: Deactivated successfully. May 16 16:41:53.036221 systemd-logind[1562]: Session 2 logged out. Waiting for processes to exit. May 16 16:41:53.038983 systemd[1]: Started sshd@2-10.0.0.80:22-10.0.0.1:57708.service - OpenSSH per-connection server daemon (10.0.0.1:57708). May 16 16:41:53.039497 systemd-logind[1562]: Removed session 2. May 16 16:41:53.097355 sshd[1727]: Accepted publickey for core from 10.0.0.1 port 57708 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:41:53.098513 sshd-session[1727]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:41:53.102750 systemd-logind[1562]: New session 3 of user core. May 16 16:41:53.112692 systemd[1]: Started session-3.scope - Session 3 of User core. May 16 16:41:53.163099 sshd[1729]: Connection closed by 10.0.0.1 port 57708 May 16 16:41:53.163373 sshd-session[1727]: pam_unix(sshd:session): session closed for user core May 16 16:41:53.176198 systemd[1]: sshd@2-10.0.0.80:22-10.0.0.1:57708.service: Deactivated successfully. May 16 16:41:53.178165 systemd[1]: session-3.scope: Deactivated successfully. May 16 16:41:53.179046 systemd-logind[1562]: Session 3 logged out. Waiting for processes to exit. May 16 16:41:53.182072 systemd[1]: Started sshd@3-10.0.0.80:22-10.0.0.1:57718.service - OpenSSH per-connection server daemon (10.0.0.1:57718). May 16 16:41:53.182889 systemd-logind[1562]: Removed session 3. 
May 16 16:41:53.234591 sshd[1735]: Accepted publickey for core from 10.0.0.1 port 57718 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:41:53.236091 sshd-session[1735]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:41:53.241408 systemd-logind[1562]: New session 4 of user core. May 16 16:41:53.250824 systemd[1]: Started session-4.scope - Session 4 of User core. May 16 16:41:53.307169 sshd[1737]: Connection closed by 10.0.0.1 port 57718 May 16 16:41:53.307591 sshd-session[1735]: pam_unix(sshd:session): session closed for user core May 16 16:41:53.320341 systemd[1]: sshd@3-10.0.0.80:22-10.0.0.1:57718.service: Deactivated successfully. May 16 16:41:53.322510 systemd[1]: session-4.scope: Deactivated successfully. May 16 16:41:53.323329 systemd-logind[1562]: Session 4 logged out. Waiting for processes to exit. May 16 16:41:53.326889 systemd[1]: Started sshd@4-10.0.0.80:22-10.0.0.1:57724.service - OpenSSH per-connection server daemon (10.0.0.1:57724). May 16 16:41:53.327598 systemd-logind[1562]: Removed session 4. May 16 16:41:53.386090 sshd[1743]: Accepted publickey for core from 10.0.0.1 port 57724 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:41:53.387411 sshd-session[1743]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:41:53.392232 systemd-logind[1562]: New session 5 of user core. May 16 16:41:53.401724 systemd[1]: Started session-5.scope - Session 5 of User core. 
May 16 16:41:53.460441 sudo[1746]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 May 16 16:41:53.460885 sudo[1746]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 16:41:53.483360 sudo[1746]: pam_unix(sudo:session): session closed for user root May 16 16:41:53.485213 sshd[1745]: Connection closed by 10.0.0.1 port 57724 May 16 16:41:53.485652 sshd-session[1743]: pam_unix(sshd:session): session closed for user core May 16 16:41:53.505536 systemd[1]: sshd@4-10.0.0.80:22-10.0.0.1:57724.service: Deactivated successfully. May 16 16:41:53.507685 systemd[1]: session-5.scope: Deactivated successfully. May 16 16:41:53.508642 systemd-logind[1562]: Session 5 logged out. Waiting for processes to exit. May 16 16:41:53.512383 systemd[1]: Started sshd@5-10.0.0.80:22-10.0.0.1:57730.service - OpenSSH per-connection server daemon (10.0.0.1:57730). May 16 16:41:53.513163 systemd-logind[1562]: Removed session 5. May 16 16:41:53.567955 sshd[1752]: Accepted publickey for core from 10.0.0.1 port 57730 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:41:53.569349 sshd-session[1752]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:41:53.574392 systemd-logind[1562]: New session 6 of user core. May 16 16:41:53.581753 systemd[1]: Started session-6.scope - Session 6 of User core. 
May 16 16:41:53.636624 sudo[1756]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules May 16 16:41:53.636996 sudo[1756]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 16:41:53.643682 sudo[1756]: pam_unix(sudo:session): session closed for user root May 16 16:41:53.651173 sudo[1755]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules May 16 16:41:53.651619 sudo[1755]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 16:41:53.663786 systemd[1]: Starting audit-rules.service - Load Audit Rules... May 16 16:41:53.715008 augenrules[1778]: No rules May 16 16:41:53.717037 systemd[1]: audit-rules.service: Deactivated successfully. May 16 16:41:53.717384 systemd[1]: Finished audit-rules.service - Load Audit Rules. May 16 16:41:53.718656 sudo[1755]: pam_unix(sudo:session): session closed for user root May 16 16:41:53.720023 sshd[1754]: Connection closed by 10.0.0.1 port 57730 May 16 16:41:53.720326 sshd-session[1752]: pam_unix(sshd:session): session closed for user core May 16 16:41:53.738115 systemd[1]: sshd@5-10.0.0.80:22-10.0.0.1:57730.service: Deactivated successfully. May 16 16:41:53.740400 systemd[1]: session-6.scope: Deactivated successfully. May 16 16:41:53.741217 systemd-logind[1562]: Session 6 logged out. Waiting for processes to exit. May 16 16:41:53.744745 systemd[1]: Started sshd@6-10.0.0.80:22-10.0.0.1:57734.service - OpenSSH per-connection server daemon (10.0.0.1:57734). May 16 16:41:53.745300 systemd-logind[1562]: Removed session 6. May 16 16:41:53.792395 sshd[1787]: Accepted publickey for core from 10.0.0.1 port 57734 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:41:53.793529 sshd-session[1787]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:41:53.797870 systemd-logind[1562]: New session 7 of user core. 
May 16 16:41:53.807682 systemd[1]: Started session-7.scope - Session 7 of User core. May 16 16:41:53.860004 sudo[1790]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh May 16 16:41:53.860311 sudo[1790]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) May 16 16:41:54.148961 systemd[1]: Starting docker.service - Docker Application Container Engine... May 16 16:41:54.162875 (dockerd)[1811]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU May 16 16:41:54.373416 dockerd[1811]: time="2025-05-16T16:41:54.373343481Z" level=info msg="Starting up" May 16 16:41:54.374948 dockerd[1811]: time="2025-05-16T16:41:54.374897860Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" May 16 16:41:54.710228 dockerd[1811]: time="2025-05-16T16:41:54.710162393Z" level=info msg="Loading containers: start." May 16 16:41:54.719592 kernel: Initializing XFRM netlink socket May 16 16:41:54.959931 systemd-networkd[1488]: docker0: Link UP May 16 16:41:54.965863 dockerd[1811]: time="2025-05-16T16:41:54.965775554Z" level=info msg="Loading containers: done." May 16 16:41:54.979522 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2392890154-merged.mount: Deactivated successfully. 
May 16 16:41:54.983093 dockerd[1811]: time="2025-05-16T16:41:54.983043176Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 May 16 16:41:54.983169 dockerd[1811]: time="2025-05-16T16:41:54.983135320Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1 May 16 16:41:54.983302 dockerd[1811]: time="2025-05-16T16:41:54.983273183Z" level=info msg="Initializing buildkit" May 16 16:41:55.014350 dockerd[1811]: time="2025-05-16T16:41:55.014303011Z" level=info msg="Completed buildkit initialization" May 16 16:41:55.018476 dockerd[1811]: time="2025-05-16T16:41:55.018437207Z" level=info msg="Daemon has completed initialization" May 16 16:41:55.018534 dockerd[1811]: time="2025-05-16T16:41:55.018510479Z" level=info msg="API listen on /run/docker.sock" May 16 16:41:55.018703 systemd[1]: Started docker.service - Docker Application Container Engine. May 16 16:41:55.622561 containerd[1577]: time="2025-05-16T16:41:55.622498392Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\"" May 16 16:41:56.278212 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount903438250.mount: Deactivated successfully. 
May 16 16:41:57.141205 containerd[1577]: time="2025-05-16T16:41:57.141146651Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:57.141894 containerd[1577]: time="2025-05-16T16:41:57.141837965Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.1: active requests=0, bytes read=30075403" May 16 16:41:57.143044 containerd[1577]: time="2025-05-16T16:41:57.142989037Z" level=info msg="ImageCreate event name:\"sha256:c6ab243b29f82a6ce269a5342bfd9ea3d0d4ef0f2bb7e98c6ac0bde1aeafab66\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:57.145398 containerd[1577]: time="2025-05-16T16:41:57.145360358Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:57.146226 containerd[1577]: time="2025-05-16T16:41:57.146182091Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.1\" with image id \"sha256:c6ab243b29f82a6ce269a5342bfd9ea3d0d4ef0f2bb7e98c6ac0bde1aeafab66\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.1\", repo digest \"registry.k8s.io/kube-apiserver@sha256:d8ae2fb01c39aa1c7add84f3d54425cf081c24c11e3946830292a8cfa4293548\", size \"30072203\" in 1.523628416s" May 16 16:41:57.146258 containerd[1577]: time="2025-05-16T16:41:57.146227080Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.1\" returns image reference \"sha256:c6ab243b29f82a6ce269a5342bfd9ea3d0d4ef0f2bb7e98c6ac0bde1aeafab66\"" May 16 16:41:57.146932 containerd[1577]: time="2025-05-16T16:41:57.146882843Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\"" May 16 16:41:58.363541 containerd[1577]: time="2025-05-16T16:41:58.363483428Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.1\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:58.364337 containerd[1577]: time="2025-05-16T16:41:58.364289678Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.1: active requests=0, bytes read=26011390" May 16 16:41:58.365543 containerd[1577]: time="2025-05-16T16:41:58.365512070Z" level=info msg="ImageCreate event name:\"sha256:ef43894fa110c389f7286f4d5a3ea176072c95280efeca60d6a79617cdbbf3e4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:58.368092 containerd[1577]: time="2025-05-16T16:41:58.368060862Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:58.368984 containerd[1577]: time="2025-05-16T16:41:58.368950623Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.1\" with image id \"sha256:ef43894fa110c389f7286f4d5a3ea176072c95280efeca60d6a79617cdbbf3e4\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.1\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:7c9bea694e3a3c01ed6a5ee02d55a6124cc08e0b2eec6caa33f2c396b8cbc3f8\", size \"27638910\" in 1.222036661s" May 16 16:41:58.369021 containerd[1577]: time="2025-05-16T16:41:58.368983489Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.1\" returns image reference \"sha256:ef43894fa110c389f7286f4d5a3ea176072c95280efeca60d6a79617cdbbf3e4\"" May 16 16:41:58.369502 containerd[1577]: time="2025-05-16T16:41:58.369448545Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\"" May 16 16:41:59.879731 containerd[1577]: time="2025-05-16T16:41:59.879666310Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:59.880700 containerd[1577]: time="2025-05-16T16:41:59.880666325Z" 
level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.1: active requests=0, bytes read=20148960" May 16 16:41:59.882006 containerd[1577]: time="2025-05-16T16:41:59.881969532Z" level=info msg="ImageCreate event name:\"sha256:398c985c0d950becc8dcdab5877a8a517ffeafca0792b3fe5f1acff218aeac49\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:59.885086 containerd[1577]: time="2025-05-16T16:41:59.885035461Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:41:59.886051 containerd[1577]: time="2025-05-16T16:41:59.885981713Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.1\" with image id \"sha256:398c985c0d950becc8dcdab5877a8a517ffeafca0792b3fe5f1acff218aeac49\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.1\", repo digest \"registry.k8s.io/kube-scheduler@sha256:395b7de7cdbdcc3c3a3db270844a3f71d757e2447a1e4db76b4cce46fba7fd55\", size \"21776498\" in 1.516500795s" May 16 16:41:59.886051 containerd[1577]: time="2025-05-16T16:41:59.886040888Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.1\" returns image reference \"sha256:398c985c0d950becc8dcdab5877a8a517ffeafca0792b3fe5f1acff218aeac49\"" May 16 16:41:59.886683 containerd[1577]: time="2025-05-16T16:41:59.886655359Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\"" May 16 16:42:00.779170 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2716662176.mount: Deactivated successfully. May 16 16:42:00.780324 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. May 16 16:42:00.781655 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 16 16:42:00.977376 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
May 16 16:42:00.984823 (kubelet)[2101]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 16 16:42:01.029838 kubelet[2101]: E0516 16:42:01.029580 2101 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 16 16:42:01.037663 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 16 16:42:01.037907 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 16 16:42:01.038480 systemd[1]: kubelet.service: Consumed 216ms CPU time, 109.1M memory peak.
May 16 16:42:01.376142 containerd[1577]: time="2025-05-16T16:42:01.376013152Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:01.377099 containerd[1577]: time="2025-05-16T16:42:01.377051542Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.1: active requests=0, bytes read=31889075"
May 16 16:42:01.378300 containerd[1577]: time="2025-05-16T16:42:01.378264146Z" level=info msg="ImageCreate event name:\"sha256:b79c189b052cdbe0e837d0caa6faf1d9fd696d8664fcc462f67d9ea51f26fef2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:01.380275 containerd[1577]: time="2025-05-16T16:42:01.380245010Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:01.380763 containerd[1577]: time="2025-05-16T16:42:01.380727408Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.1\" with image id \"sha256:b79c189b052cdbe0e837d0caa6faf1d9fd696d8664fcc462f67d9ea51f26fef2\", repo tag \"registry.k8s.io/kube-proxy:v1.33.1\", repo digest \"registry.k8s.io/kube-proxy@sha256:7ddf379897139ae8ade8b33cb9373b70c632a4d5491da6e234f5d830e0a50807\", size \"31888094\" in 1.494041358s"
May 16 16:42:01.380808 containerd[1577]: time="2025-05-16T16:42:01.380768681Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.1\" returns image reference \"sha256:b79c189b052cdbe0e837d0caa6faf1d9fd696d8664fcc462f67d9ea51f26fef2\""
May 16 16:42:01.381216 containerd[1577]: time="2025-05-16T16:42:01.381191880Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\""
May 16 16:42:01.906230 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount151266874.mount: Deactivated successfully.
May 16 16:42:02.846120 containerd[1577]: time="2025-05-16T16:42:02.846055988Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:02.846904 containerd[1577]: time="2025-05-16T16:42:02.846851074Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=20942238"
May 16 16:42:02.847912 containerd[1577]: time="2025-05-16T16:42:02.847872929Z" level=info msg="ImageCreate event name:\"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:02.850582 containerd[1577]: time="2025-05-16T16:42:02.850521742Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:02.851545 containerd[1577]: time="2025-05-16T16:42:02.851499383Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"20939036\" in 1.470278459s"
May 16 16:42:02.851625 containerd[1577]: time="2025-05-16T16:42:02.851546927Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference \"sha256:1cf5f116067c67da67f97bff78c4bbc76913f59057c18627b96facaced73ea0b\""
May 16 16:42:02.852949 containerd[1577]: time="2025-05-16T16:42:02.852722220Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 16 16:42:03.393937 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1663299724.mount: Deactivated successfully.
May 16 16:42:03.401702 containerd[1577]: time="2025-05-16T16:42:03.401658049Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 16 16:42:03.402550 containerd[1577]: time="2025-05-16T16:42:03.402526199Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321138"
May 16 16:42:03.403904 containerd[1577]: time="2025-05-16T16:42:03.403856409Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 16 16:42:03.405762 containerd[1577]: time="2025-05-16T16:42:03.405730561Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 16 16:42:03.406348 containerd[1577]: time="2025-05-16T16:42:03.406306542Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 553.551571ms"
May 16 16:42:03.406348 containerd[1577]: time="2025-05-16T16:42:03.406340260Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
May 16 16:42:03.406849 containerd[1577]: time="2025-05-16T16:42:03.406796589Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\""
May 16 16:42:05.687436 containerd[1577]: time="2025-05-16T16:42:05.687379691Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:05.688391 containerd[1577]: time="2025-05-16T16:42:05.688345126Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=58142739"
May 16 16:42:05.689643 containerd[1577]: time="2025-05-16T16:42:05.689603095Z" level=info msg="ImageCreate event name:\"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:05.692249 containerd[1577]: time="2025-05-16T16:42:05.692222702Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:05.693206 containerd[1577]: time="2025-05-16T16:42:05.693165399Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"58938593\" in 2.286340909s"
May 16 16:42:05.693271 containerd[1577]: time="2025-05-16T16:42:05.693209096Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:499038711c0816eda03a1ad96a8eb0440c005baa6949698223c6176b7f5077e1\""
May 16 16:42:08.695842 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 16 16:42:08.696053 systemd[1]: kubelet.service: Consumed 216ms CPU time, 109.1M memory peak.
May 16 16:42:08.698322 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 16 16:42:08.721660 systemd[1]: Reload requested from client PID 2210 ('systemctl') (unit session-7.scope)...
May 16 16:42:08.721676 systemd[1]: Reloading...
May 16 16:42:08.802937 zram_generator::config[2255]: No configuration found.
May 16 16:42:08.965855 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 16 16:42:09.077678 systemd[1]: Reloading finished in 355 ms.
May 16 16:42:09.176234 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
May 16 16:42:09.176328 systemd[1]: kubelet.service: Failed with result 'signal'.
May 16 16:42:09.176627 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 16 16:42:09.176669 systemd[1]: kubelet.service: Consumed 203ms CPU time, 98.3M memory peak.
May 16 16:42:09.178273 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 16 16:42:09.346177 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 16 16:42:09.355863 (kubelet)[2298]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 16 16:42:09.421684 kubelet[2298]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 16 16:42:09.421684 kubelet[2298]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 16 16:42:09.421684 kubelet[2298]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 16 16:42:09.422062 kubelet[2298]: I0516 16:42:09.421750 2298 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 16 16:42:09.618616 kubelet[2298]: I0516 16:42:09.618524 2298 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
May 16 16:42:09.618616 kubelet[2298]: I0516 16:42:09.618543 2298 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 16 16:42:09.618755 kubelet[2298]: I0516 16:42:09.618738 2298 server.go:956] "Client rotation is on, will bootstrap in background"
May 16 16:42:09.648261 kubelet[2298]: E0516 16:42:09.647226 2298 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.0.80:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.0.80:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError"
May 16 16:42:09.648261 kubelet[2298]: I0516 16:42:09.647433 2298 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 16 16:42:09.655996 kubelet[2298]: I0516 16:42:09.655978 2298 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 16 16:42:09.661340 kubelet[2298]: I0516 16:42:09.661303 2298 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 16 16:42:09.661559 kubelet[2298]: I0516 16:42:09.661526 2298 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 16 16:42:09.661730 kubelet[2298]: I0516 16:42:09.661549 2298 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 16 16:42:09.661730 kubelet[2298]: I0516 16:42:09.661729 2298 topology_manager.go:138] "Creating topology manager with none policy"
May 16 16:42:09.661869 kubelet[2298]: I0516 16:42:09.661738 2298 container_manager_linux.go:303] "Creating device plugin manager"
May 16 16:42:09.662629 kubelet[2298]: I0516 16:42:09.662602 2298 state_mem.go:36] "Initialized new in-memory state store"
May 16 16:42:09.666527 kubelet[2298]: I0516 16:42:09.666500 2298 kubelet.go:480] "Attempting to sync node with API server"
May 16 16:42:09.666527 kubelet[2298]: I0516 16:42:09.666521 2298 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
May 16 16:42:09.666587 kubelet[2298]: I0516 16:42:09.666547 2298 kubelet.go:386] "Adding apiserver pod source"
May 16 16:42:09.668041 kubelet[2298]: I0516 16:42:09.668026 2298 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 16 16:42:09.674297 kubelet[2298]: E0516 16:42:09.674243 2298 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.80:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.80:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
May 16 16:42:09.675313 kubelet[2298]: I0516 16:42:09.675292 2298 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
May 16 16:42:09.675895 kubelet[2298]: I0516 16:42:09.675853 2298 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
May 16 16:42:09.676437 kubelet[2298]: W0516 16:42:09.676413 2298 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating.
May 16 16:42:09.677463 kubelet[2298]: E0516 16:42:09.677438 2298 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.0.80:6443/api/v1/nodes?fieldSelector=metadata.name%3Dlocalhost&limit=500&resourceVersion=0\": dial tcp 10.0.0.80:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
May 16 16:42:09.679093 kubelet[2298]: I0516 16:42:09.679069 2298 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 16 16:42:09.679215 kubelet[2298]: I0516 16:42:09.679195 2298 server.go:1289] "Started kubelet"
May 16 16:42:09.679890 kubelet[2298]: I0516 16:42:09.679847 2298 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
May 16 16:42:09.684739 kubelet[2298]: I0516 16:42:09.684543 2298 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 16 16:42:09.685385 kubelet[2298]: I0516 16:42:09.685364 2298 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 16 16:42:09.685832 kubelet[2298]: I0516 16:42:09.685800 2298 server.go:317] "Adding debug handlers to kubelet server"
May 16 16:42:09.687799 kubelet[2298]: E0516 16:42:09.687774 2298 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 16 16:42:09.688023 kubelet[2298]: E0516 16:42:09.686953 2298 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.0.80:6443/api/v1/namespaces/default/events\": dial tcp 10.0.0.80:6443: connect: connection refused" event="&Event{ObjectMeta:{localhost.18400f876929a22c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:localhost,UID:localhost,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:localhost,},FirstTimestamp:2025-05-16 16:42:09.67909022 +0000 UTC m=+0.293273223,LastTimestamp:2025-05-16 16:42:09.67909022 +0000 UTC m=+0.293273223,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:localhost,}"
May 16 16:42:09.689126 kubelet[2298]: I0516 16:42:09.689108 2298 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 16 16:42:09.689611 kubelet[2298]: I0516 16:42:09.689207 2298 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 16 16:42:09.689659 kubelet[2298]: I0516 16:42:09.689619 2298 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 16 16:42:09.691015 kubelet[2298]: E0516 16:42:09.690990 2298 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
May 16 16:42:09.691292 kubelet[2298]: I0516 16:42:09.691276 2298 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
May 16 16:42:09.691380 kubelet[2298]: I0516 16:42:09.691366 2298 reconciler.go:26] "Reconciler: start to sync state"
May 16 16:42:09.692642 kubelet[2298]: E0516 16:42:09.692607 2298 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.0.80:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.0.80:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
May 16 16:42:09.692868 kubelet[2298]: E0516 16:42:09.692843 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.80:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.80:6443: connect: connection refused" interval="200ms"
May 16 16:42:09.693905 kubelet[2298]: I0516 16:42:09.693742 2298 factory.go:223] Registration of the systemd container factory successfully
May 16 16:42:09.693905 kubelet[2298]: I0516 16:42:09.693863 2298 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 16 16:42:09.694869 kubelet[2298]: I0516 16:42:09.694843 2298 factory.go:223] Registration of the containerd container factory successfully
May 16 16:42:09.704739 kubelet[2298]: I0516 16:42:09.704702 2298 cpu_manager.go:221] "Starting CPU manager" policy="none"
May 16 16:42:09.704739 kubelet[2298]: I0516 16:42:09.704724 2298 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
May 16 16:42:09.704739 kubelet[2298]: I0516 16:42:09.704740 2298 state_mem.go:36] "Initialized new in-memory state store"
May 16 16:42:09.710884 kubelet[2298]: I0516 16:42:09.710832 2298 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
May 16 16:42:09.710884 kubelet[2298]: I0516 16:42:09.710862 2298 policy_none.go:49] "None policy: Start"
May 16 16:42:09.710884 kubelet[2298]: I0516 16:42:09.710887 2298 memory_manager.go:186] "Starting memorymanager" policy="None"
May 16 16:42:09.710997 kubelet[2298]: I0516 16:42:09.710897 2298 state_mem.go:35] "Initializing new in-memory state store"
May 16 16:42:09.712175 kubelet[2298]: I0516 16:42:09.712147 2298 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
May 16 16:42:09.712224 kubelet[2298]: I0516 16:42:09.712179 2298 status_manager.go:230] "Starting to sync pod status with apiserver"
May 16 16:42:09.712224 kubelet[2298]: I0516 16:42:09.712207 2298 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 16 16:42:09.712224 kubelet[2298]: I0516 16:42:09.712215 2298 kubelet.go:2436] "Starting kubelet main sync loop"
May 16 16:42:09.712281 kubelet[2298]: E0516 16:42:09.712254 2298 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 16 16:42:09.714587 kubelet[2298]: E0516 16:42:09.714542 2298 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.0.80:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.0.80:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass"
May 16 16:42:09.718140 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice.
May 16 16:42:09.731628 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice.
May 16 16:42:09.734527 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice.
May 16 16:42:09.753361 kubelet[2298]: E0516 16:42:09.753328 2298 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
May 16 16:42:09.753538 kubelet[2298]: I0516 16:42:09.753506 2298 eviction_manager.go:189] "Eviction manager: starting control loop"
May 16 16:42:09.753538 kubelet[2298]: I0516 16:42:09.753522 2298 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 16 16:42:09.753812 kubelet[2298]: I0516 16:42:09.753796 2298 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 16 16:42:09.754598 kubelet[2298]: E0516 16:42:09.754559 2298 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
May 16 16:42:09.754645 kubelet[2298]: E0516 16:42:09.754610 2298 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"localhost\" not found"
May 16 16:42:09.823379 systemd[1]: Created slice kubepods-burstable-poddbd56bfc540eb384a572a250615f5e14.slice - libcontainer container kubepods-burstable-poddbd56bfc540eb384a572a250615f5e14.slice.
May 16 16:42:09.837397 kubelet[2298]: E0516 16:42:09.837346 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:09.840338 systemd[1]: Created slice kubepods-burstable-pod97963c41ada533e2e0872a518ecd4611.slice - libcontainer container kubepods-burstable-pod97963c41ada533e2e0872a518ecd4611.slice.
May 16 16:42:09.856441 kubelet[2298]: E0516 16:42:09.856381 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:09.856679 kubelet[2298]: I0516 16:42:09.856657 2298 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
May 16 16:42:09.857045 kubelet[2298]: E0516 16:42:09.857013 2298 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.80:6443/api/v1/nodes\": dial tcp 10.0.0.80:6443: connect: connection refused" node="localhost"
May 16 16:42:09.859444 systemd[1]: Created slice kubepods-burstable-pod8fba52155e63f70cc922ab7cc8c200fd.slice - libcontainer container kubepods-burstable-pod8fba52155e63f70cc922ab7cc8c200fd.slice.
May 16 16:42:09.862111 kubelet[2298]: E0516 16:42:09.862055 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:09.891845 kubelet[2298]: I0516 16:42:09.891754 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/dbd56bfc540eb384a572a250615f5e14-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"dbd56bfc540eb384a572a250615f5e14\") " pod="kube-system/kube-apiserver-localhost"
May 16 16:42:09.891845 kubelet[2298]: I0516 16:42:09.891784 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dbd56bfc540eb384a572a250615f5e14-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"dbd56bfc540eb384a572a250615f5e14\") " pod="kube-system/kube-apiserver-localhost"
May 16 16:42:09.891845 kubelet[2298]: I0516 16:42:09.891806 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:09.891845 kubelet[2298]: I0516 16:42:09.891824 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:09.891845 kubelet[2298]: I0516 16:42:09.891842 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:09.891996 kubelet[2298]: I0516 16:42:09.891858 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8fba52155e63f70cc922ab7cc8c200fd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"8fba52155e63f70cc922ab7cc8c200fd\") " pod="kube-system/kube-scheduler-localhost"
May 16 16:42:09.891996 kubelet[2298]: I0516 16:42:09.891872 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dbd56bfc540eb384a572a250615f5e14-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"dbd56bfc540eb384a572a250615f5e14\") " pod="kube-system/kube-apiserver-localhost"
May 16 16:42:09.891996 kubelet[2298]: I0516 16:42:09.891887 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:09.891996 kubelet[2298]: I0516 16:42:09.891907 2298 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:09.894086 kubelet[2298]: E0516 16:42:09.894045 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.80:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.80:6443: connect: connection refused" interval="400ms"
May 16 16:42:10.058275 kubelet[2298]: I0516 16:42:10.058238 2298 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
May 16 16:42:10.058520 kubelet[2298]: E0516 16:42:10.058481 2298 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.80:6443/api/v1/nodes\": dial tcp 10.0.0.80:6443: connect: connection refused" node="localhost"
May 16 16:42:10.138743 containerd[1577]: time="2025-05-16T16:42:10.138708823Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:dbd56bfc540eb384a572a250615f5e14,Namespace:kube-system,Attempt:0,}"
May 16 16:42:10.157597 containerd[1577]: time="2025-05-16T16:42:10.157491594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:97963c41ada533e2e0872a518ecd4611,Namespace:kube-system,Attempt:0,}"
May 16 16:42:10.163251 containerd[1577]: time="2025-05-16T16:42:10.163208321Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:8fba52155e63f70cc922ab7cc8c200fd,Namespace:kube-system,Attempt:0,}"
May 16 16:42:10.207054 containerd[1577]: time="2025-05-16T16:42:10.207004881Z" level=info msg="connecting to shim 4d1ac11be4fc6f631ddd3f73931dd1bbeb003b3b3830c34fcdfc4c30f3b55006" address="unix:///run/containerd/s/37d8167671eea48faef45b0b32977bd87450c161eca07f0a4e06d7a1f9ba3412" namespace=k8s.io protocol=ttrpc version=3
May 16 16:42:10.208421 containerd[1577]: time="2025-05-16T16:42:10.208374206Z" level=info msg="connecting to shim 1bf6cbddcdf7465980840a7a3881372fff17ed50f93d96226ffea351d7ef6d9d" address="unix:///run/containerd/s/9a4b98c12094738bc06ae6a492cb098b154e75ad14bb05b8ca7f65c582dc37f9" namespace=k8s.io protocol=ttrpc version=3
May 16 16:42:10.210442 containerd[1577]: time="2025-05-16T16:42:10.210403275Z" level=info msg="connecting to shim 97fadb9d2428b68307cd8c23ab718f42918cc149fca4557d07463ac35a612977" address="unix:///run/containerd/s/46782c4d17d71d49cc4b060cdd5f2997930eeefc68070e237dbe474fa6ec1939" namespace=k8s.io protocol=ttrpc version=3
May 16 16:42:10.260768 systemd[1]: Started cri-containerd-97fadb9d2428b68307cd8c23ab718f42918cc149fca4557d07463ac35a612977.scope - libcontainer container 97fadb9d2428b68307cd8c23ab718f42918cc149fca4557d07463ac35a612977.
May 16 16:42:10.265618 systemd[1]: Started cri-containerd-4d1ac11be4fc6f631ddd3f73931dd1bbeb003b3b3830c34fcdfc4c30f3b55006.scope - libcontainer container 4d1ac11be4fc6f631ddd3f73931dd1bbeb003b3b3830c34fcdfc4c30f3b55006.
May 16 16:42:10.271954 systemd[1]: Started cri-containerd-1bf6cbddcdf7465980840a7a3881372fff17ed50f93d96226ffea351d7ef6d9d.scope - libcontainer container 1bf6cbddcdf7465980840a7a3881372fff17ed50f93d96226ffea351d7ef6d9d.
May 16 16:42:10.295037 kubelet[2298]: E0516 16:42:10.294991 2298 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.0.80:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/localhost?timeout=10s\": dial tcp 10.0.0.80:6443: connect: connection refused" interval="800ms" May 16 16:42:10.439030 containerd[1577]: time="2025-05-16T16:42:10.438905848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-localhost,Uid:8fba52155e63f70cc922ab7cc8c200fd,Namespace:kube-system,Attempt:0,} returns sandbox id \"97fadb9d2428b68307cd8c23ab718f42918cc149fca4557d07463ac35a612977\"" May 16 16:42:10.445870 containerd[1577]: time="2025-05-16T16:42:10.445840792Z" level=info msg="CreateContainer within sandbox \"97fadb9d2428b68307cd8c23ab718f42918cc149fca4557d07463ac35a612977\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 16 16:42:10.446862 containerd[1577]: time="2025-05-16T16:42:10.446828421Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-localhost,Uid:97963c41ada533e2e0872a518ecd4611,Namespace:kube-system,Attempt:0,} returns sandbox id \"4d1ac11be4fc6f631ddd3f73931dd1bbeb003b3b3830c34fcdfc4c30f3b55006\"" May 16 16:42:10.454038 containerd[1577]: time="2025-05-16T16:42:10.454001060Z" level=info msg="CreateContainer within sandbox \"4d1ac11be4fc6f631ddd3f73931dd1bbeb003b3b3830c34fcdfc4c30f3b55006\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 16 16:42:10.457541 containerd[1577]: time="2025-05-16T16:42:10.457497464Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-localhost,Uid:dbd56bfc540eb384a572a250615f5e14,Namespace:kube-system,Attempt:0,} returns sandbox id \"1bf6cbddcdf7465980840a7a3881372fff17ed50f93d96226ffea351d7ef6d9d\"" May 16 16:42:10.459471 kubelet[2298]: I0516 16:42:10.459444 2298 kubelet_node_status.go:75] "Attempting to register node" node="localhost" May 16 16:42:10.459931 
kubelet[2298]: E0516 16:42:10.459869 2298 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.0.80:6443/api/v1/nodes\": dial tcp 10.0.0.80:6443: connect: connection refused" node="localhost" May 16 16:42:10.461171 containerd[1577]: time="2025-05-16T16:42:10.461141573Z" level=info msg="CreateContainer within sandbox \"1bf6cbddcdf7465980840a7a3881372fff17ed50f93d96226ffea351d7ef6d9d\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 16 16:42:10.468678 containerd[1577]: time="2025-05-16T16:42:10.468637752Z" level=info msg="Container 8c8237aa98767ccbe39f61cf48d19ddd1d604118c6a9bf60f0e6b737041f34a1: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:10.472480 containerd[1577]: time="2025-05-16T16:42:10.472449819Z" level=info msg="Container dff2a99594a6b79688e0fe3869c9fb7d7bf078c42015c16cb0cd312565c62417: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:10.477113 containerd[1577]: time="2025-05-16T16:42:10.477081247Z" level=info msg="CreateContainer within sandbox \"97fadb9d2428b68307cd8c23ab718f42918cc149fca4557d07463ac35a612977\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"8c8237aa98767ccbe39f61cf48d19ddd1d604118c6a9bf60f0e6b737041f34a1\"" May 16 16:42:10.477392 containerd[1577]: time="2025-05-16T16:42:10.477355923Z" level=info msg="Container 721ca0ecf24755f2036c8c0510dc1038f37c9a47039a0e72ad5c48a2fe99d494: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:10.477832 containerd[1577]: time="2025-05-16T16:42:10.477798486Z" level=info msg="StartContainer for \"8c8237aa98767ccbe39f61cf48d19ddd1d604118c6a9bf60f0e6b737041f34a1\"" May 16 16:42:10.478946 containerd[1577]: time="2025-05-16T16:42:10.478894056Z" level=info msg="connecting to shim 8c8237aa98767ccbe39f61cf48d19ddd1d604118c6a9bf60f0e6b737041f34a1" address="unix:///run/containerd/s/46782c4d17d71d49cc4b060cdd5f2997930eeefc68070e237dbe474fa6ec1939" protocol=ttrpc version=3 May 16 16:42:10.486045 
containerd[1577]: time="2025-05-16T16:42:10.485988578Z" level=info msg="CreateContainer within sandbox \"4d1ac11be4fc6f631ddd3f73931dd1bbeb003b3b3830c34fcdfc4c30f3b55006\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"dff2a99594a6b79688e0fe3869c9fb7d7bf078c42015c16cb0cd312565c62417\"" May 16 16:42:10.486495 containerd[1577]: time="2025-05-16T16:42:10.486463136Z" level=info msg="StartContainer for \"dff2a99594a6b79688e0fe3869c9fb7d7bf078c42015c16cb0cd312565c62417\"" May 16 16:42:10.487487 containerd[1577]: time="2025-05-16T16:42:10.487458214Z" level=info msg="connecting to shim dff2a99594a6b79688e0fe3869c9fb7d7bf078c42015c16cb0cd312565c62417" address="unix:///run/containerd/s/37d8167671eea48faef45b0b32977bd87450c161eca07f0a4e06d7a1f9ba3412" protocol=ttrpc version=3 May 16 16:42:10.489300 containerd[1577]: time="2025-05-16T16:42:10.489267669Z" level=info msg="CreateContainer within sandbox \"1bf6cbddcdf7465980840a7a3881372fff17ed50f93d96226ffea351d7ef6d9d\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"721ca0ecf24755f2036c8c0510dc1038f37c9a47039a0e72ad5c48a2fe99d494\"" May 16 16:42:10.489985 containerd[1577]: time="2025-05-16T16:42:10.489958709Z" level=info msg="StartContainer for \"721ca0ecf24755f2036c8c0510dc1038f37c9a47039a0e72ad5c48a2fe99d494\"" May 16 16:42:10.491406 containerd[1577]: time="2025-05-16T16:42:10.491349878Z" level=info msg="connecting to shim 721ca0ecf24755f2036c8c0510dc1038f37c9a47039a0e72ad5c48a2fe99d494" address="unix:///run/containerd/s/9a4b98c12094738bc06ae6a492cb098b154e75ad14bb05b8ca7f65c582dc37f9" protocol=ttrpc version=3 May 16 16:42:10.501896 systemd[1]: Started cri-containerd-8c8237aa98767ccbe39f61cf48d19ddd1d604118c6a9bf60f0e6b737041f34a1.scope - libcontainer container 8c8237aa98767ccbe39f61cf48d19ddd1d604118c6a9bf60f0e6b737041f34a1. 
May 16 16:42:10.512737 systemd[1]: Started cri-containerd-dff2a99594a6b79688e0fe3869c9fb7d7bf078c42015c16cb0cd312565c62417.scope - libcontainer container dff2a99594a6b79688e0fe3869c9fb7d7bf078c42015c16cb0cd312565c62417.
May 16 16:42:10.516937 systemd[1]: Started cri-containerd-721ca0ecf24755f2036c8c0510dc1038f37c9a47039a0e72ad5c48a2fe99d494.scope - libcontainer container 721ca0ecf24755f2036c8c0510dc1038f37c9a47039a0e72ad5c48a2fe99d494.
May 16 16:42:10.568881 containerd[1577]: time="2025-05-16T16:42:10.568643467Z" level=info msg="StartContainer for \"8c8237aa98767ccbe39f61cf48d19ddd1d604118c6a9bf60f0e6b737041f34a1\" returns successfully"
May 16 16:42:10.570863 kubelet[2298]: E0516 16:42:10.570818 2298 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.0.80:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.0.80:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
May 16 16:42:10.573481 containerd[1577]: time="2025-05-16T16:42:10.573441520Z" level=info msg="StartContainer for \"dff2a99594a6b79688e0fe3869c9fb7d7bf078c42015c16cb0cd312565c62417\" returns successfully"
May 16 16:42:10.595915 containerd[1577]: time="2025-05-16T16:42:10.595860040Z" level=info msg="StartContainer for \"721ca0ecf24755f2036c8c0510dc1038f37c9a47039a0e72ad5c48a2fe99d494\" returns successfully"
May 16 16:42:10.723602 kubelet[2298]: E0516 16:42:10.721956 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:10.728129 kubelet[2298]: E0516 16:42:10.728092 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:10.737416 kubelet[2298]: E0516 16:42:10.737092 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:11.261780 kubelet[2298]: I0516 16:42:11.261750 2298 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
May 16 16:42:11.739182 kubelet[2298]: E0516 16:42:11.739085 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:11.739501 kubelet[2298]: E0516 16:42:11.739303 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:11.739501 kubelet[2298]: E0516 16:42:11.739420 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:12.740044 kubelet[2298]: E0516 16:42:12.740013 2298 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"localhost\" not found" node="localhost"
May 16 16:42:12.793833 kubelet[2298]: E0516 16:42:12.793786 2298 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"localhost\" not found" node="localhost"
May 16 16:42:12.883832 kubelet[2298]: I0516 16:42:12.883789 2298 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
May 16 16:42:12.891478 kubelet[2298]: I0516 16:42:12.891420 2298 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:12.897240 kubelet[2298]: E0516 16:42:12.896805 2298 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:12.897240 kubelet[2298]: I0516 16:42:12.896833 2298 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 16 16:42:12.898532 kubelet[2298]: E0516 16:42:12.898519 2298 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-localhost"
May 16 16:42:12.898685 kubelet[2298]: I0516 16:42:12.898595 2298 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 16 16:42:12.900214 kubelet[2298]: E0516 16:42:12.900151 2298 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-localhost"
May 16 16:42:13.676097 kubelet[2298]: I0516 16:42:13.676058 2298 apiserver.go:52] "Watching apiserver"
May 16 16:42:13.691759 kubelet[2298]: I0516 16:42:13.691722 2298 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
May 16 16:42:15.293814 systemd[1]: Reload requested from client PID 2585 ('systemctl') (unit session-7.scope)...
May 16 16:42:15.293831 systemd[1]: Reloading...
May 16 16:42:15.373602 zram_generator::config[2627]: No configuration found.
May 16 16:42:15.467436 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 16 16:42:15.594909 systemd[1]: Reloading finished in 300 ms.
May 16 16:42:15.620005 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent...
May 16 16:42:15.646254 systemd[1]: kubelet.service: Deactivated successfully.
May 16 16:42:15.646547 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 16 16:42:15.646632 systemd[1]: kubelet.service: Consumed 822ms CPU time, 132.1M memory peak.
May 16 16:42:15.648605 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 16 16:42:15.876296 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 16 16:42:15.888994 (kubelet)[2673]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS
May 16 16:42:15.943989 kubelet[2673]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 16 16:42:15.943989 kubelet[2673]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
May 16 16:42:15.943989 kubelet[2673]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
May 16 16:42:15.944407 kubelet[2673]: I0516 16:42:15.944026 2673 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
May 16 16:42:15.951549 kubelet[2673]: I0516 16:42:15.951503 2673 server.go:530] "Kubelet version" kubeletVersion="v1.33.0"
May 16 16:42:15.952221 kubelet[2673]: I0516 16:42:15.951679 2673 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
May 16 16:42:15.952376 kubelet[2673]: I0516 16:42:15.952327 2673 server.go:956] "Client rotation is on, will bootstrap in background"
May 16 16:42:15.954306 kubelet[2673]: I0516 16:42:15.954266 2673 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem"
May 16 16:42:15.957036 kubelet[2673]: I0516 16:42:15.957002 2673 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt"
May 16 16:42:15.961359 kubelet[2673]: I0516 16:42:15.961335 2673 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
May 16 16:42:15.966076 kubelet[2673]: I0516 16:42:15.966052 2673 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /"
May 16 16:42:15.966273 kubelet[2673]: I0516 16:42:15.966243 2673 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
May 16 16:42:15.966441 kubelet[2673]: I0516 16:42:15.966266 2673 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"localhost","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
May 16 16:42:15.966519 kubelet[2673]: I0516 16:42:15.966445 2673 topology_manager.go:138] "Creating topology manager with none policy"
May 16 16:42:15.966519 kubelet[2673]: I0516 16:42:15.966456 2673 container_manager_linux.go:303] "Creating device plugin manager"
May 16 16:42:15.966519 kubelet[2673]: I0516 16:42:15.966514 2673 state_mem.go:36] "Initialized new in-memory state store"
May 16 16:42:15.966701 kubelet[2673]: I0516 16:42:15.966681 2673 kubelet.go:480] "Attempting to sync node with API server"
May 16 16:42:15.966701 kubelet[2673]: I0516 16:42:15.966697 2673 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests"
May 16 16:42:15.966756 kubelet[2673]: I0516 16:42:15.966718 2673 kubelet.go:386] "Adding apiserver pod source"
May 16 16:42:15.966756 kubelet[2673]: I0516 16:42:15.966732 2673 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
May 16 16:42:15.968269 kubelet[2673]: I0516 16:42:15.968161 2673 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1"
May 16 16:42:15.968884 kubelet[2673]: I0516 16:42:15.968855 2673 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
May 16 16:42:15.974671 kubelet[2673]: I0516 16:42:15.974074 2673 watchdog_linux.go:99] "Systemd watchdog is not enabled"
May 16 16:42:15.974671 kubelet[2673]: I0516 16:42:15.974137 2673 server.go:1289] "Started kubelet"
May 16 16:42:15.976047 kubelet[2673]: I0516 16:42:15.976006 2673 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
May 16 16:42:15.976498 kubelet[2673]: I0516 16:42:15.976430 2673 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
May 16 16:42:15.978001 kubelet[2673]: I0516 16:42:15.977692 2673 server.go:317] "Adding debug handlers to kubelet server"
May 16 16:42:15.979849 kubelet[2673]: I0516 16:42:15.979812 2673 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
May 16 16:42:15.979939 kubelet[2673]: I0516 16:42:15.979910 2673 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
May 16 16:42:15.983679 kubelet[2673]: I0516 16:42:15.981657 2673 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key"
May 16 16:42:15.986162 kubelet[2673]: E0516 16:42:15.986102 2673 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem"
May 16 16:42:15.986162 kubelet[2673]: I0516 16:42:15.983712 2673 volume_manager.go:297] "Starting Kubelet Volume Manager"
May 16 16:42:15.986318 kubelet[2673]: I0516 16:42:15.983717 2673 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
May 16 16:42:15.986592 kubelet[2673]: I0516 16:42:15.986407 2673 reconciler.go:26] "Reconciler: start to sync state"
May 16 16:42:15.988198 kubelet[2673]: E0516 16:42:15.983665 2673 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"localhost\" not found"
May 16 16:42:15.990053 kubelet[2673]: I0516 16:42:15.990022 2673 factory.go:223] Registration of the containerd container factory successfully
May 16 16:42:15.990053 kubelet[2673]: I0516 16:42:15.990044 2673 factory.go:223] Registration of the systemd container factory successfully
May 16 16:42:15.990225 kubelet[2673]: I0516 16:42:15.990182 2673 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory
May 16 16:42:15.998093 kubelet[2673]: I0516 16:42:15.998041 2673 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
May 16 16:42:15.999452 kubelet[2673]: I0516 16:42:15.999417 2673 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
May 16 16:42:15.999452 kubelet[2673]: I0516 16:42:15.999449 2673 status_manager.go:230] "Starting to sync pod status with apiserver"
May 16 16:42:15.999520 kubelet[2673]: I0516 16:42:15.999479 2673 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
May 16 16:42:15.999520 kubelet[2673]: I0516 16:42:15.999487 2673 kubelet.go:2436] "Starting kubelet main sync loop"
May 16 16:42:15.999586 kubelet[2673]: E0516 16:42:15.999532 2673 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
May 16 16:42:16.052726 kubelet[2673]: I0516 16:42:16.052692 2673 cpu_manager.go:221] "Starting CPU manager" policy="none"
May 16 16:42:16.052726 kubelet[2673]: I0516 16:42:16.052715 2673 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s"
May 16 16:42:16.052867 kubelet[2673]: I0516 16:42:16.052743 2673 state_mem.go:36] "Initialized new in-memory state store"
May 16 16:42:16.052922 kubelet[2673]: I0516 16:42:16.052907 2673 state_mem.go:88] "Updated default CPUSet" cpuSet=""
May 16 16:42:16.052957 kubelet[2673]: I0516 16:42:16.052919 2673 state_mem.go:96] "Updated CPUSet assignments" assignments={}
May 16 16:42:16.052957 kubelet[2673]: I0516 16:42:16.052947 2673 policy_none.go:49] "None policy: Start"
May 16 16:42:16.052957 kubelet[2673]: I0516 16:42:16.052956 2673 memory_manager.go:186] "Starting memorymanager" policy="None"
May 16 16:42:16.053036 kubelet[2673]: I0516 16:42:16.052965 2673 state_mem.go:35] "Initializing new in-memory state store"
May 16 16:42:16.053057 kubelet[2673]: I0516 16:42:16.053052 2673 state_mem.go:75] "Updated machine memory state"
May 16 16:42:16.058035 kubelet[2673]: E0516 16:42:16.058014 2673 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
May 16 16:42:16.058548 kubelet[2673]: I0516 16:42:16.058410 2673 eviction_manager.go:189] "Eviction manager: starting control loop"
May 16 16:42:16.058548 kubelet[2673]: I0516 16:42:16.058426 2673 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
May 16 16:42:16.058627 kubelet[2673]: I0516 16:42:16.058612 2673 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
May 16 16:42:16.060000 kubelet[2673]: E0516 16:42:16.059976 2673 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime"
May 16 16:42:16.101179 kubelet[2673]: I0516 16:42:16.101122 2673 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 16 16:42:16.101301 kubelet[2673]: I0516 16:42:16.101267 2673 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:16.101390 kubelet[2673]: I0516 16:42:16.101360 2673 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 16 16:42:16.160660 kubelet[2673]: I0516 16:42:16.160519 2673 kubelet_node_status.go:75] "Attempting to register node" node="localhost"
May 16 16:42:16.167244 kubelet[2673]: I0516 16:42:16.167211 2673 kubelet_node_status.go:124] "Node was previously registered" node="localhost"
May 16 16:42:16.167370 kubelet[2673]: I0516 16:42:16.167288 2673 kubelet_node_status.go:78] "Successfully registered node" node="localhost"
May 16 16:42:16.287350 kubelet[2673]: I0516 16:42:16.287305 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-ca-certs\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:16.287350 kubelet[2673]: I0516 16:42:16.287340 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-flexvolume-dir\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:16.287518 kubelet[2673]: I0516 16:42:16.287371 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-kubeconfig\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:16.287518 kubelet[2673]: I0516 16:42:16.287427 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/dbd56bfc540eb384a572a250615f5e14-usr-share-ca-certificates\") pod \"kube-apiserver-localhost\" (UID: \"dbd56bfc540eb384a572a250615f5e14\") " pod="kube-system/kube-apiserver-localhost"
May 16 16:42:16.287518 kubelet[2673]: I0516 16:42:16.287468 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-k8s-certs\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:16.287518 kubelet[2673]: I0516 16:42:16.287486 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/97963c41ada533e2e0872a518ecd4611-usr-share-ca-certificates\") pod \"kube-controller-manager-localhost\" (UID: \"97963c41ada533e2e0872a518ecd4611\") " pod="kube-system/kube-controller-manager-localhost"
May 16 16:42:16.287518 kubelet[2673]: I0516 16:42:16.287513 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8fba52155e63f70cc922ab7cc8c200fd-kubeconfig\") pod \"kube-scheduler-localhost\" (UID: \"8fba52155e63f70cc922ab7cc8c200fd\") " pod="kube-system/kube-scheduler-localhost"
May 16 16:42:16.287741 kubelet[2673]: I0516 16:42:16.287544 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/dbd56bfc540eb384a572a250615f5e14-ca-certs\") pod \"kube-apiserver-localhost\" (UID: \"dbd56bfc540eb384a572a250615f5e14\") " pod="kube-system/kube-apiserver-localhost"
May 16 16:42:16.287741 kubelet[2673]: I0516 16:42:16.287650 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/dbd56bfc540eb384a572a250615f5e14-k8s-certs\") pod \"kube-apiserver-localhost\" (UID: \"dbd56bfc540eb384a572a250615f5e14\") " pod="kube-system/kube-apiserver-localhost"
May 16 16:42:16.968918 kubelet[2673]: I0516 16:42:16.968850 2673 apiserver.go:52] "Watching apiserver"
May 16 16:42:16.986707 kubelet[2673]: I0516 16:42:16.986662 2673 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
May 16 16:42:17.013823 kubelet[2673]: I0516 16:42:17.013783 2673 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-localhost"
May 16 16:42:17.014277 kubelet[2673]: I0516 16:42:17.014225 2673 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-localhost"
May 16 16:42:17.022319 kubelet[2673]: E0516 16:42:17.022260 2673 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-localhost\" already exists" pod="kube-system/kube-scheduler-localhost"
May 16 16:42:17.022319 kubelet[2673]: E0516 16:42:17.022296 2673 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-localhost\" already exists" pod="kube-system/kube-apiserver-localhost"
May 16 16:42:17.041665 kubelet[2673]: I0516 16:42:17.041617 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-localhost" podStartSLOduration=1.041556355 podStartE2EDuration="1.041556355s" podCreationTimestamp="2025-05-16 16:42:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:17.04135038 +0000 UTC m=+1.145235663" watchObservedRunningTime="2025-05-16 16:42:17.041556355 +0000 UTC m=+1.145441637"
May 16 16:42:17.041934 kubelet[2673]: I0516 16:42:17.041711 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-localhost" podStartSLOduration=1.0417078339999999 podStartE2EDuration="1.041707834s" podCreationTimestamp="2025-05-16 16:42:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:17.034705912 +0000 UTC m=+1.138591204" watchObservedRunningTime="2025-05-16 16:42:17.041707834 +0000 UTC m=+1.145593116"
May 16 16:42:17.048293 kubelet[2673]: I0516 16:42:17.048242 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-localhost" podStartSLOduration=1.048234477 podStartE2EDuration="1.048234477s" podCreationTimestamp="2025-05-16 16:42:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:17.048133388 +0000 UTC m=+1.152018660" watchObservedRunningTime="2025-05-16 16:42:17.048234477 +0000 UTC m=+1.152119759"
May 16 16:42:22.014152 kubelet[2673]: I0516 16:42:22.014092 2673 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24"
May 16 16:42:22.014663 kubelet[2673]: I0516 16:42:22.014584 2673 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24"
May 16 16:42:22.014702 containerd[1577]: time="2025-05-16T16:42:22.014385829Z" level=info msg="No cni config template is specified, wait for other system components to drop the config."
May 16 16:42:23.033612 systemd[1]: Created slice kubepods-besteffort-pod66e2a138_5770_4c60_8669_043eb8fa1b15.slice - libcontainer container kubepods-besteffort-pod66e2a138_5770_4c60_8669_043eb8fa1b15.slice.
May 16 16:42:23.094725 systemd[1]: Created slice kubepods-besteffort-pod944ff01c_1e5e_43ed_bf75_e92108f31eb3.slice - libcontainer container kubepods-besteffort-pod944ff01c_1e5e_43ed_bf75_e92108f31eb3.slice.
May 16 16:42:23.137755 kubelet[2673]: I0516 16:42:23.137718 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv527\" (UniqueName: \"kubernetes.io/projected/66e2a138-5770-4c60-8669-043eb8fa1b15-kube-api-access-sv527\") pod \"kube-proxy-bwtlp\" (UID: \"66e2a138-5770-4c60-8669-043eb8fa1b15\") " pod="kube-system/kube-proxy-bwtlp"
May 16 16:42:23.138142 kubelet[2673]: I0516 16:42:23.137767 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/66e2a138-5770-4c60-8669-043eb8fa1b15-kube-proxy\") pod \"kube-proxy-bwtlp\" (UID: \"66e2a138-5770-4c60-8669-043eb8fa1b15\") " pod="kube-system/kube-proxy-bwtlp"
May 16 16:42:23.138142 kubelet[2673]: I0516 16:42:23.137786 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/944ff01c-1e5e-43ed-bf75-e92108f31eb3-var-lib-calico\") pod \"tigera-operator-844669ff44-wncvh\" (UID: \"944ff01c-1e5e-43ed-bf75-e92108f31eb3\") " pod="tigera-operator/tigera-operator-844669ff44-wncvh"
May 16 16:42:23.138142 kubelet[2673]: I0516 16:42:23.137803 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/66e2a138-5770-4c60-8669-043eb8fa1b15-xtables-lock\") pod \"kube-proxy-bwtlp\" (UID: \"66e2a138-5770-4c60-8669-043eb8fa1b15\") " pod="kube-system/kube-proxy-bwtlp"
May 16 16:42:23.138142 kubelet[2673]: I0516 16:42:23.137817 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/66e2a138-5770-4c60-8669-043eb8fa1b15-lib-modules\") pod \"kube-proxy-bwtlp\" (UID: \"66e2a138-5770-4c60-8669-043eb8fa1b15\") " pod="kube-system/kube-proxy-bwtlp"
May 16 16:42:23.138142 kubelet[2673]: I0516 16:42:23.137839 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n5xv\" (UniqueName: \"kubernetes.io/projected/944ff01c-1e5e-43ed-bf75-e92108f31eb3-kube-api-access-4n5xv\") pod \"tigera-operator-844669ff44-wncvh\" (UID: \"944ff01c-1e5e-43ed-bf75-e92108f31eb3\") " pod="tigera-operator/tigera-operator-844669ff44-wncvh"
May 16 16:42:23.345415 containerd[1577]: time="2025-05-16T16:42:23.345320226Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bwtlp,Uid:66e2a138-5770-4c60-8669-043eb8fa1b15,Namespace:kube-system,Attempt:0,}"
May 16 16:42:23.363294 containerd[1577]: time="2025-05-16T16:42:23.363263498Z" level=info msg="connecting to shim 26b05e9ff582602012264cab6ce97d5a21c3a7c03f2078b044101175ae1c4649" address="unix:///run/containerd/s/3b06379bbd51d425a81aa66df6e283de1a2c592efcfaded095a51e9640e5c4d8" namespace=k8s.io protocol=ttrpc version=3
May 16 16:42:23.391705 systemd[1]: Started cri-containerd-26b05e9ff582602012264cab6ce97d5a21c3a7c03f2078b044101175ae1c4649.scope - libcontainer container 26b05e9ff582602012264cab6ce97d5a21c3a7c03f2078b044101175ae1c4649.
May 16 16:42:23.398619 containerd[1577]: time="2025-05-16T16:42:23.398541133Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-wncvh,Uid:944ff01c-1e5e-43ed-bf75-e92108f31eb3,Namespace:tigera-operator,Attempt:0,}"
May 16 16:42:23.418637 containerd[1577]: time="2025-05-16T16:42:23.418595998Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-bwtlp,Uid:66e2a138-5770-4c60-8669-043eb8fa1b15,Namespace:kube-system,Attempt:0,} returns sandbox id \"26b05e9ff582602012264cab6ce97d5a21c3a7c03f2078b044101175ae1c4649\""
May 16 16:42:23.419461 containerd[1577]: time="2025-05-16T16:42:23.419391348Z" level=info msg="connecting to shim caf376c8a0e263262787e1ded88cf702c6949a1e5df045e780eec1cde8eb073d" address="unix:///run/containerd/s/edfff451eb0c577782dc18a67402bbf1d6fa5d42d10ced12edfaee1bee23d63c" namespace=k8s.io protocol=ttrpc version=3
May 16 16:42:23.424163 containerd[1577]: time="2025-05-16T16:42:23.424123416Z" level=info msg="CreateContainer within sandbox \"26b05e9ff582602012264cab6ce97d5a21c3a7c03f2078b044101175ae1c4649\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}"
May 16 16:42:23.442963 containerd[1577]: time="2025-05-16T16:42:23.442889630Z" level=info msg="Container 0fb8b7dd11619d489c5b465d31c96c1c7e0745fbfdbde96de2e46b243ab5c4e3: CDI devices from CRI Config.CDIDevices: []"
May 16 16:42:23.450496 containerd[1577]: time="2025-05-16T16:42:23.450464741Z" level=info msg="CreateContainer within sandbox \"26b05e9ff582602012264cab6ce97d5a21c3a7c03f2078b044101175ae1c4649\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"0fb8b7dd11619d489c5b465d31c96c1c7e0745fbfdbde96de2e46b243ab5c4e3\""
May 16 16:42:23.450856 containerd[1577]: time="2025-05-16T16:42:23.450832439Z" level=info msg="StartContainer for \"0fb8b7dd11619d489c5b465d31c96c1c7e0745fbfdbde96de2e46b243ab5c4e3\""
May 16 16:42:23.451726 systemd[1]: Started cri-containerd-caf376c8a0e263262787e1ded88cf702c6949a1e5df045e780eec1cde8eb073d.scope - libcontainer container caf376c8a0e263262787e1ded88cf702c6949a1e5df045e780eec1cde8eb073d.
May 16 16:42:23.452208 containerd[1577]: time="2025-05-16T16:42:23.452188639Z" level=info msg="connecting to shim 0fb8b7dd11619d489c5b465d31c96c1c7e0745fbfdbde96de2e46b243ab5c4e3" address="unix:///run/containerd/s/3b06379bbd51d425a81aa66df6e283de1a2c592efcfaded095a51e9640e5c4d8" protocol=ttrpc version=3
May 16 16:42:23.472970 systemd[1]: Started cri-containerd-0fb8b7dd11619d489c5b465d31c96c1c7e0745fbfdbde96de2e46b243ab5c4e3.scope - libcontainer container 0fb8b7dd11619d489c5b465d31c96c1c7e0745fbfdbde96de2e46b243ab5c4e3.
May 16 16:42:23.493756 containerd[1577]: time="2025-05-16T16:42:23.493715091Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-wncvh,Uid:944ff01c-1e5e-43ed-bf75-e92108f31eb3,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"caf376c8a0e263262787e1ded88cf702c6949a1e5df045e780eec1cde8eb073d\""
May 16 16:42:23.496676 containerd[1577]: time="2025-05-16T16:42:23.496632424Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\""
May 16 16:42:23.517164 containerd[1577]: time="2025-05-16T16:42:23.517134996Z" level=info msg="StartContainer for \"0fb8b7dd11619d489c5b465d31c96c1c7e0745fbfdbde96de2e46b243ab5c4e3\" returns successfully"
May 16 16:42:24.034311 kubelet[2673]: I0516 16:42:24.034255 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-bwtlp" podStartSLOduration=1.034240788 podStartE2EDuration="1.034240788s" podCreationTimestamp="2025-05-16 16:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:24.034137133 +0000 UTC m=+8.138022415" watchObservedRunningTime="2025-05-16 16:42:24.034240788 +0000 UTC m=+8.138126070"
May 16 16:42:25.246156 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount276510173.mount: Deactivated successfully.
May 16 16:42:25.590139 containerd[1577]: time="2025-05-16T16:42:25.590016074Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:25.590882 containerd[1577]: time="2025-05-16T16:42:25.590835789Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451"
May 16 16:42:25.592036 containerd[1577]: time="2025-05-16T16:42:25.592002724Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:25.593855 containerd[1577]: time="2025-05-16T16:42:25.593821429Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:25.594380 containerd[1577]: time="2025-05-16T16:42:25.594339670Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 2.097662472s"
May 16 16:42:25.594412 containerd[1577]: time="2025-05-16T16:42:25.594379244Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\""
May 16 16:42:25.599020 containerd[1577]: time="2025-05-16T16:42:25.598955444Z" level=info msg="CreateContainer within sandbox \"caf376c8a0e263262787e1ded88cf702c6949a1e5df045e780eec1cde8eb073d\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
May 16 16:42:25.607542 containerd[1577]: time="2025-05-16T16:42:25.607503521Z" level=info msg="Container 1b355b90d5ea8fb27e1e2c4f65a5995dfda9644aacb8ff81cbf57801ff0aafb3: CDI devices from CRI Config.CDIDevices: []"
May 16 16:42:25.611051 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount291275602.mount: Deactivated successfully.
May 16 16:42:25.612809 containerd[1577]: time="2025-05-16T16:42:25.612770583Z" level=info msg="CreateContainer within sandbox \"caf376c8a0e263262787e1ded88cf702c6949a1e5df045e780eec1cde8eb073d\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"1b355b90d5ea8fb27e1e2c4f65a5995dfda9644aacb8ff81cbf57801ff0aafb3\""
May 16 16:42:25.613169 containerd[1577]: time="2025-05-16T16:42:25.613085153Z" level=info msg="StartContainer for \"1b355b90d5ea8fb27e1e2c4f65a5995dfda9644aacb8ff81cbf57801ff0aafb3\""
May 16 16:42:25.613897 containerd[1577]: time="2025-05-16T16:42:25.613872146Z" level=info msg="connecting to shim 1b355b90d5ea8fb27e1e2c4f65a5995dfda9644aacb8ff81cbf57801ff0aafb3" address="unix:///run/containerd/s/edfff451eb0c577782dc18a67402bbf1d6fa5d42d10ced12edfaee1bee23d63c" protocol=ttrpc version=3
May 16 16:42:25.661709 systemd[1]: Started cri-containerd-1b355b90d5ea8fb27e1e2c4f65a5995dfda9644aacb8ff81cbf57801ff0aafb3.scope - libcontainer container 1b355b90d5ea8fb27e1e2c4f65a5995dfda9644aacb8ff81cbf57801ff0aafb3.
May 16 16:42:25.689375 containerd[1577]: time="2025-05-16T16:42:25.689336310Z" level=info msg="StartContainer for \"1b355b90d5ea8fb27e1e2c4f65a5995dfda9644aacb8ff81cbf57801ff0aafb3\" returns successfully"
May 16 16:42:26.050165 kubelet[2673]: I0516 16:42:26.050091 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-wncvh" podStartSLOduration=0.949933912 podStartE2EDuration="3.050073272s" podCreationTimestamp="2025-05-16 16:42:23 +0000 UTC" firstStartedPulling="2025-05-16 16:42:23.494965603 +0000 UTC m=+7.598850885" lastFinishedPulling="2025-05-16 16:42:25.595104963 +0000 UTC m=+9.698990245" observedRunningTime="2025-05-16 16:42:26.050069015 +0000 UTC m=+10.153954297" watchObservedRunningTime="2025-05-16 16:42:26.050073272 +0000 UTC m=+10.153958554"
May 16 16:42:30.618667 sudo[1790]: pam_unix(sudo:session): session closed for user root
May 16 16:42:30.620383 sshd[1789]: Connection closed by 10.0.0.1 port 57734
May 16 16:42:30.620886 sshd-session[1787]: pam_unix(sshd:session): session closed for user core
May 16 16:42:30.625441 systemd[1]: sshd@6-10.0.0.80:22-10.0.0.1:57734.service: Deactivated successfully.
May 16 16:42:30.628695 systemd[1]: session-7.scope: Deactivated successfully.
May 16 16:42:30.629083 systemd[1]: session-7.scope: Consumed 5.188s CPU time, 227.3M memory peak.
May 16 16:42:30.630772 systemd-logind[1562]: Session 7 logged out. Waiting for processes to exit.
May 16 16:42:30.632873 systemd-logind[1562]: Removed session 7.
May 16 16:42:33.210701 systemd[1]: Created slice kubepods-besteffort-pod06838a96_9c38_4c0f_ab4a_5b8d35f94288.slice - libcontainer container kubepods-besteffort-pod06838a96_9c38_4c0f_ab4a_5b8d35f94288.slice.
May 16 16:42:33.303622 kubelet[2673]: I0516 16:42:33.303551 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/06838a96-9c38-4c0f-ab4a-5b8d35f94288-tigera-ca-bundle\") pod \"calico-typha-7b565c4d48-wh6n6\" (UID: \"06838a96-9c38-4c0f-ab4a-5b8d35f94288\") " pod="calico-system/calico-typha-7b565c4d48-wh6n6" May 16 16:42:33.303622 kubelet[2673]: I0516 16:42:33.303625 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/06838a96-9c38-4c0f-ab4a-5b8d35f94288-typha-certs\") pod \"calico-typha-7b565c4d48-wh6n6\" (UID: \"06838a96-9c38-4c0f-ab4a-5b8d35f94288\") " pod="calico-system/calico-typha-7b565c4d48-wh6n6" May 16 16:42:33.304233 kubelet[2673]: I0516 16:42:33.303645 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kx6lh\" (UniqueName: \"kubernetes.io/projected/06838a96-9c38-4c0f-ab4a-5b8d35f94288-kube-api-access-kx6lh\") pod \"calico-typha-7b565c4d48-wh6n6\" (UID: \"06838a96-9c38-4c0f-ab4a-5b8d35f94288\") " pod="calico-system/calico-typha-7b565c4d48-wh6n6" May 16 16:42:33.516495 containerd[1577]: time="2025-05-16T16:42:33.516186570Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7b565c4d48-wh6n6,Uid:06838a96-9c38-4c0f-ab4a-5b8d35f94288,Namespace:calico-system,Attempt:0,}" May 16 16:42:33.545995 systemd[1]: Created slice kubepods-besteffort-poda623c9c4_0c2d_451b_891b_304f5c903c18.slice - libcontainer container kubepods-besteffort-poda623c9c4_0c2d_451b_891b_304f5c903c18.slice. 
May 16 16:42:33.563644 containerd[1577]: time="2025-05-16T16:42:33.563553000Z" level=info msg="connecting to shim d8e9811212080304ff6b4dd1272eaddc3ff1fe5642d0b69c201102bd8a1f0200" address="unix:///run/containerd/s/052b87042b5dd0dd6f0c608664130ca630cb9949283e207c775fbfd2bf550ded" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:33.588737 systemd[1]: Started cri-containerd-d8e9811212080304ff6b4dd1272eaddc3ff1fe5642d0b69c201102bd8a1f0200.scope - libcontainer container d8e9811212080304ff6b4dd1272eaddc3ff1fe5642d0b69c201102bd8a1f0200. May 16 16:42:33.605807 kubelet[2673]: I0516 16:42:33.605764 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a623c9c4-0c2d-451b-891b-304f5c903c18-flexvol-driver-host\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.605807 kubelet[2673]: I0516 16:42:33.605803 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a623c9c4-0c2d-451b-891b-304f5c903c18-xtables-lock\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.605939 kubelet[2673]: I0516 16:42:33.605820 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7t5r\" (UniqueName: \"kubernetes.io/projected/a623c9c4-0c2d-451b-891b-304f5c903c18-kube-api-access-d7t5r\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.605939 kubelet[2673]: I0516 16:42:33.605836 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a623c9c4-0c2d-451b-891b-304f5c903c18-lib-modules\") pod 
\"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.605939 kubelet[2673]: I0516 16:42:33.605852 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a623c9c4-0c2d-451b-891b-304f5c903c18-cni-log-dir\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.605939 kubelet[2673]: I0516 16:42:33.605867 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a623c9c4-0c2d-451b-891b-304f5c903c18-cni-net-dir\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.605939 kubelet[2673]: I0516 16:42:33.605882 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a623c9c4-0c2d-451b-891b-304f5c903c18-var-lib-calico\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.606078 kubelet[2673]: I0516 16:42:33.605896 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a623c9c4-0c2d-451b-891b-304f5c903c18-policysync\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.606078 kubelet[2673]: I0516 16:42:33.605910 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a623c9c4-0c2d-451b-891b-304f5c903c18-var-run-calico\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " 
pod="calico-system/calico-node-h98bp" May 16 16:42:33.606078 kubelet[2673]: I0516 16:42:33.605930 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a623c9c4-0c2d-451b-891b-304f5c903c18-cni-bin-dir\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.606078 kubelet[2673]: I0516 16:42:33.605950 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a623c9c4-0c2d-451b-891b-304f5c903c18-node-certs\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.606078 kubelet[2673]: I0516 16:42:33.605968 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a623c9c4-0c2d-451b-891b-304f5c903c18-tigera-ca-bundle\") pod \"calico-node-h98bp\" (UID: \"a623c9c4-0c2d-451b-891b-304f5c903c18\") " pod="calico-system/calico-node-h98bp" May 16 16:42:33.638241 containerd[1577]: time="2025-05-16T16:42:33.638192698Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-7b565c4d48-wh6n6,Uid:06838a96-9c38-4c0f-ab4a-5b8d35f94288,Namespace:calico-system,Attempt:0,} returns sandbox id \"d8e9811212080304ff6b4dd1272eaddc3ff1fe5642d0b69c201102bd8a1f0200\"" May 16 16:42:33.639527 containerd[1577]: time="2025-05-16T16:42:33.639488036Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 16 16:42:33.709656 kubelet[2673]: E0516 16:42:33.709616 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.709656 kubelet[2673]: W0516 16:42:33.709644 2673 driver-call.go:149] FlexVolume: driver call failed: executable: 
/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.709822 kubelet[2673]: E0516 16:42:33.709669 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.710756 kubelet[2673]: E0516 16:42:33.710722 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.710756 kubelet[2673]: W0516 16:42:33.710746 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.710832 kubelet[2673]: E0516 16:42:33.710765 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.715998 kubelet[2673]: E0516 16:42:33.715962 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.715998 kubelet[2673]: W0516 16:42:33.715995 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.716065 kubelet[2673]: E0516 16:42:33.716014 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.770615 kubelet[2673]: E0516 16:42:33.769844 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bb6pt" podUID="7f67342a-4c47-4a10-9f69-002db6933f22" May 16 16:42:33.786156 kubelet[2673]: E0516 16:42:33.786028 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.786156 kubelet[2673]: W0516 16:42:33.786052 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.786156 kubelet[2673]: E0516 16:42:33.786071 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.786460 kubelet[2673]: E0516 16:42:33.786433 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.786460 kubelet[2673]: W0516 16:42:33.786445 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.786460 kubelet[2673]: E0516 16:42:33.786454 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.786676 kubelet[2673]: E0516 16:42:33.786595 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.786676 kubelet[2673]: W0516 16:42:33.786602 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.786676 kubelet[2673]: E0516 16:42:33.786609 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.786802 kubelet[2673]: E0516 16:42:33.786769 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.786802 kubelet[2673]: W0516 16:42:33.786781 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.786802 kubelet[2673]: E0516 16:42:33.786788 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.787027 kubelet[2673]: E0516 16:42:33.786990 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.787027 kubelet[2673]: W0516 16:42:33.787020 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.787105 kubelet[2673]: E0516 16:42:33.787045 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.787325 kubelet[2673]: E0516 16:42:33.787296 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.787325 kubelet[2673]: W0516 16:42:33.787308 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.787325 kubelet[2673]: E0516 16:42:33.787317 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.787688 kubelet[2673]: E0516 16:42:33.787661 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.787688 kubelet[2673]: W0516 16:42:33.787672 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.787748 kubelet[2673]: E0516 16:42:33.787697 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.787926 kubelet[2673]: E0516 16:42:33.787908 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.787926 kubelet[2673]: W0516 16:42:33.787926 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.788010 kubelet[2673]: E0516 16:42:33.787934 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.788135 kubelet[2673]: E0516 16:42:33.788119 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.788135 kubelet[2673]: W0516 16:42:33.788129 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.788135 kubelet[2673]: E0516 16:42:33.788136 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.788494 kubelet[2673]: E0516 16:42:33.788270 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.788494 kubelet[2673]: W0516 16:42:33.788284 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.788494 kubelet[2673]: E0516 16:42:33.788291 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.788671 kubelet[2673]: E0516 16:42:33.788639 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.788671 kubelet[2673]: W0516 16:42:33.788661 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.788671 kubelet[2673]: E0516 16:42:33.788670 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.789289 kubelet[2673]: E0516 16:42:33.789237 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.789289 kubelet[2673]: W0516 16:42:33.789251 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.789289 kubelet[2673]: E0516 16:42:33.789260 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.789912 kubelet[2673]: E0516 16:42:33.789483 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.789912 kubelet[2673]: W0516 16:42:33.789491 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.789912 kubelet[2673]: E0516 16:42:33.789499 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.789912 kubelet[2673]: E0516 16:42:33.789761 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.789912 kubelet[2673]: W0516 16:42:33.789769 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.789912 kubelet[2673]: E0516 16:42:33.789777 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.790081 kubelet[2673]: E0516 16:42:33.790048 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.790081 kubelet[2673]: W0516 16:42:33.790056 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.790081 kubelet[2673]: E0516 16:42:33.790065 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.790296 kubelet[2673]: E0516 16:42:33.790276 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.790296 kubelet[2673]: W0516 16:42:33.790289 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.790296 kubelet[2673]: E0516 16:42:33.790297 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.790589 kubelet[2673]: E0516 16:42:33.790529 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.790589 kubelet[2673]: W0516 16:42:33.790544 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.790589 kubelet[2673]: E0516 16:42:33.790556 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.790928 kubelet[2673]: E0516 16:42:33.790909 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.790928 kubelet[2673]: W0516 16:42:33.790921 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.790928 kubelet[2673]: E0516 16:42:33.790929 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.791112 kubelet[2673]: E0516 16:42:33.791077 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.791112 kubelet[2673]: W0516 16:42:33.791087 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.791112 kubelet[2673]: E0516 16:42:33.791094 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.791414 kubelet[2673]: E0516 16:42:33.791389 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.791414 kubelet[2673]: W0516 16:42:33.791416 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.791557 kubelet[2673]: E0516 16:42:33.791426 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.797970 update_engine[1564]: I20250516 16:42:33.797868 1564 update_attempter.cc:509] Updating boot flags... 
May 16 16:42:33.808415 kubelet[2673]: E0516 16:42:33.807964 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.808415 kubelet[2673]: W0516 16:42:33.807986 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.808415 kubelet[2673]: E0516 16:42:33.808012 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.808415 kubelet[2673]: I0516 16:42:33.808043 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7f67342a-4c47-4a10-9f69-002db6933f22-registration-dir\") pod \"csi-node-driver-bb6pt\" (UID: \"7f67342a-4c47-4a10-9f69-002db6933f22\") " pod="calico-system/csi-node-driver-bb6pt" May 16 16:42:33.808415 kubelet[2673]: E0516 16:42:33.808248 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.808415 kubelet[2673]: W0516 16:42:33.808256 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.808415 kubelet[2673]: E0516 16:42:33.808264 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.808415 kubelet[2673]: I0516 16:42:33.808287 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/7f67342a-4c47-4a10-9f69-002db6933f22-varrun\") pod \"csi-node-driver-bb6pt\" (UID: \"7f67342a-4c47-4a10-9f69-002db6933f22\") " pod="calico-system/csi-node-driver-bb6pt" May 16 16:42:33.808835 kubelet[2673]: E0516 16:42:33.808819 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.808835 kubelet[2673]: W0516 16:42:33.808833 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.808904 kubelet[2673]: E0516 16:42:33.808842 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.810885 kubelet[2673]: I0516 16:42:33.810850 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7f67342a-4c47-4a10-9f69-002db6933f22-kubelet-dir\") pod \"csi-node-driver-bb6pt\" (UID: \"7f67342a-4c47-4a10-9f69-002db6933f22\") " pod="calico-system/csi-node-driver-bb6pt" May 16 16:42:33.811383 kubelet[2673]: E0516 16:42:33.811362 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.811383 kubelet[2673]: W0516 16:42:33.811380 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.811445 kubelet[2673]: E0516 16:42:33.811397 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.811689 kubelet[2673]: I0516 16:42:33.811516 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brv7v\" (UniqueName: \"kubernetes.io/projected/7f67342a-4c47-4a10-9f69-002db6933f22-kube-api-access-brv7v\") pod \"csi-node-driver-bb6pt\" (UID: \"7f67342a-4c47-4a10-9f69-002db6933f22\") " pod="calico-system/csi-node-driver-bb6pt" May 16 16:42:33.812137 kubelet[2673]: E0516 16:42:33.812120 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.812137 kubelet[2673]: W0516 16:42:33.812133 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.812189 kubelet[2673]: E0516 16:42:33.812143 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.812343 kubelet[2673]: E0516 16:42:33.812329 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.812343 kubelet[2673]: W0516 16:42:33.812339 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.812395 kubelet[2673]: E0516 16:42:33.812346 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.812538 kubelet[2673]: E0516 16:42:33.812524 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.812538 kubelet[2673]: W0516 16:42:33.812534 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.812607 kubelet[2673]: E0516 16:42:33.812542 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.812750 kubelet[2673]: E0516 16:42:33.812736 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.812750 kubelet[2673]: W0516 16:42:33.812746 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.812802 kubelet[2673]: E0516 16:42:33.812753 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.812928 kubelet[2673]: E0516 16:42:33.812915 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.812928 kubelet[2673]: W0516 16:42:33.812924 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.812975 kubelet[2673]: E0516 16:42:33.812932 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.812996 kubelet[2673]: I0516 16:42:33.812970 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7f67342a-4c47-4a10-9f69-002db6933f22-socket-dir\") pod \"csi-node-driver-bb6pt\" (UID: \"7f67342a-4c47-4a10-9f69-002db6933f22\") " pod="calico-system/csi-node-driver-bb6pt" May 16 16:42:33.813105 kubelet[2673]: E0516 16:42:33.813090 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.813105 kubelet[2673]: W0516 16:42:33.813101 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.813152 kubelet[2673]: E0516 16:42:33.813108 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.813306 kubelet[2673]: E0516 16:42:33.813290 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.813306 kubelet[2673]: W0516 16:42:33.813301 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.813353 kubelet[2673]: E0516 16:42:33.813308 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.813491 kubelet[2673]: E0516 16:42:33.813476 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.813491 kubelet[2673]: W0516 16:42:33.813487 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.813535 kubelet[2673]: E0516 16:42:33.813495 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.813714 kubelet[2673]: E0516 16:42:33.813700 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.813714 kubelet[2673]: W0516 16:42:33.813711 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.813772 kubelet[2673]: E0516 16:42:33.813719 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.813914 kubelet[2673]: E0516 16:42:33.813901 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.813914 kubelet[2673]: W0516 16:42:33.813910 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.813964 kubelet[2673]: E0516 16:42:33.813918 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.814210 kubelet[2673]: E0516 16:42:33.814192 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.814240 kubelet[2673]: W0516 16:42:33.814208 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.814240 kubelet[2673]: E0516 16:42:33.814226 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.850331 containerd[1577]: time="2025-05-16T16:42:33.850285040Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h98bp,Uid:a623c9c4-0c2d-451b-891b-304f5c903c18,Namespace:calico-system,Attempt:0,}" May 16 16:42:33.892024 containerd[1577]: time="2025-05-16T16:42:33.891425814Z" level=info msg="connecting to shim 3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf" address="unix:///run/containerd/s/b3ad7dc1c3820a19d2cd3c1e1748a709560c3152b9cf47a7577f817424b8235c" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:33.928498 kubelet[2673]: E0516 16:42:33.928460 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.928773 kubelet[2673]: W0516 16:42:33.928489 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.928807 kubelet[2673]: E0516 16:42:33.928788 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.931900 kubelet[2673]: E0516 16:42:33.931871 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.932595 kubelet[2673]: W0516 16:42:33.932062 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.932595 kubelet[2673]: E0516 16:42:33.932078 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.935163 kubelet[2673]: E0516 16:42:33.935145 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.935163 kubelet[2673]: W0516 16:42:33.935160 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.935232 kubelet[2673]: E0516 16:42:33.935180 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.938587 kubelet[2673]: E0516 16:42:33.936783 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.938587 kubelet[2673]: W0516 16:42:33.936795 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.938587 kubelet[2673]: E0516 16:42:33.936807 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.938587 kubelet[2673]: E0516 16:42:33.937421 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.938587 kubelet[2673]: W0516 16:42:33.937429 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.938587 kubelet[2673]: E0516 16:42:33.937438 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.938888 kubelet[2673]: E0516 16:42:33.938871 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.941543 kubelet[2673]: W0516 16:42:33.938887 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.941671 kubelet[2673]: E0516 16:42:33.941651 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.947696 kubelet[2673]: E0516 16:42:33.947644 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.947696 kubelet[2673]: W0516 16:42:33.947667 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.947696 kubelet[2673]: E0516 16:42:33.947685 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.948165 kubelet[2673]: E0516 16:42:33.948147 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.948165 kubelet[2673]: W0516 16:42:33.948161 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.948234 kubelet[2673]: E0516 16:42:33.948188 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.951281 kubelet[2673]: E0516 16:42:33.951258 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.951281 kubelet[2673]: W0516 16:42:33.951274 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.951557 kubelet[2673]: E0516 16:42:33.951532 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.953825 kubelet[2673]: E0516 16:42:33.953804 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.953876 kubelet[2673]: W0516 16:42:33.953842 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.953876 kubelet[2673]: E0516 16:42:33.953854 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.954854 kubelet[2673]: E0516 16:42:33.954837 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.954854 kubelet[2673]: W0516 16:42:33.954851 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.954988 kubelet[2673]: E0516 16:42:33.954972 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.955710 kubelet[2673]: E0516 16:42:33.955694 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.955746 kubelet[2673]: W0516 16:42:33.955707 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.956600 kubelet[2673]: E0516 16:42:33.955825 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.959595 kubelet[2673]: E0516 16:42:33.958182 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.959595 kubelet[2673]: W0516 16:42:33.958194 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.959595 kubelet[2673]: E0516 16:42:33.958209 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.961302 kubelet[2673]: E0516 16:42:33.961259 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.961302 kubelet[2673]: W0516 16:42:33.961274 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.961302 kubelet[2673]: E0516 16:42:33.961286 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.968591 kubelet[2673]: E0516 16:42:33.967263 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.968591 kubelet[2673]: W0516 16:42:33.967288 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.969080 kubelet[2673]: E0516 16:42:33.969047 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.972784 kubelet[2673]: E0516 16:42:33.972649 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.972784 kubelet[2673]: W0516 16:42:33.972669 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.972784 kubelet[2673]: E0516 16:42:33.972687 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.974664 kubelet[2673]: E0516 16:42:33.974639 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.974664 kubelet[2673]: W0516 16:42:33.974655 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.974755 kubelet[2673]: E0516 16:42:33.974682 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.976582 kubelet[2673]: E0516 16:42:33.974854 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.976582 kubelet[2673]: W0516 16:42:33.974863 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.976582 kubelet[2673]: E0516 16:42:33.974873 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.976582 kubelet[2673]: E0516 16:42:33.975549 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.977814 kubelet[2673]: W0516 16:42:33.975559 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.977814 kubelet[2673]: E0516 16:42:33.977782 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.978734 kubelet[2673]: E0516 16:42:33.978717 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.978734 kubelet[2673]: W0516 16:42:33.978731 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.978734 kubelet[2673]: E0516 16:42:33.978740 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.981586 kubelet[2673]: E0516 16:42:33.980529 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.981586 kubelet[2673]: W0516 16:42:33.980552 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.981586 kubelet[2673]: E0516 16:42:33.980561 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.982054 kubelet[2673]: E0516 16:42:33.981829 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.982054 kubelet[2673]: W0516 16:42:33.982003 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.982054 kubelet[2673]: E0516 16:42:33.982034 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.986009 systemd[1]: Started cri-containerd-3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf.scope - libcontainer container 3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf. May 16 16:42:33.989557 kubelet[2673]: E0516 16:42:33.988661 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.989557 kubelet[2673]: W0516 16:42:33.988693 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.989557 kubelet[2673]: E0516 16:42:33.988717 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:33.991586 kubelet[2673]: E0516 16:42:33.991110 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.991586 kubelet[2673]: W0516 16:42:33.991124 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.991586 kubelet[2673]: E0516 16:42:33.991134 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:33.994602 kubelet[2673]: E0516 16:42:33.994222 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:33.994602 kubelet[2673]: W0516 16:42:33.994239 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:33.994602 kubelet[2673]: E0516 16:42:33.994251 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:34.016176 kubelet[2673]: E0516 16:42:34.013636 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:34.016176 kubelet[2673]: W0516 16:42:34.013674 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:34.016176 kubelet[2673]: E0516 16:42:34.013691 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:34.090618 containerd[1577]: time="2025-05-16T16:42:34.090496675Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-h98bp,Uid:a623c9c4-0c2d-451b-891b-304f5c903c18,Namespace:calico-system,Attempt:0,} returns sandbox id \"3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf\"" May 16 16:42:35.619471 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount529777025.mount: Deactivated successfully. 
May 16 16:42:36.000101 kubelet[2673]: E0516 16:42:36.000032 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bb6pt" podUID="7f67342a-4c47-4a10-9f69-002db6933f22" May 16 16:42:36.814981 containerd[1577]: time="2025-05-16T16:42:36.814927669Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:36.815793 containerd[1577]: time="2025-05-16T16:42:36.815736396Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669" May 16 16:42:36.817109 containerd[1577]: time="2025-05-16T16:42:36.817065226Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:36.819450 containerd[1577]: time="2025-05-16T16:42:36.819405332Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:36.820020 containerd[1577]: time="2025-05-16T16:42:36.819976702Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 3.180442912s" May 16 16:42:36.820020 containerd[1577]: time="2025-05-16T16:42:36.820014874Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference 
\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\"" May 16 16:42:36.821111 containerd[1577]: time="2025-05-16T16:42:36.821082135Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 16 16:42:36.835094 containerd[1577]: time="2025-05-16T16:42:36.835050355Z" level=info msg="CreateContainer within sandbox \"d8e9811212080304ff6b4dd1272eaddc3ff1fe5642d0b69c201102bd8a1f0200\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 16 16:42:36.848772 containerd[1577]: time="2025-05-16T16:42:36.848725165Z" level=info msg="Container 619e72fcaa67e0c5fd19adcbce0c3ae71a0ab45158057a5e122037a60bc42e03: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:36.859202 containerd[1577]: time="2025-05-16T16:42:36.859152278Z" level=info msg="CreateContainer within sandbox \"d8e9811212080304ff6b4dd1272eaddc3ff1fe5642d0b69c201102bd8a1f0200\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"619e72fcaa67e0c5fd19adcbce0c3ae71a0ab45158057a5e122037a60bc42e03\"" May 16 16:42:36.860644 containerd[1577]: time="2025-05-16T16:42:36.860548465Z" level=info msg="StartContainer for \"619e72fcaa67e0c5fd19adcbce0c3ae71a0ab45158057a5e122037a60bc42e03\"" May 16 16:42:36.862116 containerd[1577]: time="2025-05-16T16:42:36.862071680Z" level=info msg="connecting to shim 619e72fcaa67e0c5fd19adcbce0c3ae71a0ab45158057a5e122037a60bc42e03" address="unix:///run/containerd/s/052b87042b5dd0dd6f0c608664130ca630cb9949283e207c775fbfd2bf550ded" protocol=ttrpc version=3 May 16 16:42:36.889795 systemd[1]: Started cri-containerd-619e72fcaa67e0c5fd19adcbce0c3ae71a0ab45158057a5e122037a60bc42e03.scope - libcontainer container 619e72fcaa67e0c5fd19adcbce0c3ae71a0ab45158057a5e122037a60bc42e03. 
May 16 16:42:36.941133 containerd[1577]: time="2025-05-16T16:42:36.941079400Z" level=info msg="StartContainer for \"619e72fcaa67e0c5fd19adcbce0c3ae71a0ab45158057a5e122037a60bc42e03\" returns successfully" May 16 16:42:37.101118 kubelet[2673]: I0516 16:42:37.099076 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-7b565c4d48-wh6n6" podStartSLOduration=0.917430028 podStartE2EDuration="4.099062139s" podCreationTimestamp="2025-05-16 16:42:33 +0000 UTC" firstStartedPulling="2025-05-16 16:42:33.63927207 +0000 UTC m=+17.743157352" lastFinishedPulling="2025-05-16 16:42:36.820904181 +0000 UTC m=+20.924789463" observedRunningTime="2025-05-16 16:42:37.097466499 +0000 UTC m=+21.201351781" watchObservedRunningTime="2025-05-16 16:42:37.099062139 +0000 UTC m=+21.202947421" May 16 16:42:37.113438 kubelet[2673]: E0516 16:42:37.113378 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.113438 kubelet[2673]: W0516 16:42:37.113425 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.113609 kubelet[2673]: E0516 16:42:37.113452 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.114407 kubelet[2673]: E0516 16:42:37.114383 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.114407 kubelet[2673]: W0516 16:42:37.114402 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.114480 kubelet[2673]: E0516 16:42:37.114414 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.114664 kubelet[2673]: E0516 16:42:37.114633 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.114664 kubelet[2673]: W0516 16:42:37.114652 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.114664 kubelet[2673]: E0516 16:42:37.114663 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.115747 kubelet[2673]: E0516 16:42:37.115629 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.115747 kubelet[2673]: W0516 16:42:37.115649 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.115747 kubelet[2673]: E0516 16:42:37.115661 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.115969 kubelet[2673]: E0516 16:42:37.115937 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.115969 kubelet[2673]: W0516 16:42:37.115956 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.116228 kubelet[2673]: E0516 16:42:37.115974 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.116683 kubelet[2673]: E0516 16:42:37.116655 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.116683 kubelet[2673]: W0516 16:42:37.116678 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.116915 kubelet[2673]: E0516 16:42:37.116690 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.116915 kubelet[2673]: E0516 16:42:37.116893 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.116915 kubelet[2673]: W0516 16:42:37.116902 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.116915 kubelet[2673]: E0516 16:42:37.116911 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.117206 kubelet[2673]: E0516 16:42:37.117081 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.117206 kubelet[2673]: W0516 16:42:37.117089 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.117206 kubelet[2673]: E0516 16:42:37.117097 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.118102 kubelet[2673]: E0516 16:42:37.118077 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.118102 kubelet[2673]: W0516 16:42:37.118097 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.118268 kubelet[2673]: E0516 16:42:37.118111 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.118388 kubelet[2673]: E0516 16:42:37.118300 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.118388 kubelet[2673]: W0516 16:42:37.118324 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.118388 kubelet[2673]: E0516 16:42:37.118343 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.118556 kubelet[2673]: E0516 16:42:37.118522 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.118556 kubelet[2673]: W0516 16:42:37.118537 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.118556 kubelet[2673]: E0516 16:42:37.118546 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.118821 kubelet[2673]: E0516 16:42:37.118783 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.118821 kubelet[2673]: W0516 16:42:37.118803 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.118821 kubelet[2673]: E0516 16:42:37.118815 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.119025 kubelet[2673]: E0516 16:42:37.118998 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.119025 kubelet[2673]: W0516 16:42:37.119019 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.119176 kubelet[2673]: E0516 16:42:37.119028 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.119274 kubelet[2673]: E0516 16:42:37.119212 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.119274 kubelet[2673]: W0516 16:42:37.119230 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.119274 kubelet[2673]: E0516 16:42:37.119240 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.119479 kubelet[2673]: E0516 16:42:37.119433 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.119479 kubelet[2673]: W0516 16:42:37.119447 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.119479 kubelet[2673]: E0516 16:42:37.119457 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.162109 kubelet[2673]: E0516 16:42:37.162060 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.162109 kubelet[2673]: W0516 16:42:37.162082 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.162109 kubelet[2673]: E0516 16:42:37.162102 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.162439 kubelet[2673]: E0516 16:42:37.162411 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.162439 kubelet[2673]: W0516 16:42:37.162423 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.162439 kubelet[2673]: E0516 16:42:37.162434 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.162686 kubelet[2673]: E0516 16:42:37.162669 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.162686 kubelet[2673]: W0516 16:42:37.162678 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.162686 kubelet[2673]: E0516 16:42:37.162685 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.163024 kubelet[2673]: E0516 16:42:37.162985 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.163024 kubelet[2673]: W0516 16:42:37.163014 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.163104 kubelet[2673]: E0516 16:42:37.163034 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.163264 kubelet[2673]: E0516 16:42:37.163242 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.163264 kubelet[2673]: W0516 16:42:37.163253 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.163264 kubelet[2673]: E0516 16:42:37.163260 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.163494 kubelet[2673]: E0516 16:42:37.163475 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.163494 kubelet[2673]: W0516 16:42:37.163485 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.163494 kubelet[2673]: E0516 16:42:37.163493 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.163716 kubelet[2673]: E0516 16:42:37.163698 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.163716 kubelet[2673]: W0516 16:42:37.163708 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.163716 kubelet[2673]: E0516 16:42:37.163716 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.163896 kubelet[2673]: E0516 16:42:37.163879 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.163896 kubelet[2673]: W0516 16:42:37.163889 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.163896 kubelet[2673]: E0516 16:42:37.163896 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.164072 kubelet[2673]: E0516 16:42:37.164055 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.164072 kubelet[2673]: W0516 16:42:37.164065 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.164072 kubelet[2673]: E0516 16:42:37.164072 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.164255 kubelet[2673]: E0516 16:42:37.164240 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.164293 kubelet[2673]: W0516 16:42:37.164269 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.164293 kubelet[2673]: E0516 16:42:37.164277 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.164456 kubelet[2673]: E0516 16:42:37.164440 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.164456 kubelet[2673]: W0516 16:42:37.164449 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.164456 kubelet[2673]: E0516 16:42:37.164457 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.164671 kubelet[2673]: E0516 16:42:37.164653 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.164671 kubelet[2673]: W0516 16:42:37.164663 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.164671 kubelet[2673]: E0516 16:42:37.164670 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.164969 kubelet[2673]: E0516 16:42:37.164942 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.164969 kubelet[2673]: W0516 16:42:37.164961 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.165044 kubelet[2673]: E0516 16:42:37.164971 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.165279 kubelet[2673]: E0516 16:42:37.165248 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.165279 kubelet[2673]: W0516 16:42:37.165268 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.165353 kubelet[2673]: E0516 16:42:37.165287 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.165509 kubelet[2673]: E0516 16:42:37.165491 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.165509 kubelet[2673]: W0516 16:42:37.165501 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.165509 kubelet[2673]: E0516 16:42:37.165508 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.165729 kubelet[2673]: E0516 16:42:37.165713 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.165729 kubelet[2673]: W0516 16:42:37.165723 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.165729 kubelet[2673]: E0516 16:42:37.165730 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:37.165984 kubelet[2673]: E0516 16:42:37.165965 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.165984 kubelet[2673]: W0516 16:42:37.165977 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.165984 kubelet[2673]: E0516 16:42:37.165985 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:37.166168 kubelet[2673]: E0516 16:42:37.166152 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:37.166168 kubelet[2673]: W0516 16:42:37.166161 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:37.166168 kubelet[2673]: E0516 16:42:37.166169 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:38.003467 kubelet[2673]: E0516 16:42:38.003400 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bb6pt" podUID="7f67342a-4c47-4a10-9f69-002db6933f22" May 16 16:42:38.072955 kubelet[2673]: I0516 16:42:38.072925 2673 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 16 16:42:38.088812 containerd[1577]: time="2025-05-16T16:42:38.088767094Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:38.089617 containerd[1577]: time="2025-05-16T16:42:38.089560843Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619" May 16 16:42:38.090762 containerd[1577]: time="2025-05-16T16:42:38.090722591Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:38.092470 containerd[1577]: time="2025-05-16T16:42:38.092432676Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:38.092962 containerd[1577]: time="2025-05-16T16:42:38.092932092Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size 
\"5934282\" in 1.271815543s" May 16 16:42:38.092962 containerd[1577]: time="2025-05-16T16:42:38.092958051Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\"" May 16 16:42:38.097802 containerd[1577]: time="2025-05-16T16:42:38.097763088Z" level=info msg="CreateContainer within sandbox \"3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 16 16:42:38.109459 containerd[1577]: time="2025-05-16T16:42:38.109407814Z" level=info msg="Container cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:38.117830 containerd[1577]: time="2025-05-16T16:42:38.117775149Z" level=info msg="CreateContainer within sandbox \"3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9\"" May 16 16:42:38.118398 containerd[1577]: time="2025-05-16T16:42:38.118338084Z" level=info msg="StartContainer for \"cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9\"" May 16 16:42:38.119894 containerd[1577]: time="2025-05-16T16:42:38.119859877Z" level=info msg="connecting to shim cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9" address="unix:///run/containerd/s/b3ad7dc1c3820a19d2cd3c1e1748a709560c3152b9cf47a7577f817424b8235c" protocol=ttrpc version=3 May 16 16:42:38.128955 kubelet[2673]: E0516 16:42:38.128927 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.130902 kubelet[2673]: W0516 16:42:38.129369 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: 
executable file not found in $PATH, output: "" May 16 16:42:38.130902 kubelet[2673]: E0516 16:42:38.129454 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:38.131191 kubelet[2673]: E0516 16:42:38.131051 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.131191 kubelet[2673]: W0516 16:42:38.131065 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.131191 kubelet[2673]: E0516 16:42:38.131077 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:38.133551 kubelet[2673]: E0516 16:42:38.133535 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.133712 kubelet[2673]: W0516 16:42:38.133627 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.133712 kubelet[2673]: E0516 16:42:38.133651 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:38.134088 kubelet[2673]: E0516 16:42:38.134041 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.134088 kubelet[2673]: W0516 16:42:38.134056 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.134088 kubelet[2673]: E0516 16:42:38.134067 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:38.135635 kubelet[2673]: E0516 16:42:38.135618 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.136060 kubelet[2673]: W0516 16:42:38.135661 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.136060 kubelet[2673]: E0516 16:42:38.135674 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:38.137792 kubelet[2673]: E0516 16:42:38.137777 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.137987 kubelet[2673]: W0516 16:42:38.137865 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.137987 kubelet[2673]: E0516 16:42:38.137883 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:38.138386 kubelet[2673]: E0516 16:42:38.138220 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.138386 kubelet[2673]: W0516 16:42:38.138246 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.138386 kubelet[2673]: E0516 16:42:38.138269 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:38.138584 kubelet[2673]: E0516 16:42:38.138558 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.138830 kubelet[2673]: W0516 16:42:38.138699 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.138830 kubelet[2673]: E0516 16:42:38.138715 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:38.139017 kubelet[2673]: E0516 16:42:38.139000 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.139176 kubelet[2673]: W0516 16:42:38.139073 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.139176 kubelet[2673]: E0516 16:42:38.139087 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:38.139384 kubelet[2673]: E0516 16:42:38.139368 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.139604 kubelet[2673]: W0516 16:42:38.139438 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.139604 kubelet[2673]: E0516 16:42:38.139453 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:38.139829 kubelet[2673]: E0516 16:42:38.139816 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.139895 kubelet[2673]: W0516 16:42:38.139882 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.140000 kubelet[2673]: E0516 16:42:38.139972 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:38.140679 kubelet[2673]: E0516 16:42:38.140495 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.140679 kubelet[2673]: W0516 16:42:38.140508 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.140679 kubelet[2673]: E0516 16:42:38.140521 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 16 16:42:38.140788 systemd[1]: Started cri-containerd-cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9.scope - libcontainer container cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9. May 16 16:42:38.141280 kubelet[2673]: E0516 16:42:38.141161 2673 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 16 16:42:38.141280 kubelet[2673]: W0516 16:42:38.141185 2673 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 16 16:42:38.141280 kubelet[2673]: E0516 16:42:38.141206 2673 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 16 16:42:38.184375 containerd[1577]: time="2025-05-16T16:42:38.184318261Z" level=info msg="StartContainer for \"cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9\" returns successfully" May 16 16:42:38.190543 systemd[1]: cri-containerd-cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9.scope: Deactivated successfully. 
May 16 16:42:38.192496 containerd[1577]: time="2025-05-16T16:42:38.192414236Z" level=info msg="received exit event container_id:\"cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9\" id:\"cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9\" pid:3395 exited_at:{seconds:1747413758 nanos:192029115}" May 16 16:42:38.192496 containerd[1577]: time="2025-05-16T16:42:38.192456716Z" level=info msg="TaskExit event in podsandbox handler container_id:\"cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9\" id:\"cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9\" pid:3395 exited_at:{seconds:1747413758 nanos:192029115}" May 16 16:42:38.215389 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-cf7f6d01da869c6c01395b96cefd28fcd07355fc32199903125b696516223ea9-rootfs.mount: Deactivated successfully. May 16 16:42:39.078016 containerd[1577]: time="2025-05-16T16:42:39.077784675Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 16 16:42:40.000836 kubelet[2673]: E0516 16:42:40.000761 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bb6pt" podUID="7f67342a-4c47-4a10-9f69-002db6933f22" May 16 16:42:41.862597 containerd[1577]: time="2025-05-16T16:42:41.862524812Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:41.863418 containerd[1577]: time="2025-05-16T16:42:41.863364176Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568" May 16 16:42:41.864887 containerd[1577]: time="2025-05-16T16:42:41.864819414Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" 
labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:41.866626 containerd[1577]: time="2025-05-16T16:42:41.866584643Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:41.867148 containerd[1577]: time="2025-05-16T16:42:41.867120207Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 2.789297621s" May 16 16:42:41.867148 containerd[1577]: time="2025-05-16T16:42:41.867147839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\"" May 16 16:42:41.872089 containerd[1577]: time="2025-05-16T16:42:41.872022979Z" level=info msg="CreateContainer within sandbox \"3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 16 16:42:41.880015 containerd[1577]: time="2025-05-16T16:42:41.879974746Z" level=info msg="Container ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:41.889342 containerd[1577]: time="2025-05-16T16:42:41.889295650Z" level=info msg="CreateContainer within sandbox \"3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67\"" May 16 16:42:41.889902 containerd[1577]: time="2025-05-16T16:42:41.889862953Z" level=info msg="StartContainer for 
\"ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67\"" May 16 16:42:41.891323 containerd[1577]: time="2025-05-16T16:42:41.891297122Z" level=info msg="connecting to shim ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67" address="unix:///run/containerd/s/b3ad7dc1c3820a19d2cd3c1e1748a709560c3152b9cf47a7577f817424b8235c" protocol=ttrpc version=3 May 16 16:42:41.921879 systemd[1]: Started cri-containerd-ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67.scope - libcontainer container ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67. May 16 16:42:41.965165 containerd[1577]: time="2025-05-16T16:42:41.965049996Z" level=info msg="StartContainer for \"ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67\" returns successfully" May 16 16:42:42.002383 kubelet[2673]: E0516 16:42:42.001293 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-bb6pt" podUID="7f67342a-4c47-4a10-9f69-002db6933f22" May 16 16:42:42.932920 containerd[1577]: time="2025-05-16T16:42:42.932858973Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 16 16:42:42.936087 systemd[1]: cri-containerd-ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67.scope: Deactivated successfully. May 16 16:42:42.936489 systemd[1]: cri-containerd-ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67.scope: Consumed 519ms CPU time, 177M memory peak, 3.8M read from disk, 170.9M written to disk. 
May 16 16:42:42.937218 containerd[1577]: time="2025-05-16T16:42:42.937154997Z" level=info msg="received exit event container_id:\"ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67\" id:\"ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67\" pid:3475 exited_at:{seconds:1747413762 nanos:936921980}" May 16 16:42:42.937294 containerd[1577]: time="2025-05-16T16:42:42.937227012Z" level=info msg="TaskExit event in podsandbox handler container_id:\"ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67\" id:\"ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67\" pid:3475 exited_at:{seconds:1747413762 nanos:936921980}" May 16 16:42:42.939818 kubelet[2673]: I0516 16:42:42.939787 2673 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 16 16:42:42.967014 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ac06050eace7f1db7b8a6a7a39d5bdfacfbe413ca7e2460a366877e1910f9e67-rootfs.mount: Deactivated successfully. May 16 16:42:42.979248 systemd[1]: Created slice kubepods-burstable-pod6fc31b70_0736_43f1_8329_54c347d76131.slice - libcontainer container kubepods-burstable-pod6fc31b70_0736_43f1_8329_54c347d76131.slice. 
May 16 16:42:43.004845 kubelet[2673]: I0516 16:42:43.004730 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6fc31b70-0736-43f1-8329-54c347d76131-config-volume\") pod \"coredns-674b8bbfcf-z584z\" (UID: \"6fc31b70-0736-43f1-8329-54c347d76131\") " pod="kube-system/coredns-674b8bbfcf-z584z" May 16 16:42:43.004845 kubelet[2673]: I0516 16:42:43.004806 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw9qc\" (UniqueName: \"kubernetes.io/projected/6fc31b70-0736-43f1-8329-54c347d76131-kube-api-access-fw9qc\") pod \"coredns-674b8bbfcf-z584z\" (UID: \"6fc31b70-0736-43f1-8329-54c347d76131\") " pod="kube-system/coredns-674b8bbfcf-z584z" May 16 16:42:43.175381 systemd[1]: Created slice kubepods-burstable-pod9f415d75_aaf3_47ae_a8b2_f1ffaa36e1c4.slice - libcontainer container kubepods-burstable-pod9f415d75_aaf3_47ae_a8b2_f1ffaa36e1c4.slice. May 16 16:42:43.200428 systemd[1]: Created slice kubepods-besteffort-podb11afc1c_9b60_4f5c_a29d_78fd53879323.slice - libcontainer container kubepods-besteffort-podb11afc1c_9b60_4f5c_a29d_78fd53879323.slice. 
May 16 16:42:43.207369 kubelet[2673]: I0516 16:42:43.206968 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q2l4\" (UniqueName: \"kubernetes.io/projected/9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4-kube-api-access-7q2l4\") pod \"coredns-674b8bbfcf-krmsj\" (UID: \"9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4\") " pod="kube-system/coredns-674b8bbfcf-krmsj" May 16 16:42:43.207727 kubelet[2673]: I0516 16:42:43.207668 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4-config-volume\") pod \"coredns-674b8bbfcf-krmsj\" (UID: \"9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4\") " pod="kube-system/coredns-674b8bbfcf-krmsj" May 16 16:42:43.212820 systemd[1]: Created slice kubepods-besteffort-podafac284b_1d53_4f0b_b60b_8d046a842c57.slice - libcontainer container kubepods-besteffort-podafac284b_1d53_4f0b_b60b_8d046a842c57.slice. May 16 16:42:43.223064 systemd[1]: Created slice kubepods-besteffort-pod3208f219_fe99_4b68_b23c_e5f9b103f8b4.slice - libcontainer container kubepods-besteffort-pod3208f219_fe99_4b68_b23c_e5f9b103f8b4.slice. May 16 16:42:43.229055 systemd[1]: Created slice kubepods-besteffort-pode3efb2f6_048d_46a8_9ddf_da2b1ef69bfa.slice - libcontainer container kubepods-besteffort-pode3efb2f6_048d_46a8_9ddf_da2b1ef69bfa.slice. May 16 16:42:43.235228 systemd[1]: Created slice kubepods-besteffort-pod544c5973_02d3_447d_9edb_08e215486937.slice - libcontainer container kubepods-besteffort-pod544c5973_02d3_447d_9edb_08e215486937.slice. 
May 16 16:42:43.297748 containerd[1577]: time="2025-05-16T16:42:43.297693534Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-z584z,Uid:6fc31b70-0736-43f1-8329-54c347d76131,Namespace:kube-system,Attempt:0,}" May 16 16:42:43.308076 kubelet[2673]: I0516 16:42:43.308031 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3208f219-fe99-4b68-b23c-e5f9b103f8b4-config\") pod \"goldmane-78d55f7ddc-fcfrt\" (UID: \"3208f219-fe99-4b68-b23c-e5f9b103f8b4\") " pod="calico-system/goldmane-78d55f7ddc-fcfrt" May 16 16:42:43.308076 kubelet[2673]: I0516 16:42:43.308073 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps666\" (UniqueName: \"kubernetes.io/projected/b11afc1c-9b60-4f5c-a29d-78fd53879323-kube-api-access-ps666\") pod \"calico-apiserver-cd8784d74-2wgvs\" (UID: \"b11afc1c-9b60-4f5c-a29d-78fd53879323\") " pod="calico-apiserver/calico-apiserver-cd8784d74-2wgvs" May 16 16:42:43.308235 kubelet[2673]: I0516 16:42:43.308093 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7llr\" (UniqueName: \"kubernetes.io/projected/544c5973-02d3-447d-9edb-08e215486937-kube-api-access-z7llr\") pod \"whisker-65867ff4b5-ngxpz\" (UID: \"544c5973-02d3-447d-9edb-08e215486937\") " pod="calico-system/whisker-65867ff4b5-ngxpz" May 16 16:42:43.308235 kubelet[2673]: I0516 16:42:43.308107 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7ztc\" (UniqueName: \"kubernetes.io/projected/3208f219-fe99-4b68-b23c-e5f9b103f8b4-kube-api-access-q7ztc\") pod \"goldmane-78d55f7ddc-fcfrt\" (UID: \"3208f219-fe99-4b68-b23c-e5f9b103f8b4\") " pod="calico-system/goldmane-78d55f7ddc-fcfrt" May 16 16:42:43.308235 kubelet[2673]: I0516 16:42:43.308123 2673 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544c5973-02d3-447d-9edb-08e215486937-whisker-ca-bundle\") pod \"whisker-65867ff4b5-ngxpz\" (UID: \"544c5973-02d3-447d-9edb-08e215486937\") " pod="calico-system/whisker-65867ff4b5-ngxpz" May 16 16:42:43.308235 kubelet[2673]: I0516 16:42:43.308139 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/544c5973-02d3-447d-9edb-08e215486937-whisker-backend-key-pair\") pod \"whisker-65867ff4b5-ngxpz\" (UID: \"544c5973-02d3-447d-9edb-08e215486937\") " pod="calico-system/whisker-65867ff4b5-ngxpz" May 16 16:42:43.308235 kubelet[2673]: I0516 16:42:43.308168 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwzh4\" (UniqueName: \"kubernetes.io/projected/e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa-kube-api-access-vwzh4\") pod \"calico-kube-controllers-7685c7f4cf-98w45\" (UID: \"e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa\") " pod="calico-system/calico-kube-controllers-7685c7f4cf-98w45" May 16 16:42:43.308355 kubelet[2673]: I0516 16:42:43.308234 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/afac284b-1d53-4f0b-b60b-8d046a842c57-calico-apiserver-certs\") pod \"calico-apiserver-cd8784d74-zmxbh\" (UID: \"afac284b-1d53-4f0b-b60b-8d046a842c57\") " pod="calico-apiserver/calico-apiserver-cd8784d74-zmxbh" May 16 16:42:43.308355 kubelet[2673]: I0516 16:42:43.308290 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsqdz\" (UniqueName: \"kubernetes.io/projected/afac284b-1d53-4f0b-b60b-8d046a842c57-kube-api-access-jsqdz\") pod \"calico-apiserver-cd8784d74-zmxbh\" (UID: \"afac284b-1d53-4f0b-b60b-8d046a842c57\") " 
pod="calico-apiserver/calico-apiserver-cd8784d74-zmxbh" May 16 16:42:43.308355 kubelet[2673]: I0516 16:42:43.308341 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/3208f219-fe99-4b68-b23c-e5f9b103f8b4-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-fcfrt\" (UID: \"3208f219-fe99-4b68-b23c-e5f9b103f8b4\") " pod="calico-system/goldmane-78d55f7ddc-fcfrt" May 16 16:42:43.308423 kubelet[2673]: I0516 16:42:43.308369 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3208f219-fe99-4b68-b23c-e5f9b103f8b4-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-fcfrt\" (UID: \"3208f219-fe99-4b68-b23c-e5f9b103f8b4\") " pod="calico-system/goldmane-78d55f7ddc-fcfrt" May 16 16:42:43.308423 kubelet[2673]: I0516 16:42:43.308398 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/b11afc1c-9b60-4f5c-a29d-78fd53879323-calico-apiserver-certs\") pod \"calico-apiserver-cd8784d74-2wgvs\" (UID: \"b11afc1c-9b60-4f5c-a29d-78fd53879323\") " pod="calico-apiserver/calico-apiserver-cd8784d74-2wgvs" May 16 16:42:43.308469 kubelet[2673]: I0516 16:42:43.308418 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa-tigera-ca-bundle\") pod \"calico-kube-controllers-7685c7f4cf-98w45\" (UID: \"e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa\") " pod="calico-system/calico-kube-controllers-7685c7f4cf-98w45" May 16 16:42:43.360443 containerd[1577]: time="2025-05-16T16:42:43.360391162Z" level=error msg="Failed to destroy network for sandbox \"196d7704b98468e99b21d4e5de169d4f8e615f3c4c9934ad457ccff0134308ff\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.361883 containerd[1577]: time="2025-05-16T16:42:43.361842293Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-z584z,Uid:6fc31b70-0736-43f1-8329-54c347d76131,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"196d7704b98468e99b21d4e5de169d4f8e615f3c4c9934ad457ccff0134308ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.362535 systemd[1]: run-netns-cni\x2d804e6d61\x2d121c\x2db7b8\x2d3452\x2d702a2cee8543.mount: Deactivated successfully. May 16 16:42:43.371843 kubelet[2673]: E0516 16:42:43.371788 2673 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"196d7704b98468e99b21d4e5de169d4f8e615f3c4c9934ad457ccff0134308ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.372038 kubelet[2673]: E0516 16:42:43.371862 2673 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"196d7704b98468e99b21d4e5de169d4f8e615f3c4c9934ad457ccff0134308ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-z584z" May 16 16:42:43.372038 kubelet[2673]: E0516 16:42:43.371883 2673 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"196d7704b98468e99b21d4e5de169d4f8e615f3c4c9934ad457ccff0134308ff\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-z584z" May 16 16:42:43.372038 kubelet[2673]: E0516 16:42:43.371938 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-z584z_kube-system(6fc31b70-0736-43f1-8329-54c347d76131)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-z584z_kube-system(6fc31b70-0736-43f1-8329-54c347d76131)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"196d7704b98468e99b21d4e5de169d4f8e615f3c4c9934ad457ccff0134308ff\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-z584z" podUID="6fc31b70-0736-43f1-8329-54c347d76131" May 16 16:42:43.478655 containerd[1577]: time="2025-05-16T16:42:43.478531867Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-krmsj,Uid:9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4,Namespace:kube-system,Attempt:0,}" May 16 16:42:43.505391 containerd[1577]: time="2025-05-16T16:42:43.505355788Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd8784d74-2wgvs,Uid:b11afc1c-9b60-4f5c-a29d-78fd53879323,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:43.518492 containerd[1577]: time="2025-05-16T16:42:43.518433937Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd8784d74-zmxbh,Uid:afac284b-1d53-4f0b-b60b-8d046a842c57,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:43.528269 containerd[1577]: time="2025-05-16T16:42:43.527959146Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:goldmane-78d55f7ddc-fcfrt,Uid:3208f219-fe99-4b68-b23c-e5f9b103f8b4,Namespace:calico-system,Attempt:0,}" May 16 16:42:43.531815 containerd[1577]: time="2025-05-16T16:42:43.531755443Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7685c7f4cf-98w45,Uid:e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa,Namespace:calico-system,Attempt:0,}" May 16 16:42:43.539866 containerd[1577]: time="2025-05-16T16:42:43.539818951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65867ff4b5-ngxpz,Uid:544c5973-02d3-447d-9edb-08e215486937,Namespace:calico-system,Attempt:0,}" May 16 16:42:43.576111 containerd[1577]: time="2025-05-16T16:42:43.576049857Z" level=error msg="Failed to destroy network for sandbox \"f443815c46ffbd9028414ba14823ad474f80342ac72b73188f9ff732841a8f60\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.578156 containerd[1577]: time="2025-05-16T16:42:43.578118957Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-krmsj,Uid:9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"f443815c46ffbd9028414ba14823ad474f80342ac72b73188f9ff732841a8f60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.578454 kubelet[2673]: E0516 16:42:43.578407 2673 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f443815c46ffbd9028414ba14823ad474f80342ac72b73188f9ff732841a8f60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is 
running and has mounted /var/lib/calico/" May 16 16:42:43.579069 kubelet[2673]: E0516 16:42:43.579002 2673 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f443815c46ffbd9028414ba14823ad474f80342ac72b73188f9ff732841a8f60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-krmsj" May 16 16:42:43.579069 kubelet[2673]: E0516 16:42:43.579034 2673 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"f443815c46ffbd9028414ba14823ad474f80342ac72b73188f9ff732841a8f60\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-krmsj" May 16 16:42:43.579144 kubelet[2673]: E0516 16:42:43.579082 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-krmsj_kube-system(9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-krmsj_kube-system(9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"f443815c46ffbd9028414ba14823ad474f80342ac72b73188f9ff732841a8f60\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-krmsj" podUID="9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4" May 16 16:42:43.604432 containerd[1577]: time="2025-05-16T16:42:43.604380563Z" level=error msg="Failed to destroy network for sandbox 
\"9040482b90b5fb3d7f40731323ae7ed1d1934ba378ef5919adca2fe7e5e8b12d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.604745 containerd[1577]: time="2025-05-16T16:42:43.604686176Z" level=error msg="Failed to destroy network for sandbox \"9dd02a3d749af63393869c32e3655b1b9995229b0836436a97eb029c7b126ab1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.606558 containerd[1577]: time="2025-05-16T16:42:43.606054491Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd8784d74-2wgvs,Uid:b11afc1c-9b60-4f5c-a29d-78fd53879323,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dd02a3d749af63393869c32e3655b1b9995229b0836436a97eb029c7b126ab1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.606655 kubelet[2673]: E0516 16:42:43.606347 2673 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dd02a3d749af63393869c32e3655b1b9995229b0836436a97eb029c7b126ab1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.606655 kubelet[2673]: E0516 16:42:43.606429 2673 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dd02a3d749af63393869c32e3655b1b9995229b0836436a97eb029c7b126ab1\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd8784d74-2wgvs" May 16 16:42:43.606655 kubelet[2673]: E0516 16:42:43.606451 2673 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9dd02a3d749af63393869c32e3655b1b9995229b0836436a97eb029c7b126ab1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd8784d74-2wgvs" May 16 16:42:43.606756 kubelet[2673]: E0516 16:42:43.606519 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cd8784d74-2wgvs_calico-apiserver(b11afc1c-9b60-4f5c-a29d-78fd53879323)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-cd8784d74-2wgvs_calico-apiserver(b11afc1c-9b60-4f5c-a29d-78fd53879323)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9dd02a3d749af63393869c32e3655b1b9995229b0836436a97eb029c7b126ab1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cd8784d74-2wgvs" podUID="b11afc1c-9b60-4f5c-a29d-78fd53879323" May 16 16:42:43.609513 containerd[1577]: time="2025-05-16T16:42:43.609470967Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd8784d74-zmxbh,Uid:afac284b-1d53-4f0b-b60b-8d046a842c57,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9040482b90b5fb3d7f40731323ae7ed1d1934ba378ef5919adca2fe7e5e8b12d\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.609941 kubelet[2673]: E0516 16:42:43.609917 2673 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9040482b90b5fb3d7f40731323ae7ed1d1934ba378ef5919adca2fe7e5e8b12d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.610155 kubelet[2673]: E0516 16:42:43.610069 2673 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9040482b90b5fb3d7f40731323ae7ed1d1934ba378ef5919adca2fe7e5e8b12d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd8784d74-zmxbh" May 16 16:42:43.610262 kubelet[2673]: E0516 16:42:43.610247 2673 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9040482b90b5fb3d7f40731323ae7ed1d1934ba378ef5919adca2fe7e5e8b12d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-cd8784d74-zmxbh" May 16 16:42:43.610395 kubelet[2673]: E0516 16:42:43.610354 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-cd8784d74-zmxbh_calico-apiserver(afac284b-1d53-4f0b-b60b-8d046a842c57)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-cd8784d74-zmxbh_calico-apiserver(afac284b-1d53-4f0b-b60b-8d046a842c57)\\\": rpc 
error: code = Unknown desc = failed to setup network for sandbox \\\"9040482b90b5fb3d7f40731323ae7ed1d1934ba378ef5919adca2fe7e5e8b12d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-cd8784d74-zmxbh" podUID="afac284b-1d53-4f0b-b60b-8d046a842c57" May 16 16:42:43.613782 containerd[1577]: time="2025-05-16T16:42:43.613726695Z" level=error msg="Failed to destroy network for sandbox \"bcffa68aa2986ff10fdc7b991397e203d3d34515c465e34f5b4ace604d1aaa09\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.615420 containerd[1577]: time="2025-05-16T16:42:43.615284306Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7685c7f4cf-98w45,Uid:e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcffa68aa2986ff10fdc7b991397e203d3d34515c465e34f5b4ace604d1aaa09\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.616159 kubelet[2673]: E0516 16:42:43.615762 2673 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcffa68aa2986ff10fdc7b991397e203d3d34515c465e34f5b4ace604d1aaa09\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.616159 kubelet[2673]: E0516 16:42:43.615819 2673 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code 
= Unknown desc = failed to setup network for sandbox \"bcffa68aa2986ff10fdc7b991397e203d3d34515c465e34f5b4ace604d1aaa09\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7685c7f4cf-98w45" May 16 16:42:43.616159 kubelet[2673]: E0516 16:42:43.615843 2673 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"bcffa68aa2986ff10fdc7b991397e203d3d34515c465e34f5b4ace604d1aaa09\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7685c7f4cf-98w45" May 16 16:42:43.616339 kubelet[2673]: E0516 16:42:43.615890 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7685c7f4cf-98w45_calico-system(e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7685c7f4cf-98w45_calico-system(e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"bcffa68aa2986ff10fdc7b991397e203d3d34515c465e34f5b4ace604d1aaa09\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7685c7f4cf-98w45" podUID="e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa" May 16 16:42:43.619172 containerd[1577]: time="2025-05-16T16:42:43.619119065Z" level=error msg="Failed to destroy network for sandbox \"7b2d6f12c9b82081a6bb9229c3288390064f992140efb113b64042a01b895791\"" error="plugin type=\"calico\" failed (delete): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.620442 containerd[1577]: time="2025-05-16T16:42:43.620389498Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-fcfrt,Uid:3208f219-fe99-4b68-b23c-e5f9b103f8b4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b2d6f12c9b82081a6bb9229c3288390064f992140efb113b64042a01b895791\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.620704 kubelet[2673]: E0516 16:42:43.620666 2673 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b2d6f12c9b82081a6bb9229c3288390064f992140efb113b64042a01b895791\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.620756 kubelet[2673]: E0516 16:42:43.620724 2673 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b2d6f12c9b82081a6bb9229c3288390064f992140efb113b64042a01b895791\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-fcfrt" May 16 16:42:43.620756 kubelet[2673]: E0516 16:42:43.620745 2673 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7b2d6f12c9b82081a6bb9229c3288390064f992140efb113b64042a01b895791\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such 
file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-fcfrt" May 16 16:42:43.621036 kubelet[2673]: E0516 16:42:43.620807 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-fcfrt_calico-system(3208f219-fe99-4b68-b23c-e5f9b103f8b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-fcfrt_calico-system(3208f219-fe99-4b68-b23c-e5f9b103f8b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7b2d6f12c9b82081a6bb9229c3288390064f992140efb113b64042a01b895791\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-fcfrt" podUID="3208f219-fe99-4b68-b23c-e5f9b103f8b4" May 16 16:42:43.622876 containerd[1577]: time="2025-05-16T16:42:43.622840974Z" level=error msg="Failed to destroy network for sandbox \"b7a6fddd5044d0258af9f858c103462d554c63d0d4c262d1e3df084a32a97c34\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.624200 containerd[1577]: time="2025-05-16T16:42:43.624149666Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-65867ff4b5-ngxpz,Uid:544c5973-02d3-447d-9edb-08e215486937,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7a6fddd5044d0258af9f858c103462d554c63d0d4c262d1e3df084a32a97c34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.624370 kubelet[2673]: E0516 16:42:43.624340 2673 log.go:32] 
"RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7a6fddd5044d0258af9f858c103462d554c63d0d4c262d1e3df084a32a97c34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:43.624436 kubelet[2673]: E0516 16:42:43.624385 2673 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7a6fddd5044d0258af9f858c103462d554c63d0d4c262d1e3df084a32a97c34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65867ff4b5-ngxpz" May 16 16:42:43.624436 kubelet[2673]: E0516 16:42:43.624402 2673 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b7a6fddd5044d0258af9f858c103462d554c63d0d4c262d1e3df084a32a97c34\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-65867ff4b5-ngxpz" May 16 16:42:43.624486 kubelet[2673]: E0516 16:42:43.624445 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-65867ff4b5-ngxpz_calico-system(544c5973-02d3-447d-9edb-08e215486937)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-65867ff4b5-ngxpz_calico-system(544c5973-02d3-447d-9edb-08e215486937)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b7a6fddd5044d0258af9f858c103462d554c63d0d4c262d1e3df084a32a97c34\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-65867ff4b5-ngxpz" podUID="544c5973-02d3-447d-9edb-08e215486937" May 16 16:42:44.005669 systemd[1]: Created slice kubepods-besteffort-pod7f67342a_4c47_4a10_9f69_002db6933f22.slice - libcontainer container kubepods-besteffort-pod7f67342a_4c47_4a10_9f69_002db6933f22.slice. May 16 16:42:44.007793 containerd[1577]: time="2025-05-16T16:42:44.007754795Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bb6pt,Uid:7f67342a-4c47-4a10-9f69-002db6933f22,Namespace:calico-system,Attempt:0,}" May 16 16:42:44.057610 containerd[1577]: time="2025-05-16T16:42:44.057543925Z" level=error msg="Failed to destroy network for sandbox \"b8a0082cd712c820f560aa61998208e50066bf403d7e9c5d3756d8fe20edbb8b\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:44.058953 containerd[1577]: time="2025-05-16T16:42:44.058904646Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bb6pt,Uid:7f67342a-4c47-4a10-9f69-002db6933f22,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"b8a0082cd712c820f560aa61998208e50066bf403d7e9c5d3756d8fe20edbb8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 16 16:42:44.059232 kubelet[2673]: E0516 16:42:44.059132 2673 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b8a0082cd712c820f560aa61998208e50066bf403d7e9c5d3756d8fe20edbb8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 16 16:42:44.059695 kubelet[2673]: E0516 16:42:44.059553 2673 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b8a0082cd712c820f560aa61998208e50066bf403d7e9c5d3756d8fe20edbb8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bb6pt" May 16 16:42:44.059695 kubelet[2673]: E0516 16:42:44.059599 2673 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"b8a0082cd712c820f560aa61998208e50066bf403d7e9c5d3756d8fe20edbb8b\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-bb6pt" May 16 16:42:44.059695 kubelet[2673]: E0516 16:42:44.059655 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-bb6pt_calico-system(7f67342a-4c47-4a10-9f69-002db6933f22)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-bb6pt_calico-system(7f67342a-4c47-4a10-9f69-002db6933f22)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"b8a0082cd712c820f560aa61998208e50066bf403d7e9c5d3756d8fe20edbb8b\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-bb6pt" podUID="7f67342a-4c47-4a10-9f69-002db6933f22" May 16 16:42:44.059804 systemd[1]: run-netns-cni\x2d476db01e\x2da158\x2defb6\x2d70d9\x2df6e7494cc9b5.mount: Deactivated successfully. 
May 16 16:42:44.091589 containerd[1577]: time="2025-05-16T16:42:44.091463370Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 16 16:42:50.240665 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4129170006.mount: Deactivated successfully. May 16 16:42:51.542023 containerd[1577]: time="2025-05-16T16:42:51.541959830Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:51.571401 containerd[1577]: time="2025-05-16T16:42:51.571338262Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 16 16:42:51.585860 containerd[1577]: time="2025-05-16T16:42:51.585822874Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:51.614662 containerd[1577]: time="2025-05-16T16:42:51.614631638Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 7.523122512s" May 16 16:42:51.614721 containerd[1577]: time="2025-05-16T16:42:51.614663698Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 16 16:42:51.634323 containerd[1577]: time="2025-05-16T16:42:51.634260116Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:51.730539 containerd[1577]: time="2025-05-16T16:42:51.730496705Z" level=info 
msg="CreateContainer within sandbox \"3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 16 16:42:51.838339 containerd[1577]: time="2025-05-16T16:42:51.838236092Z" level=info msg="Container 4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:51.849369 containerd[1577]: time="2025-05-16T16:42:51.849322562Z" level=info msg="CreateContainer within sandbox \"3e53e983d6aa87f8c17c0d6df10e8d5d62690d5c887bd2a1459f95cffda5dfbf\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8\"" May 16 16:42:51.850198 containerd[1577]: time="2025-05-16T16:42:51.850077408Z" level=info msg="StartContainer for \"4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8\"" May 16 16:42:51.851671 containerd[1577]: time="2025-05-16T16:42:51.851639377Z" level=info msg="connecting to shim 4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8" address="unix:///run/containerd/s/b3ad7dc1c3820a19d2cd3c1e1748a709560c3152b9cf47a7577f817424b8235c" protocol=ttrpc version=3 May 16 16:42:51.877784 systemd[1]: Started cri-containerd-4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8.scope - libcontainer container 4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8. May 16 16:42:51.929300 containerd[1577]: time="2025-05-16T16:42:51.929251229Z" level=info msg="StartContainer for \"4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8\" returns successfully" May 16 16:42:52.001839 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 16 16:42:52.002948 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
May 16 16:42:52.136890 kubelet[2673]: I0516 16:42:52.136189 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-h98bp" podStartSLOduration=1.612994383 podStartE2EDuration="19.136155381s" podCreationTimestamp="2025-05-16 16:42:33 +0000 UTC" firstStartedPulling="2025-05-16 16:42:34.092166174 +0000 UTC m=+18.196051446" lastFinishedPulling="2025-05-16 16:42:51.615327162 +0000 UTC m=+35.719212444" observedRunningTime="2025-05-16 16:42:52.135745973 +0000 UTC m=+36.239631255" watchObservedRunningTime="2025-05-16 16:42:52.136155381 +0000 UTC m=+36.240040663" May 16 16:42:52.168177 kubelet[2673]: I0516 16:42:52.168120 2673 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7llr\" (UniqueName: \"kubernetes.io/projected/544c5973-02d3-447d-9edb-08e215486937-kube-api-access-z7llr\") pod \"544c5973-02d3-447d-9edb-08e215486937\" (UID: \"544c5973-02d3-447d-9edb-08e215486937\") " May 16 16:42:52.168177 kubelet[2673]: I0516 16:42:52.168175 2673 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544c5973-02d3-447d-9edb-08e215486937-whisker-ca-bundle\") pod \"544c5973-02d3-447d-9edb-08e215486937\" (UID: \"544c5973-02d3-447d-9edb-08e215486937\") " May 16 16:42:52.168345 kubelet[2673]: I0516 16:42:52.168200 2673 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/544c5973-02d3-447d-9edb-08e215486937-whisker-backend-key-pair\") pod \"544c5973-02d3-447d-9edb-08e215486937\" (UID: \"544c5973-02d3-447d-9edb-08e215486937\") " May 16 16:42:52.170322 kubelet[2673]: I0516 16:42:52.170257 2673 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/544c5973-02d3-447d-9edb-08e215486937-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod 
"544c5973-02d3-447d-9edb-08e215486937" (UID: "544c5973-02d3-447d-9edb-08e215486937"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 16 16:42:52.174730 kubelet[2673]: I0516 16:42:52.174682 2673 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/544c5973-02d3-447d-9edb-08e215486937-kube-api-access-z7llr" (OuterVolumeSpecName: "kube-api-access-z7llr") pod "544c5973-02d3-447d-9edb-08e215486937" (UID: "544c5973-02d3-447d-9edb-08e215486937"). InnerVolumeSpecName "kube-api-access-z7llr". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 16 16:42:52.176973 kubelet[2673]: I0516 16:42:52.176821 2673 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/544c5973-02d3-447d-9edb-08e215486937-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "544c5973-02d3-447d-9edb-08e215486937" (UID: "544c5973-02d3-447d-9edb-08e215486937"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 16 16:42:52.246772 containerd[1577]: time="2025-05-16T16:42:52.246714692Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8\" id:\"866fe8f863cc4d7666459c5bf3f7e1ccf9b3dcade79b150e44734dc6331ce66f\" pid:3843 exit_status:1 exited_at:{seconds:1747413772 nanos:246240883}" May 16 16:42:52.268995 kubelet[2673]: I0516 16:42:52.268930 2673 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z7llr\" (UniqueName: \"kubernetes.io/projected/544c5973-02d3-447d-9edb-08e215486937-kube-api-access-z7llr\") on node \"localhost\" DevicePath \"\"" May 16 16:42:52.268995 kubelet[2673]: I0516 16:42:52.268966 2673 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/544c5973-02d3-447d-9edb-08e215486937-whisker-ca-bundle\") on node \"localhost\" DevicePath \"\"" May 16 16:42:52.268995 kubelet[2673]: I0516 16:42:52.268977 2673 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/544c5973-02d3-447d-9edb-08e215486937-whisker-backend-key-pair\") on node \"localhost\" DevicePath \"\"" May 16 16:42:52.620730 systemd[1]: var-lib-kubelet-pods-544c5973\x2d02d3\x2d447d\x2d9edb\x2d08e215486937-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dz7llr.mount: Deactivated successfully. May 16 16:42:52.620873 systemd[1]: var-lib-kubelet-pods-544c5973\x2d02d3\x2d447d\x2d9edb\x2d08e215486937-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. May 16 16:42:53.123675 systemd[1]: Removed slice kubepods-besteffort-pod544c5973_02d3_447d_9edb_08e215486937.slice - libcontainer container kubepods-besteffort-pod544c5973_02d3_447d_9edb_08e215486937.slice. 
May 16 16:42:53.192667 containerd[1577]: time="2025-05-16T16:42:53.192614485Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8\" id:\"952df754aae722d2caa03e6f418e88ca6b8056a325d4a7783c3b4c909c842ee2\" pid:3880 exit_status:1 exited_at:{seconds:1747413773 nanos:192265050}" May 16 16:42:53.267260 systemd[1]: Created slice kubepods-besteffort-pod85410909_fb8d_4393_9be5_a24a0ba7b5ef.slice - libcontainer container kubepods-besteffort-pod85410909_fb8d_4393_9be5_a24a0ba7b5ef.slice. May 16 16:42:53.375392 kubelet[2673]: I0516 16:42:53.374895 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/85410909-fb8d-4393-9be5-a24a0ba7b5ef-whisker-ca-bundle\") pod \"whisker-5967895df7-wgjpx\" (UID: \"85410909-fb8d-4393-9be5-a24a0ba7b5ef\") " pod="calico-system/whisker-5967895df7-wgjpx" May 16 16:42:53.375392 kubelet[2673]: I0516 16:42:53.374941 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lf7w\" (UniqueName: \"kubernetes.io/projected/85410909-fb8d-4393-9be5-a24a0ba7b5ef-kube-api-access-6lf7w\") pod \"whisker-5967895df7-wgjpx\" (UID: \"85410909-fb8d-4393-9be5-a24a0ba7b5ef\") " pod="calico-system/whisker-5967895df7-wgjpx" May 16 16:42:53.375392 kubelet[2673]: I0516 16:42:53.374964 2673 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/85410909-fb8d-4393-9be5-a24a0ba7b5ef-whisker-backend-key-pair\") pod \"whisker-5967895df7-wgjpx\" (UID: \"85410909-fb8d-4393-9be5-a24a0ba7b5ef\") " pod="calico-system/whisker-5967895df7-wgjpx" May 16 16:42:53.571419 containerd[1577]: time="2025-05-16T16:42:53.571371709Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-5967895df7-wgjpx,Uid:85410909-fb8d-4393-9be5-a24a0ba7b5ef,Namespace:calico-system,Attempt:0,}" May 16 16:42:53.889495 systemd-networkd[1488]: cali9d207ff8561: Link UP May 16 16:42:53.890024 systemd-networkd[1488]: cali9d207ff8561: Gained carrier May 16 16:42:53.917773 containerd[1577]: 2025-05-16 16:42:53.767 [INFO][3996] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 16 16:42:53.917773 containerd[1577]: 2025-05-16 16:42:53.783 [INFO][3996] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-whisker--5967895df7--wgjpx-eth0 whisker-5967895df7- calico-system 85410909-fb8d-4393-9be5-a24a0ba7b5ef 911 0 2025-05-16 16:42:53 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5967895df7 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s localhost whisker-5967895df7-wgjpx eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali9d207ff8561 [] [] }} ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Namespace="calico-system" Pod="whisker-5967895df7-wgjpx" WorkloadEndpoint="localhost-k8s-whisker--5967895df7--wgjpx-" May 16 16:42:53.917773 containerd[1577]: 2025-05-16 16:42:53.783 [INFO][3996] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Namespace="calico-system" Pod="whisker-5967895df7-wgjpx" WorkloadEndpoint="localhost-k8s-whisker--5967895df7--wgjpx-eth0" May 16 16:42:53.917773 containerd[1577]: 2025-05-16 16:42:53.842 [INFO][4011] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" HandleID="k8s-pod-network.41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" 
Workload="localhost-k8s-whisker--5967895df7--wgjpx-eth0" May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.843 [INFO][4011] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" HandleID="k8s-pod-network.41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Workload="localhost-k8s-whisker--5967895df7--wgjpx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000123db0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"whisker-5967895df7-wgjpx", "timestamp":"2025-05-16 16:42:53.842729703 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.843 [INFO][4011] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.843 [INFO][4011] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.843 [INFO][4011] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.854 [INFO][4011] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" host="localhost" May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.860 [INFO][4011] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.866 [INFO][4011] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.868 [INFO][4011] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.870 [INFO][4011] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:53.918019 containerd[1577]: 2025-05-16 16:42:53.870 [INFO][4011] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" host="localhost" May 16 16:42:53.918244 containerd[1577]: 2025-05-16 16:42:53.871 [INFO][4011] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69 May 16 16:42:53.918244 containerd[1577]: 2025-05-16 16:42:53.874 [INFO][4011] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" host="localhost" May 16 16:42:53.918244 containerd[1577]: 2025-05-16 16:42:53.879 [INFO][4011] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.129/26] block=192.168.88.128/26 
handle="k8s-pod-network.41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" host="localhost" May 16 16:42:53.918244 containerd[1577]: 2025-05-16 16:42:53.879 [INFO][4011] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.129/26] handle="k8s-pod-network.41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" host="localhost" May 16 16:42:53.918244 containerd[1577]: 2025-05-16 16:42:53.879 [INFO][4011] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:53.918244 containerd[1577]: 2025-05-16 16:42:53.879 [INFO][4011] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.129/26] IPv6=[] ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" HandleID="k8s-pod-network.41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Workload="localhost-k8s-whisker--5967895df7--wgjpx-eth0" May 16 16:42:53.918366 containerd[1577]: 2025-05-16 16:42:53.882 [INFO][3996] cni-plugin/k8s.go 418: Populated endpoint ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Namespace="calico-system" Pod="whisker-5967895df7-wgjpx" WorkloadEndpoint="localhost-k8s-whisker--5967895df7--wgjpx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5967895df7--wgjpx-eth0", GenerateName:"whisker-5967895df7-", Namespace:"calico-system", SelfLink:"", UID:"85410909-fb8d-4393-9be5-a24a0ba7b5ef", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5967895df7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), 
OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"whisker-5967895df7-wgjpx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9d207ff8561", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:53.918366 containerd[1577]: 2025-05-16 16:42:53.882 [INFO][3996] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.129/32] ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Namespace="calico-system" Pod="whisker-5967895df7-wgjpx" WorkloadEndpoint="localhost-k8s-whisker--5967895df7--wgjpx-eth0" May 16 16:42:53.918437 containerd[1577]: 2025-05-16 16:42:53.882 [INFO][3996] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9d207ff8561 ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Namespace="calico-system" Pod="whisker-5967895df7-wgjpx" WorkloadEndpoint="localhost-k8s-whisker--5967895df7--wgjpx-eth0" May 16 16:42:53.918437 containerd[1577]: 2025-05-16 16:42:53.889 [INFO][3996] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Namespace="calico-system" Pod="whisker-5967895df7-wgjpx" WorkloadEndpoint="localhost-k8s-whisker--5967895df7--wgjpx-eth0" May 16 16:42:53.918479 containerd[1577]: 2025-05-16 16:42:53.890 [INFO][3996] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Namespace="calico-system" Pod="whisker-5967895df7-wgjpx" 
WorkloadEndpoint="localhost-k8s-whisker--5967895df7--wgjpx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-whisker--5967895df7--wgjpx-eth0", GenerateName:"whisker-5967895df7-", Namespace:"calico-system", SelfLink:"", UID:"85410909-fb8d-4393-9be5-a24a0ba7b5ef", ResourceVersion:"911", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5967895df7", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69", Pod:"whisker-5967895df7-wgjpx", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.88.129/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali9d207ff8561", MAC:"26:7c:9c:b7:68:d9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:53.918525 containerd[1577]: 2025-05-16 16:42:53.908 [INFO][3996] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" Namespace="calico-system" Pod="whisker-5967895df7-wgjpx" WorkloadEndpoint="localhost-k8s-whisker--5967895df7--wgjpx-eth0" May 16 16:42:54.000778 containerd[1577]: time="2025-05-16T16:42:54.000736348Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-674b8bbfcf-z584z,Uid:6fc31b70-0736-43f1-8329-54c347d76131,Namespace:kube-system,Attempt:0,}" May 16 16:42:54.007333 kubelet[2673]: I0516 16:42:54.007125 2673 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="544c5973-02d3-447d-9edb-08e215486937" path="/var/lib/kubelet/pods/544c5973-02d3-447d-9edb-08e215486937/volumes" May 16 16:42:54.077043 containerd[1577]: time="2025-05-16T16:42:54.076961276Z" level=info msg="connecting to shim 41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69" address="unix:///run/containerd/s/e3a62f0e241f9169743dd81fe911a19e5aab3182b69608e2c651bb8c8097f4a3" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:54.107837 systemd[1]: Started cri-containerd-41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69.scope - libcontainer container 41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69. May 16 16:42:54.125739 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:54.126040 systemd-networkd[1488]: cali9c3cb71b896: Link UP May 16 16:42:54.126752 systemd-networkd[1488]: cali9c3cb71b896: Gained carrier May 16 16:42:54.144013 containerd[1577]: 2025-05-16 16:42:54.030 [INFO][4027] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 16 16:42:54.144013 containerd[1577]: 2025-05-16 16:42:54.045 [INFO][4027] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--z584z-eth0 coredns-674b8bbfcf- kube-system 6fc31b70-0736-43f1-8329-54c347d76131 829 0 2025-05-16 16:42:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-z584z eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali9c3cb71b896 [{dns UDP 53 
0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Namespace="kube-system" Pod="coredns-674b8bbfcf-z584z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--z584z-" May 16 16:42:54.144013 containerd[1577]: 2025-05-16 16:42:54.045 [INFO][4027] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Namespace="kube-system" Pod="coredns-674b8bbfcf-z584z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" May 16 16:42:54.144013 containerd[1577]: 2025-05-16 16:42:54.074 [INFO][4046] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" HandleID="k8s-pod-network.f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Workload="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.074 [INFO][4046] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" HandleID="k8s-pod-network.f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Workload="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0001a5450), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-z584z", "timestamp":"2025-05-16 16:42:54.074141278 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.074 [INFO][4046] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. 
May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.074 [INFO][4046] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.074 [INFO][4046] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.083 [INFO][4046] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" host="localhost" May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.094 [INFO][4046] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.098 [INFO][4046] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.099 [INFO][4046] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.101 [INFO][4046] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:54.144352 containerd[1577]: 2025-05-16 16:42:54.101 [INFO][4046] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" host="localhost" May 16 16:42:54.144701 containerd[1577]: 2025-05-16 16:42:54.102 [INFO][4046] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7 May 16 16:42:54.144701 containerd[1577]: 2025-05-16 16:42:54.109 [INFO][4046] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" host="localhost" May 16 16:42:54.144701 containerd[1577]: 2025-05-16 16:42:54.118 [INFO][4046] 
ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.130/26] block=192.168.88.128/26 handle="k8s-pod-network.f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" host="localhost" May 16 16:42:54.144701 containerd[1577]: 2025-05-16 16:42:54.118 [INFO][4046] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.130/26] handle="k8s-pod-network.f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" host="localhost" May 16 16:42:54.144701 containerd[1577]: 2025-05-16 16:42:54.118 [INFO][4046] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:54.144701 containerd[1577]: 2025-05-16 16:42:54.118 [INFO][4046] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.130/26] IPv6=[] ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" HandleID="k8s-pod-network.f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Workload="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" May 16 16:42:54.144825 containerd[1577]: 2025-05-16 16:42:54.122 [INFO][4027] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Namespace="kube-system" Pod="coredns-674b8bbfcf-z584z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--z584z-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6fc31b70-0736-43f1-8329-54c347d76131", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, 
Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-z584z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9c3cb71b896", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:54.144919 containerd[1577]: 2025-05-16 16:42:54.122 [INFO][4027] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.130/32] ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Namespace="kube-system" Pod="coredns-674b8bbfcf-z584z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" May 16 16:42:54.144919 containerd[1577]: 2025-05-16 16:42:54.122 [INFO][4027] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali9c3cb71b896 ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Namespace="kube-system" Pod="coredns-674b8bbfcf-z584z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" May 16 16:42:54.144919 containerd[1577]: 2025-05-16 16:42:54.128 [INFO][4027] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" 
Namespace="kube-system" Pod="coredns-674b8bbfcf-z584z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" May 16 16:42:54.145005 containerd[1577]: 2025-05-16 16:42:54.129 [INFO][4027] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Namespace="kube-system" Pod="coredns-674b8bbfcf-z584z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--z584z-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"6fc31b70-0736-43f1-8329-54c347d76131", ResourceVersion:"829", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7", Pod:"coredns-674b8bbfcf-z584z", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.130/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali9c3cb71b896", MAC:"76:92:05:d4:33:a3", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", 
Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:54.145005 containerd[1577]: 2025-05-16 16:42:54.139 [INFO][4027] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" Namespace="kube-system" Pod="coredns-674b8bbfcf-z584z" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--z584z-eth0" May 16 16:42:54.185763 containerd[1577]: time="2025-05-16T16:42:54.185718415Z" level=info msg="connecting to shim f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7" address="unix:///run/containerd/s/1575768032681fb6e0634f0163d1ab1e83aca63b1509b90d32c183252fbb66a6" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:54.190920 containerd[1577]: time="2025-05-16T16:42:54.190826444Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5967895df7-wgjpx,Uid:85410909-fb8d-4393-9be5-a24a0ba7b5ef,Namespace:calico-system,Attempt:0,} returns sandbox id \"41b006b699a023b92fe35baf242e55446178eca5e5a92503c96f788de6be9d69\"" May 16 16:42:54.193449 containerd[1577]: time="2025-05-16T16:42:54.193245280Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 16:42:54.221879 systemd[1]: Started cri-containerd-f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7.scope - libcontainer container f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7. 
May 16 16:42:54.243098 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:54.367892 containerd[1577]: time="2025-05-16T16:42:54.367825963Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-z584z,Uid:6fc31b70-0736-43f1-8329-54c347d76131,Namespace:kube-system,Attempt:0,} returns sandbox id \"f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7\"" May 16 16:42:54.422135 containerd[1577]: time="2025-05-16T16:42:54.422015136Z" level=info msg="CreateContainer within sandbox \"f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 16 16:42:54.460497 containerd[1577]: time="2025-05-16T16:42:54.460360457Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:42:54.491978 containerd[1577]: time="2025-05-16T16:42:54.491912149Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 16:42:54.506909 containerd[1577]: time="2025-05-16T16:42:54.506677078Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:42:54.522071 kubelet[2673]: E0516 16:42:54.521905 2673 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:42:54.522071 kubelet[2673]: E0516 16:42:54.522036 2673 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:42:54.527253 kubelet[2673]: E0516 16:42:54.527172 2673 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:aec8e0cd5a7941d98f9fb853e513af3b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6lf7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinux
Options:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5967895df7-wgjpx_calico-system(85410909-fb8d-4393-9be5-a24a0ba7b5ef): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:42:54.530686 containerd[1577]: time="2025-05-16T16:42:54.530625991Z" level=info msg="Container acb870b3716321b9f659de6becc21ac48313547d3f4d5c26e8e87b95eb9785af: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:54.530993 containerd[1577]: time="2025-05-16T16:42:54.530910023Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 16:42:54.561715 containerd[1577]: time="2025-05-16T16:42:54.561673847Z" level=info msg="CreateContainer within sandbox \"f40a926a9bf1b7d5bcf2b4ae24bf83aba6685a63a7949ef29134cfd2922e23d7\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"acb870b3716321b9f659de6becc21ac48313547d3f4d5c26e8e87b95eb9785af\"" May 16 16:42:54.563226 containerd[1577]: time="2025-05-16T16:42:54.562801391Z" level=info msg="StartContainer for \"acb870b3716321b9f659de6becc21ac48313547d3f4d5c26e8e87b95eb9785af\"" May 16 16:42:54.564259 containerd[1577]: time="2025-05-16T16:42:54.563967127Z" level=info msg="connecting to shim 
acb870b3716321b9f659de6becc21ac48313547d3f4d5c26e8e87b95eb9785af" address="unix:///run/containerd/s/1575768032681fb6e0634f0163d1ab1e83aca63b1509b90d32c183252fbb66a6" protocol=ttrpc version=3 May 16 16:42:54.587702 systemd[1]: Started cri-containerd-acb870b3716321b9f659de6becc21ac48313547d3f4d5c26e8e87b95eb9785af.scope - libcontainer container acb870b3716321b9f659de6becc21ac48313547d3f4d5c26e8e87b95eb9785af. May 16 16:42:54.620724 containerd[1577]: time="2025-05-16T16:42:54.620685174Z" level=info msg="StartContainer for \"acb870b3716321b9f659de6becc21ac48313547d3f4d5c26e8e87b95eb9785af\" returns successfully" May 16 16:42:54.820496 containerd[1577]: time="2025-05-16T16:42:54.820372409Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:42:54.821504 containerd[1577]: time="2025-05-16T16:42:54.821469517Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:42:54.821616 containerd[1577]: time="2025-05-16T16:42:54.821546632Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 16:42:54.821780 kubelet[2673]: E0516 16:42:54.821739 2673 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:42:54.821842 kubelet[2673]: E0516 16:42:54.821789 2673 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:42:54.821967 kubelet[2673]: E0516 16:42:54.821916 2673 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lf7w,ReadOnly:true,MountPath:/var/run/sec
rets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5967895df7-wgjpx_calico-system(85410909-fb8d-4393-9be5-a24a0ba7b5ef): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:42:54.823144 kubelet[2673]: E0516 16:42:54.823108 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5967895df7-wgjpx" podUID="85410909-fb8d-4393-9be5-a24a0ba7b5ef" May 16 16:42:55.001279 containerd[1577]: time="2025-05-16T16:42:55.001235392Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bb6pt,Uid:7f67342a-4c47-4a10-9f69-002db6933f22,Namespace:calico-system,Attempt:0,}" May 16 16:42:55.001445 containerd[1577]: time="2025-05-16T16:42:55.001424257Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd8784d74-2wgvs,Uid:b11afc1c-9b60-4f5c-a29d-78fd53879323,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:55.053248 systemd[1]: Started sshd@7-10.0.0.80:22-10.0.0.1:57788.service - OpenSSH per-connection server daemon (10.0.0.1:57788). May 16 16:42:55.113800 systemd-networkd[1488]: cali476542ade6a: Link UP May 16 16:42:55.116235 sshd[4250]: Accepted publickey for core from 10.0.0.1 port 57788 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:42:55.113985 systemd-networkd[1488]: cali476542ade6a: Gained carrier May 16 16:42:55.116454 sshd-session[4250]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:42:55.123788 systemd-logind[1562]: New session 8 of user core. 
May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.026 [INFO][4209] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.039 [INFO][4209] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-csi--node--driver--bb6pt-eth0 csi-node-driver- calico-system 7f67342a-4c47-4a10-9f69-002db6933f22 727 0 2025-05-16 16:42:33 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s localhost csi-node-driver-bb6pt eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali476542ade6a [] [] }} ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Namespace="calico-system" Pod="csi-node-driver-bb6pt" WorkloadEndpoint="localhost-k8s-csi--node--driver--bb6pt-" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.039 [INFO][4209] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Namespace="calico-system" Pod="csi-node-driver-bb6pt" WorkloadEndpoint="localhost-k8s-csi--node--driver--bb6pt-eth0" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.069 [INFO][4238] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" HandleID="k8s-pod-network.7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Workload="localhost-k8s-csi--node--driver--bb6pt-eth0" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.069 [INFO][4238] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" HandleID="k8s-pod-network.7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Workload="localhost-k8s-csi--node--driver--bb6pt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002c5c10), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"csi-node-driver-bb6pt", "timestamp":"2025-05-16 16:42:55.069500446 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.069 [INFO][4238] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.069 [INFO][4238] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.069 [INFO][4238] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.076 [INFO][4238] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" host="localhost" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.084 [INFO][4238] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.088 [INFO][4238] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.090 [INFO][4238] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.092 [INFO][4238] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 
host="localhost" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.092 [INFO][4238] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" host="localhost" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.093 [INFO][4238] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447 May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.097 [INFO][4238] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" host="localhost" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.103 [INFO][4238] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.131/26] block=192.168.88.128/26 handle="k8s-pod-network.7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" host="localhost" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.103 [INFO][4238] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.131/26] handle="k8s-pod-network.7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" host="localhost" May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.103 [INFO][4238] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. 
May 16 16:42:55.127961 containerd[1577]: 2025-05-16 16:42:55.103 [INFO][4238] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.131/26] IPv6=[] ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" HandleID="k8s-pod-network.7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Workload="localhost-k8s-csi--node--driver--bb6pt-eth0" May 16 16:42:55.128610 containerd[1577]: 2025-05-16 16:42:55.109 [INFO][4209] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Namespace="calico-system" Pod="csi-node-driver-bb6pt" WorkloadEndpoint="localhost-k8s-csi--node--driver--bb6pt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--bb6pt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7f67342a-4c47-4a10-9f69-002db6933f22", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 33, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"csi-node-driver-bb6pt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.csi-node-driver"}, InterfaceName:"cali476542ade6a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:55.128610 containerd[1577]: 2025-05-16 16:42:55.109 [INFO][4209] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.131/32] ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Namespace="calico-system" Pod="csi-node-driver-bb6pt" WorkloadEndpoint="localhost-k8s-csi--node--driver--bb6pt-eth0" May 16 16:42:55.128610 containerd[1577]: 2025-05-16 16:42:55.110 [INFO][4209] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali476542ade6a ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Namespace="calico-system" Pod="csi-node-driver-bb6pt" WorkloadEndpoint="localhost-k8s-csi--node--driver--bb6pt-eth0" May 16 16:42:55.128610 containerd[1577]: 2025-05-16 16:42:55.112 [INFO][4209] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Namespace="calico-system" Pod="csi-node-driver-bb6pt" WorkloadEndpoint="localhost-k8s-csi--node--driver--bb6pt-eth0" May 16 16:42:55.128610 containerd[1577]: 2025-05-16 16:42:55.113 [INFO][4209] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Namespace="calico-system" Pod="csi-node-driver-bb6pt" WorkloadEndpoint="localhost-k8s-csi--node--driver--bb6pt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-csi--node--driver--bb6pt-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"7f67342a-4c47-4a10-9f69-002db6933f22", ResourceVersion:"727", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 33, 0, 
time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447", Pod:"csi-node-driver-bb6pt", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.88.131/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali476542ade6a", MAC:"ee:26:09:59:5e:a4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:55.128610 containerd[1577]: 2025-05-16 16:42:55.125 [INFO][4209] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" Namespace="calico-system" Pod="csi-node-driver-bb6pt" WorkloadEndpoint="localhost-k8s-csi--node--driver--bb6pt-eth0" May 16 16:42:55.130763 systemd[1]: Started session-8.scope - Session 8 of User core. 
May 16 16:42:55.132388 kubelet[2673]: E0516 16:42:55.132342 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5967895df7-wgjpx" podUID="85410909-fb8d-4393-9be5-a24a0ba7b5ef" May 16 16:42:55.152342 kubelet[2673]: I0516 16:42:55.152226 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-z584z" podStartSLOduration=32.152203797 podStartE2EDuration="32.152203797s" podCreationTimestamp="2025-05-16 16:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:55.140063179 +0000 UTC m=+39.243948461" watchObservedRunningTime="2025-05-16 16:42:55.152203797 +0000 UTC m=+39.256089079" May 16 16:42:55.163650 containerd[1577]: time="2025-05-16T16:42:55.163473893Z" level=info msg="connecting to shim 
7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447" address="unix:///run/containerd/s/c87c4f9549dadbb8115ce5647f564cf6720c4ee6555e185c5aff4bfc932a3a7b" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:55.201266 systemd[1]: Started cri-containerd-7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447.scope - libcontainer container 7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447. May 16 16:42:55.220548 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:55.229980 systemd-networkd[1488]: cali8d6e48afbc8: Link UP May 16 16:42:55.230156 systemd-networkd[1488]: cali8d6e48afbc8: Gained carrier May 16 16:42:55.243166 containerd[1577]: time="2025-05-16T16:42:55.243126343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-bb6pt,Uid:7f67342a-4c47-4a10-9f69-002db6933f22,Namespace:calico-system,Attempt:0,} returns sandbox id \"7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447\"" May 16 16:42:55.246631 containerd[1577]: time="2025-05-16T16:42:55.246603183Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\"" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.035 [INFO][4213] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.045 [INFO][4213] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0 calico-apiserver-cd8784d74- calico-apiserver b11afc1c-9b60-4f5c-a29d-78fd53879323 836 0 2025-05-16 16:42:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:cd8784d74 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-cd8784d74-2wgvs 
eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali8d6e48afbc8 [] [] }} ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-2wgvs" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.045 [INFO][4213] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-2wgvs" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.076 [INFO][4245] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" HandleID="k8s-pod-network.9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Workload="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.076 [INFO][4245] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" HandleID="k8s-pod-network.9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Workload="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000139590), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-cd8784d74-2wgvs", "timestamp":"2025-05-16 16:42:55.076115982 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.076 [INFO][4245] 
ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.103 [INFO][4245] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.103 [INFO][4245] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.181 [INFO][4245] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" host="localhost" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.194 [INFO][4245] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.199 [INFO][4245] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.202 [INFO][4245] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.205 [INFO][4245] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.205 [INFO][4245] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" host="localhost" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.208 [INFO][4245] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.213 [INFO][4245] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" host="localhost" May 16 16:42:55.247309 
containerd[1577]: 2025-05-16 16:42:55.221 [INFO][4245] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.132/26] block=192.168.88.128/26 handle="k8s-pod-network.9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" host="localhost" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.221 [INFO][4245] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.132/26] handle="k8s-pod-network.9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" host="localhost" May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.221 [INFO][4245] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:55.247309 containerd[1577]: 2025-05-16 16:42:55.221 [INFO][4245] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.132/26] IPv6=[] ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" HandleID="k8s-pod-network.9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Workload="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" May 16 16:42:55.247867 containerd[1577]: 2025-05-16 16:42:55.225 [INFO][4213] cni-plugin/k8s.go 418: Populated endpoint ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-2wgvs" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0", GenerateName:"calico-apiserver-cd8784d74-", Namespace:"calico-apiserver", SelfLink:"", UID:"b11afc1c-9b60-4f5c-a29d-78fd53879323", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"cd8784d74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-cd8784d74-2wgvs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8d6e48afbc8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:55.247867 containerd[1577]: 2025-05-16 16:42:55.225 [INFO][4213] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.132/32] ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-2wgvs" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" May 16 16:42:55.247867 containerd[1577]: 2025-05-16 16:42:55.226 [INFO][4213] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8d6e48afbc8 ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-2wgvs" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" May 16 16:42:55.247867 containerd[1577]: 2025-05-16 16:42:55.230 [INFO][4213] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-2wgvs" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" 
May 16 16:42:55.247867 containerd[1577]: 2025-05-16 16:42:55.230 [INFO][4213] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-2wgvs" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0", GenerateName:"calico-apiserver-cd8784d74-", Namespace:"calico-apiserver", SelfLink:"", UID:"b11afc1c-9b60-4f5c-a29d-78fd53879323", ResourceVersion:"836", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd8784d74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e", Pod:"calico-apiserver-cd8784d74-2wgvs", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.132/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali8d6e48afbc8", MAC:"86:0d:1c:95:92:ed", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:55.247867 containerd[1577]: 
2025-05-16 16:42:55.242 [INFO][4213] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-2wgvs" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--2wgvs-eth0" May 16 16:42:55.280638 containerd[1577]: time="2025-05-16T16:42:55.280556088Z" level=info msg="connecting to shim 9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e" address="unix:///run/containerd/s/4363dba362fca31168eafad53d990333a777c249aaf9d84188ec551f8283c358" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:55.291725 sshd[4265]: Connection closed by 10.0.0.1 port 57788 May 16 16:42:55.291284 sshd-session[4250]: pam_unix(sshd:session): session closed for user core May 16 16:42:55.295791 systemd[1]: sshd@7-10.0.0.80:22-10.0.0.1:57788.service: Deactivated successfully. May 16 16:42:55.299196 systemd[1]: session-8.scope: Deactivated successfully. May 16 16:42:55.301260 systemd-logind[1562]: Session 8 logged out. Waiting for processes to exit. May 16 16:42:55.315000 systemd[1]: Started cri-containerd-9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e.scope - libcontainer container 9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e. May 16 16:42:55.316011 systemd-logind[1562]: Removed session 8. 
May 16 16:42:55.328509 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:55.357838 containerd[1577]: time="2025-05-16T16:42:55.357800732Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd8784d74-2wgvs,Uid:b11afc1c-9b60-4f5c-a29d-78fd53879323,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e\"" May 16 16:42:55.767704 systemd-networkd[1488]: cali9d207ff8561: Gained IPv6LL May 16 16:42:55.927755 kubelet[2673]: I0516 16:42:55.927711 2673 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 16 16:42:56.087742 systemd-networkd[1488]: cali9c3cb71b896: Gained IPv6LL May 16 16:42:56.141482 kubelet[2673]: E0516 16:42:56.140945 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" 
pod="calico-system/whisker-5967895df7-wgjpx" podUID="85410909-fb8d-4393-9be5-a24a0ba7b5ef" May 16 16:42:56.343711 systemd-networkd[1488]: cali476542ade6a: Gained IPv6LL May 16 16:42:56.960422 systemd-networkd[1488]: vxlan.calico: Link UP May 16 16:42:56.960432 systemd-networkd[1488]: vxlan.calico: Gained carrier May 16 16:42:57.001430 containerd[1577]: time="2025-05-16T16:42:57.001394822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-cd8784d74-zmxbh,Uid:afac284b-1d53-4f0b-b60b-8d046a842c57,Namespace:calico-apiserver,Attempt:0,}" May 16 16:42:57.002142 containerd[1577]: time="2025-05-16T16:42:57.001449835Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-krmsj,Uid:9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4,Namespace:kube-system,Attempt:0,}" May 16 16:42:57.111763 systemd-networkd[1488]: cali8d6e48afbc8: Gained IPv6LL May 16 16:42:57.209392 systemd-networkd[1488]: calieb84edd8f17: Link UP May 16 16:42:57.211877 systemd-networkd[1488]: calieb84edd8f17: Gained carrier May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.129 [INFO][4504] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0 calico-apiserver-cd8784d74- calico-apiserver afac284b-1d53-4f0b-b60b-8d046a842c57 837 0 2025-05-16 16:42:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:cd8784d74 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s localhost calico-apiserver-cd8784d74-zmxbh eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calieb84edd8f17 [] [] }} ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-zmxbh" 
WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.130 [INFO][4504] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-zmxbh" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.157 [INFO][4536] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" HandleID="k8s-pod-network.45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Workload="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.157 [INFO][4536] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" HandleID="k8s-pod-network.45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Workload="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002ad6f0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"localhost", "pod":"calico-apiserver-cd8784d74-zmxbh", "timestamp":"2025-05-16 16:42:57.157195338 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.157 [INFO][4536] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.157 [INFO][4536] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.157 [INFO][4536] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.167 [INFO][4536] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" host="localhost" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.172 [INFO][4536] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.177 [INFO][4536] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.179 [INFO][4536] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.182 [INFO][4536] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.182 [INFO][4536] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" host="localhost" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.184 [INFO][4536] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351 May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.188 [INFO][4536] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" host="localhost" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.194 [INFO][4536] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.133/26] block=192.168.88.128/26 
handle="k8s-pod-network.45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" host="localhost" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.195 [INFO][4536] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.133/26] handle="k8s-pod-network.45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" host="localhost" May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.195 [INFO][4536] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:57.234621 containerd[1577]: 2025-05-16 16:42:57.195 [INFO][4536] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.133/26] IPv6=[] ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" HandleID="k8s-pod-network.45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Workload="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" May 16 16:42:57.235337 containerd[1577]: 2025-05-16 16:42:57.201 [INFO][4504] cni-plugin/k8s.go 418: Populated endpoint ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-zmxbh" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0", GenerateName:"calico-apiserver-cd8784d74-", Namespace:"calico-apiserver", SelfLink:"", UID:"afac284b-1d53-4f0b-b60b-8d046a842c57", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd8784d74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-apiserver-cd8784d74-zmxbh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calieb84edd8f17", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:57.235337 containerd[1577]: 2025-05-16 16:42:57.201 [INFO][4504] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.133/32] ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-zmxbh" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" May 16 16:42:57.235337 containerd[1577]: 2025-05-16 16:42:57.202 [INFO][4504] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calieb84edd8f17 ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-zmxbh" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" May 16 16:42:57.235337 containerd[1577]: 2025-05-16 16:42:57.219 [INFO][4504] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-zmxbh" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" May 16 16:42:57.235337 containerd[1577]: 2025-05-16 16:42:57.219 [INFO][4504] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-zmxbh" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0", GenerateName:"calico-apiserver-cd8784d74-", Namespace:"calico-apiserver", SelfLink:"", UID:"afac284b-1d53-4f0b-b60b-8d046a842c57", ResourceVersion:"837", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"cd8784d74", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351", Pod:"calico-apiserver-cd8784d74-zmxbh", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.88.133/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calieb84edd8f17", MAC:"46:ab:aa:c9:3d:7b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:57.235337 containerd[1577]: 2025-05-16 16:42:57.231 [INFO][4504] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" Namespace="calico-apiserver" Pod="calico-apiserver-cd8784d74-zmxbh" WorkloadEndpoint="localhost-k8s-calico--apiserver--cd8784d74--zmxbh-eth0" May 16 16:42:57.277479 containerd[1577]: time="2025-05-16T16:42:57.277370670Z" level=info msg="connecting to shim 45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351" address="unix:///run/containerd/s/9133cf0f82afd85568564a303a7040e56c0c2f777b4f45459fcfd031b70139ae" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:57.307855 systemd[1]: Started cri-containerd-45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351.scope - libcontainer container 45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351. May 16 16:42:57.323685 systemd-networkd[1488]: calie726b189e32: Link UP May 16 16:42:57.324673 systemd-networkd[1488]: calie726b189e32: Gained carrier May 16 16:42:57.345403 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.129 [INFO][4507] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-coredns--674b8bbfcf--krmsj-eth0 coredns-674b8bbfcf- kube-system 9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4 835 0 2025-05-16 16:42:23 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s localhost coredns-674b8bbfcf-krmsj eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] calie726b189e32 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-krmsj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--krmsj-" May 16 16:42:57.353427 containerd[1577]: 
2025-05-16 16:42:57.129 [INFO][4507] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-krmsj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.165 [INFO][4542] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" HandleID="k8s-pod-network.8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Workload="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.165 [INFO][4542] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" HandleID="k8s-pod-network.8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Workload="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000324480), Attrs:map[string]string{"namespace":"kube-system", "node":"localhost", "pod":"coredns-674b8bbfcf-krmsj", "timestamp":"2025-05-16 16:42:57.165191284 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.165 [INFO][4542] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.195 [INFO][4542] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.195 [INFO][4542] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost' May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.268 [INFO][4542] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" host="localhost" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.281 [INFO][4542] ipam/ipam.go 394: Looking up existing affinities for host host="localhost" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.290 [INFO][4542] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.292 [INFO][4542] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.294 [INFO][4542] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.294 [INFO][4542] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" host="localhost" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.296 [INFO][4542] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9 May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.305 [INFO][4542] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" host="localhost" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.317 [INFO][4542] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.134/26] block=192.168.88.128/26 
handle="k8s-pod-network.8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" host="localhost" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.317 [INFO][4542] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.134/26] handle="k8s-pod-network.8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" host="localhost" May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.317 [INFO][4542] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 16 16:42:57.353427 containerd[1577]: 2025-05-16 16:42:57.317 [INFO][4542] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.134/26] IPv6=[] ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" HandleID="k8s-pod-network.8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Workload="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" May 16 16:42:57.354150 containerd[1577]: 2025-05-16 16:42:57.320 [INFO][4507] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-krmsj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--krmsj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), 
Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"coredns-674b8bbfcf-krmsj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie726b189e32", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:57.354150 containerd[1577]: 2025-05-16 16:42:57.321 [INFO][4507] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.134/32] ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-krmsj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" May 16 16:42:57.354150 containerd[1577]: 2025-05-16 16:42:57.321 [INFO][4507] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie726b189e32 ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-krmsj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" May 16 16:42:57.354150 containerd[1577]: 2025-05-16 16:42:57.324 [INFO][4507] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-krmsj" 
WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" May 16 16:42:57.354150 containerd[1577]: 2025-05-16 16:42:57.325 [INFO][4507] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-krmsj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-coredns--674b8bbfcf--krmsj-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4", ResourceVersion:"835", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 23, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9", Pod:"coredns-674b8bbfcf-krmsj", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.88.134/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"calie726b189e32", MAC:"ce:cf:f2:68:66:33", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, 
StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 16 16:42:57.354150 containerd[1577]: 2025-05-16 16:42:57.340 [INFO][4507] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" Namespace="kube-system" Pod="coredns-674b8bbfcf-krmsj" WorkloadEndpoint="localhost-k8s-coredns--674b8bbfcf--krmsj-eth0" May 16 16:42:57.377344 containerd[1577]: time="2025-05-16T16:42:57.377305924Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:57.377785 containerd[1577]: time="2025-05-16T16:42:57.377726102Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390" May 16 16:42:57.382641 containerd[1577]: time="2025-05-16T16:42:57.382596075Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:57.386556 containerd[1577]: time="2025-05-16T16:42:57.386509985Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:42:57.393013 containerd[1577]: time="2025-05-16T16:42:57.392967957Z" level=info msg="connecting to shim 8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9" address="unix:///run/containerd/s/eddd14caee5c078e6abb7a36068eaf0079d74b9d5af4f8b0cc9028fc94fbb58c" namespace=k8s.io protocol=ttrpc version=3 May 16 16:42:57.393251 containerd[1577]: time="2025-05-16T16:42:57.393225550Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-cd8784d74-zmxbh,Uid:afac284b-1d53-4f0b-b60b-8d046a842c57,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351\"" May 16 16:42:57.401595 containerd[1577]: time="2025-05-16T16:42:57.401538971Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 2.154902105s" May 16 16:42:57.401595 containerd[1577]: time="2025-05-16T16:42:57.401588524Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\"" May 16 16:42:57.403799 containerd[1577]: time="2025-05-16T16:42:57.403758354Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 16 16:42:57.408393 containerd[1577]: time="2025-05-16T16:42:57.408352450Z" level=info msg="CreateContainer within sandbox \"7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}" May 16 16:42:57.425301 containerd[1577]: time="2025-05-16T16:42:57.425246824Z" level=info msg="Container 8365e4a36a86c3d722e785a974b88e0f2cace80a01ab0d6dba339c34503d2178: CDI devices from CRI Config.CDIDevices: []" May 16 16:42:57.431127 systemd[1]: Started cri-containerd-8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9.scope - libcontainer container 8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9. 
May 16 16:42:57.447471 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address May 16 16:42:57.452392 containerd[1577]: time="2025-05-16T16:42:57.452353521Z" level=info msg="CreateContainer within sandbox \"7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"8365e4a36a86c3d722e785a974b88e0f2cace80a01ab0d6dba339c34503d2178\"" May 16 16:42:57.453346 containerd[1577]: time="2025-05-16T16:42:57.453283927Z" level=info msg="StartContainer for \"8365e4a36a86c3d722e785a974b88e0f2cace80a01ab0d6dba339c34503d2178\"" May 16 16:42:57.454691 containerd[1577]: time="2025-05-16T16:42:57.454656851Z" level=info msg="connecting to shim 8365e4a36a86c3d722e785a974b88e0f2cace80a01ab0d6dba339c34503d2178" address="unix:///run/containerd/s/c87c4f9549dadbb8115ce5647f564cf6720c4ee6555e185c5aff4bfc932a3a7b" protocol=ttrpc version=3 May 16 16:42:57.478895 systemd[1]: Started cri-containerd-8365e4a36a86c3d722e785a974b88e0f2cace80a01ab0d6dba339c34503d2178.scope - libcontainer container 8365e4a36a86c3d722e785a974b88e0f2cace80a01ab0d6dba339c34503d2178. 
May 16 16:42:57.488434 containerd[1577]: time="2025-05-16T16:42:57.488385748Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-krmsj,Uid:9f415d75-aaf3-47ae-a8b2-f1ffaa36e1c4,Namespace:kube-system,Attempt:0,} returns sandbox id \"8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9\""
May 16 16:42:57.496809 containerd[1577]: time="2025-05-16T16:42:57.496758241Z" level=info msg="CreateContainer within sandbox \"8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
May 16 16:42:57.507900 containerd[1577]: time="2025-05-16T16:42:57.507824184Z" level=info msg="Container 7c200833edce1c0910aba5cbf98ffe11153262324ad5dea127c4e1df44cea054: CDI devices from CRI Config.CDIDevices: []"
May 16 16:42:57.516142 containerd[1577]: time="2025-05-16T16:42:57.516086890Z" level=info msg="CreateContainer within sandbox \"8c477d357bfe2d170ea17cde37d9968a2ab0e628b2bee73590344871eeee98d9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7c200833edce1c0910aba5cbf98ffe11153262324ad5dea127c4e1df44cea054\""
May 16 16:42:57.519124 containerd[1577]: time="2025-05-16T16:42:57.519076327Z" level=info msg="StartContainer for \"7c200833edce1c0910aba5cbf98ffe11153262324ad5dea127c4e1df44cea054\""
May 16 16:42:57.520590 containerd[1577]: time="2025-05-16T16:42:57.520541384Z" level=info msg="connecting to shim 7c200833edce1c0910aba5cbf98ffe11153262324ad5dea127c4e1df44cea054" address="unix:///run/containerd/s/eddd14caee5c078e6abb7a36068eaf0079d74b9d5af4f8b0cc9028fc94fbb58c" protocol=ttrpc version=3
May 16 16:42:57.531437 containerd[1577]: time="2025-05-16T16:42:57.531348453Z" level=info msg="StartContainer for \"8365e4a36a86c3d722e785a974b88e0f2cace80a01ab0d6dba339c34503d2178\" returns successfully"
May 16 16:42:57.554699 systemd[1]: Started cri-containerd-7c200833edce1c0910aba5cbf98ffe11153262324ad5dea127c4e1df44cea054.scope - libcontainer container 7c200833edce1c0910aba5cbf98ffe11153262324ad5dea127c4e1df44cea054.
May 16 16:42:57.587008 containerd[1577]: time="2025-05-16T16:42:57.586973184Z" level=info msg="StartContainer for \"7c200833edce1c0910aba5cbf98ffe11153262324ad5dea127c4e1df44cea054\" returns successfully"
May 16 16:42:58.001148 containerd[1577]: time="2025-05-16T16:42:58.001105638Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7685c7f4cf-98w45,Uid:e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa,Namespace:calico-system,Attempt:0,}"
May 16 16:42:58.095390 systemd-networkd[1488]: cali6c29940390a: Link UP
May 16 16:42:58.096094 systemd-networkd[1488]: cali6c29940390a: Gained carrier
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.035 [INFO][4770] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0 calico-kube-controllers-7685c7f4cf- calico-system e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa 840 0 2025-05-16 16:42:33 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7685c7f4cf projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s localhost calico-kube-controllers-7685c7f4cf-98w45 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali6c29940390a [] [] }} ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Namespace="calico-system" Pod="calico-kube-controllers-7685c7f4cf-98w45" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.035 [INFO][4770] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Namespace="calico-system" Pod="calico-kube-controllers-7685c7f4cf-98w45" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.062 [INFO][4784] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" HandleID="k8s-pod-network.4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Workload="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.062 [INFO][4784] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" HandleID="k8s-pod-network.4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Workload="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000285040), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"calico-kube-controllers-7685c7f4cf-98w45", "timestamp":"2025-05-16 16:42:58.062380267 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.062 [INFO][4784] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.062 [INFO][4784] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.062 [INFO][4784] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.068 [INFO][4784] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" host="localhost"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.073 [INFO][4784] ipam/ipam.go 394: Looking up existing affinities for host host="localhost"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.077 [INFO][4784] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.079 [INFO][4784] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.080 [INFO][4784] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.080 [INFO][4784] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" host="localhost"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.082 [INFO][4784] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.084 [INFO][4784] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" host="localhost"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.090 [INFO][4784] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.135/26] block=192.168.88.128/26 handle="k8s-pod-network.4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" host="localhost"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.090 [INFO][4784] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.135/26] handle="k8s-pod-network.4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" host="localhost"
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.090 [INFO][4784] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 16 16:42:58.107759 containerd[1577]: 2025-05-16 16:42:58.090 [INFO][4784] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.135/26] IPv6=[] ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" HandleID="k8s-pod-network.4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Workload="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0"
May 16 16:42:58.108641 containerd[1577]: 2025-05-16 16:42:58.093 [INFO][4770] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Namespace="calico-system" Pod="calico-kube-controllers-7685c7f4cf-98w45" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0", GenerateName:"calico-kube-controllers-7685c7f4cf-", Namespace:"calico-system", SelfLink:"", UID:"e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 33, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7685c7f4cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"calico-kube-controllers-7685c7f4cf-98w45", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6c29940390a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
May 16 16:42:58.108641 containerd[1577]: 2025-05-16 16:42:58.093 [INFO][4770] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.135/32] ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Namespace="calico-system" Pod="calico-kube-controllers-7685c7f4cf-98w45" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0"
May 16 16:42:58.108641 containerd[1577]: 2025-05-16 16:42:58.093 [INFO][4770] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali6c29940390a ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Namespace="calico-system" Pod="calico-kube-controllers-7685c7f4cf-98w45" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0"
May 16 16:42:58.108641 containerd[1577]: 2025-05-16 16:42:58.096 [INFO][4770] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Namespace="calico-system" Pod="calico-kube-controllers-7685c7f4cf-98w45" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0"
May 16 16:42:58.108641 containerd[1577]: 2025-05-16 16:42:58.096 [INFO][4770] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Namespace="calico-system" Pod="calico-kube-controllers-7685c7f4cf-98w45" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0", GenerateName:"calico-kube-controllers-7685c7f4cf-", Namespace:"calico-system", SelfLink:"", UID:"e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 33, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7685c7f4cf", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648", Pod:"calico-kube-controllers-7685c7f4cf-98w45", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.88.135/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali6c29940390a", MAC:"ce:2f:be:a2:ef:59", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
May 16 16:42:58.108641 containerd[1577]: 2025-05-16 16:42:58.104 [INFO][4770] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" Namespace="calico-system" Pod="calico-kube-controllers-7685c7f4cf-98w45" WorkloadEndpoint="localhost-k8s-calico--kube--controllers--7685c7f4cf--98w45-eth0"
May 16 16:42:58.154561 containerd[1577]: time="2025-05-16T16:42:58.154502790Z" level=info msg="connecting to shim 4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648" address="unix:///run/containerd/s/45c0d4eef3cda09c04640c43e0a4f0fb6def1f5b8a3819284feb547e43d335a7" namespace=k8s.io protocol=ttrpc version=3
May 16 16:42:58.157101 kubelet[2673]: I0516 16:42:58.157053 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-krmsj" podStartSLOduration=35.157036653 podStartE2EDuration="35.157036653s" podCreationTimestamp="2025-05-16 16:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-16 16:42:58.156415748 +0000 UTC m=+42.260301030" watchObservedRunningTime="2025-05-16 16:42:58.157036653 +0000 UTC m=+42.260921935"
May 16 16:42:58.200799 systemd[1]: Started cri-containerd-4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648.scope - libcontainer container 4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648.
May 16 16:42:58.214489 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 16 16:42:58.244290 containerd[1577]: time="2025-05-16T16:42:58.244250520Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7685c7f4cf-98w45,Uid:e3efb2f6-048d-46a8-9ddf-da2b1ef69bfa,Namespace:calico-system,Attempt:0,} returns sandbox id \"4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648\""
May 16 16:42:58.519764 systemd-networkd[1488]: calie726b189e32: Gained IPv6LL
May 16 16:42:58.839781 systemd-networkd[1488]: vxlan.calico: Gained IPv6LL
May 16 16:42:59.001593 containerd[1577]: time="2025-05-16T16:42:59.001489822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-fcfrt,Uid:3208f219-fe99-4b68-b23c-e5f9b103f8b4,Namespace:calico-system,Attempt:0,}"
May 16 16:42:59.136472 systemd-networkd[1488]: cali7f8b0f1123c: Link UP
May 16 16:42:59.136727 systemd-networkd[1488]: cali7f8b0f1123c: Gained carrier
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.060 [INFO][4852] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0 goldmane-78d55f7ddc- calico-system 3208f219-fe99-4b68-b23c-e5f9b103f8b4 839 0 2025-05-16 16:42:32 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s localhost goldmane-78d55f7ddc-fcfrt eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali7f8b0f1123c [] [] }} ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fcfrt" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--fcfrt-"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.060 [INFO][4852] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fcfrt" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.093 [INFO][4867] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" HandleID="k8s-pod-network.f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Workload="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.094 [INFO][4867] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" HandleID="k8s-pod-network.f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Workload="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00027b4e0), Attrs:map[string]string{"namespace":"calico-system", "node":"localhost", "pod":"goldmane-78d55f7ddc-fcfrt", "timestamp":"2025-05-16 16:42:59.093820739 +0000 UTC"}, Hostname:"localhost", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.094 [INFO][4867] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.094 [INFO][4867] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.094 [INFO][4867] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'localhost'
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.100 [INFO][4867] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" host="localhost"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.104 [INFO][4867] ipam/ipam.go 394: Looking up existing affinities for host host="localhost"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.108 [INFO][4867] ipam/ipam.go 511: Trying affinity for 192.168.88.128/26 host="localhost"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.110 [INFO][4867] ipam/ipam.go 158: Attempting to load block cidr=192.168.88.128/26 host="localhost"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.112 [INFO][4867] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.88.128/26 host="localhost"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.112 [INFO][4867] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.88.128/26 handle="k8s-pod-network.f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" host="localhost"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.113 [INFO][4867] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.123 [INFO][4867] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.88.128/26 handle="k8s-pod-network.f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" host="localhost"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.130 [INFO][4867] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.88.136/26] block=192.168.88.128/26 handle="k8s-pod-network.f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" host="localhost"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.130 [INFO][4867] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.88.136/26] handle="k8s-pod-network.f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" host="localhost"
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.130 [INFO][4867] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 16 16:42:59.158828 containerd[1577]: 2025-05-16 16:42:59.130 [INFO][4867] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.88.136/26] IPv6=[] ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" HandleID="k8s-pod-network.f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Workload="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0"
May 16 16:42:59.160153 containerd[1577]: 2025-05-16 16:42:59.133 [INFO][4852] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fcfrt" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"3208f219-fe99-4b68-b23c-e5f9b103f8b4", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 32, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"", Pod:"goldmane-78d55f7ddc-fcfrt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7f8b0f1123c", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
May 16 16:42:59.160153 containerd[1577]: 2025-05-16 16:42:59.133 [INFO][4852] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.88.136/32] ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fcfrt" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0"
May 16 16:42:59.160153 containerd[1577]: 2025-05-16 16:42:59.133 [INFO][4852] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7f8b0f1123c ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fcfrt" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0"
May 16 16:42:59.160153 containerd[1577]: 2025-05-16 16:42:59.136 [INFO][4852] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fcfrt" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0"
May 16 16:42:59.160153 containerd[1577]: 2025-05-16 16:42:59.138 [INFO][4852] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fcfrt" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"3208f219-fe99-4b68-b23c-e5f9b103f8b4", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2025, time.May, 16, 16, 42, 32, 0, time.Local), DeletionTimestamp:<nil>, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"localhost", ContainerID:"f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b", Pod:"goldmane-78d55f7ddc-fcfrt", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.88.136/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali7f8b0f1123c", MAC:"66:5b:43:af:54:b7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
May 16 16:42:59.160153 containerd[1577]: 2025-05-16 16:42:59.155 [INFO][4852] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" Namespace="calico-system" Pod="goldmane-78d55f7ddc-fcfrt" WorkloadEndpoint="localhost-k8s-goldmane--78d55f7ddc--fcfrt-eth0"
May 16 16:42:59.160877 systemd-networkd[1488]: calieb84edd8f17: Gained IPv6LL
May 16 16:42:59.261246 containerd[1577]: time="2025-05-16T16:42:59.260937275Z" level=info msg="connecting to shim f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b" address="unix:///run/containerd/s/0b481218dcb34bf13bdd6220c15bb2e62bb872f44574fe38e32c951990cdc623" namespace=k8s.io protocol=ttrpc version=3
May 16 16:42:59.294737 systemd[1]: Started cri-containerd-f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b.scope - libcontainer container f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b.
May 16 16:42:59.309916 systemd-resolved[1405]: Failed to determine the local hostname and LLMNR/mDNS names, ignoring: No such device or address
May 16 16:42:59.372152 containerd[1577]: time="2025-05-16T16:42:59.372111473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-fcfrt,Uid:3208f219-fe99-4b68-b23c-e5f9b103f8b4,Namespace:calico-system,Attempt:0,} returns sandbox id \"f3586d0b54d05a26d93f8117ce1f66b9b17483d34b060d4006a42882fc9be99b\""
May 16 16:42:59.544901 systemd-networkd[1488]: cali6c29940390a: Gained IPv6LL
May 16 16:42:59.790867 containerd[1577]: time="2025-05-16T16:42:59.790811083Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:59.791982 containerd[1577]: time="2025-05-16T16:42:59.791949879Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431"
May 16 16:42:59.793451 containerd[1577]: time="2025-05-16T16:42:59.793406941Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:59.811498 containerd[1577]: time="2025-05-16T16:42:59.811385951Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:42:59.812011 containerd[1577]: time="2025-05-16T16:42:59.811977240Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 2.408181226s"
May 16 16:42:59.812053 containerd[1577]: time="2025-05-16T16:42:59.812015171Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\""
May 16 16:42:59.813019 containerd[1577]: time="2025-05-16T16:42:59.812954212Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\""
May 16 16:42:59.897890 containerd[1577]: time="2025-05-16T16:42:59.897846658Z" level=info msg="CreateContainer within sandbox \"9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 16 16:42:59.907502 containerd[1577]: time="2025-05-16T16:42:59.906789982Z" level=info msg="Container fc702dcca95fb950be698c46745b9ae5d20a37cc4745c4002eb1cd2e1611fbf9: CDI devices from CRI Config.CDIDevices: []"
May 16 16:42:59.914290 containerd[1577]: time="2025-05-16T16:42:59.914256255Z" level=info msg="CreateContainer within sandbox \"9853e6f745160ec9184151fa4f6e9dc95fe71beee803c2e58f05eed15e7c6c3e\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"fc702dcca95fb950be698c46745b9ae5d20a37cc4745c4002eb1cd2e1611fbf9\""
May 16 16:42:59.914914 containerd[1577]: time="2025-05-16T16:42:59.914880715Z" level=info msg="StartContainer for \"fc702dcca95fb950be698c46745b9ae5d20a37cc4745c4002eb1cd2e1611fbf9\""
May 16 16:42:59.915949 containerd[1577]: time="2025-05-16T16:42:59.915925004Z" level=info msg="connecting to shim fc702dcca95fb950be698c46745b9ae5d20a37cc4745c4002eb1cd2e1611fbf9" address="unix:///run/containerd/s/4363dba362fca31168eafad53d990333a777c249aaf9d84188ec551f8283c358" protocol=ttrpc version=3
May 16 16:42:59.938711 systemd[1]: Started cri-containerd-fc702dcca95fb950be698c46745b9ae5d20a37cc4745c4002eb1cd2e1611fbf9.scope - libcontainer container fc702dcca95fb950be698c46745b9ae5d20a37cc4745c4002eb1cd2e1611fbf9.
May 16 16:42:59.986394 containerd[1577]: time="2025-05-16T16:42:59.986353863Z" level=info msg="StartContainer for \"fc702dcca95fb950be698c46745b9ae5d20a37cc4745c4002eb1cd2e1611fbf9\" returns successfully"
May 16 16:43:00.177983 kubelet[2673]: I0516 16:43:00.177910 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cd8784d74-2wgvs" podStartSLOduration=24.724024668 podStartE2EDuration="29.177893379s" podCreationTimestamp="2025-05-16 16:42:31 +0000 UTC" firstStartedPulling="2025-05-16 16:42:55.358972289 +0000 UTC m=+39.462857571" lastFinishedPulling="2025-05-16 16:42:59.812841 +0000 UTC m=+43.916726282" observedRunningTime="2025-05-16 16:43:00.177363486 +0000 UTC m=+44.281248768" watchObservedRunningTime="2025-05-16 16:43:00.177893379 +0000 UTC m=+44.281778661"
May 16 16:43:00.305832 systemd[1]: Started sshd@8-10.0.0.80:22-10.0.0.1:41556.service - OpenSSH per-connection server daemon (10.0.0.1:41556).
May 16 16:43:00.367292 containerd[1577]: time="2025-05-16T16:43:00.367244672Z" level=info msg="ImageUpdate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 16 16:43:00.368206 containerd[1577]: time="2025-05-16T16:43:00.368185716Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=77"
May 16 16:43:00.370024 containerd[1577]: time="2025-05-16T16:43:00.369998827Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 557.018596ms"
May 16 16:43:00.370082 containerd[1577]: time="2025-05-16T16:43:00.370025988Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\""
May 16 16:43:00.370112 sshd[4974]: Accepted publickey for core from 10.0.0.1 port 41556 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:00.370859 containerd[1577]: time="2025-05-16T16:43:00.370842420Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\""
May 16 16:43:00.371985 sshd-session[4974]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:00.378742 systemd-logind[1562]: New session 9 of user core.
May 16 16:43:00.383715 systemd[1]: Started session-9.scope - Session 9 of User core.
May 16 16:43:00.385382 containerd[1577]: time="2025-05-16T16:43:00.385337475Z" level=info msg="CreateContainer within sandbox \"45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}"
May 16 16:43:00.393598 containerd[1577]: time="2025-05-16T16:43:00.393157782Z" level=info msg="Container f525d3906513449963370a08b23311eb0eed65526c2cae607c9ddb41c07a3c21: CDI devices from CRI Config.CDIDevices: []"
May 16 16:43:00.402800 containerd[1577]: time="2025-05-16T16:43:00.402761565Z" level=info msg="CreateContainer within sandbox \"45f75cf81d8dc6c053eda6e871b99a09e63ae55915f43d1a4e44838837f7d351\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"f525d3906513449963370a08b23311eb0eed65526c2cae607c9ddb41c07a3c21\""
May 16 16:43:00.403324 containerd[1577]: time="2025-05-16T16:43:00.403299543Z" level=info msg="StartContainer for \"f525d3906513449963370a08b23311eb0eed65526c2cae607c9ddb41c07a3c21\""
May 16 16:43:00.404836 containerd[1577]: time="2025-05-16T16:43:00.404781984Z" level=info msg="connecting to shim f525d3906513449963370a08b23311eb0eed65526c2cae607c9ddb41c07a3c21" address="unix:///run/containerd/s/9133cf0f82afd85568564a303a7040e56c0c2f777b4f45459fcfd031b70139ae" protocol=ttrpc version=3
May 16 16:43:00.435043 systemd[1]: Started cri-containerd-f525d3906513449963370a08b23311eb0eed65526c2cae607c9ddb41c07a3c21.scope - libcontainer container f525d3906513449963370a08b23311eb0eed65526c2cae607c9ddb41c07a3c21.
May 16 16:43:00.507307 containerd[1577]: time="2025-05-16T16:43:00.507272557Z" level=info msg="StartContainer for \"f525d3906513449963370a08b23311eb0eed65526c2cae607c9ddb41c07a3c21\" returns successfully" May 16 16:43:00.543762 sshd[4976]: Connection closed by 10.0.0.1 port 41556 May 16 16:43:00.544181 sshd-session[4974]: pam_unix(sshd:session): session closed for user core May 16 16:43:00.547061 systemd[1]: sshd@8-10.0.0.80:22-10.0.0.1:41556.service: Deactivated successfully. May 16 16:43:00.549064 systemd[1]: session-9.scope: Deactivated successfully. May 16 16:43:00.551904 systemd-logind[1562]: Session 9 logged out. Waiting for processes to exit. May 16 16:43:00.553972 systemd-logind[1562]: Removed session 9. May 16 16:43:00.952743 systemd-networkd[1488]: cali7f8b0f1123c: Gained IPv6LL May 16 16:43:01.183878 kubelet[2673]: I0516 16:43:01.183768 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-cd8784d74-zmxbh" podStartSLOduration=27.215238446 podStartE2EDuration="30.183718139s" podCreationTimestamp="2025-05-16 16:42:31 +0000 UTC" firstStartedPulling="2025-05-16 16:42:57.402303045 +0000 UTC m=+41.506188327" lastFinishedPulling="2025-05-16 16:43:00.370782738 +0000 UTC m=+44.474668020" observedRunningTime="2025-05-16 16:43:01.181374744 +0000 UTC m=+45.285260026" watchObservedRunningTime="2025-05-16 16:43:01.183718139 +0000 UTC m=+45.287603421" May 16 16:43:02.169376 containerd[1577]: time="2025-05-16T16:43:02.169334707Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:43:02.170027 containerd[1577]: time="2025-05-16T16:43:02.170001569Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639" May 16 16:43:02.171126 containerd[1577]: time="2025-05-16T16:43:02.171101542Z" level=info msg="ImageCreate event 
name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:43:02.173181 containerd[1577]: time="2025-05-16T16:43:02.173118344Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:43:02.173594 containerd[1577]: time="2025-05-16T16:43:02.173560954Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 1.802629257s" May 16 16:43:02.173638 containerd[1577]: time="2025-05-16T16:43:02.173597021Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\"" May 16 16:43:02.174497 containerd[1577]: time="2025-05-16T16:43:02.174463357Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 16 16:43:02.177934 containerd[1577]: time="2025-05-16T16:43:02.177911534Z" level=info msg="CreateContainer within sandbox \"7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}" May 16 16:43:02.188509 containerd[1577]: time="2025-05-16T16:43:02.188480447Z" level=info msg="Container c6cb9e3d155fda0e0c1b6e3ef17b50515af723897be62c2d1674f668518558f1: CDI devices from CRI Config.CDIDevices: []" May 16 16:43:02.198082 containerd[1577]: time="2025-05-16T16:43:02.198038323Z" level=info msg="CreateContainer within sandbox 
\"7a30d2af3c6885dfabbc9cc3ccce9a090b15f3ed4e78924668cf4387b86c3447\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"c6cb9e3d155fda0e0c1b6e3ef17b50515af723897be62c2d1674f668518558f1\"" May 16 16:43:02.199620 containerd[1577]: time="2025-05-16T16:43:02.198719801Z" level=info msg="StartContainer for \"c6cb9e3d155fda0e0c1b6e3ef17b50515af723897be62c2d1674f668518558f1\"" May 16 16:43:02.200437 containerd[1577]: time="2025-05-16T16:43:02.200386106Z" level=info msg="connecting to shim c6cb9e3d155fda0e0c1b6e3ef17b50515af723897be62c2d1674f668518558f1" address="unix:///run/containerd/s/c87c4f9549dadbb8115ce5647f564cf6720c4ee6555e185c5aff4bfc932a3a7b" protocol=ttrpc version=3 May 16 16:43:02.264716 systemd[1]: Started cri-containerd-c6cb9e3d155fda0e0c1b6e3ef17b50515af723897be62c2d1674f668518558f1.scope - libcontainer container c6cb9e3d155fda0e0c1b6e3ef17b50515af723897be62c2d1674f668518558f1. May 16 16:43:02.310088 containerd[1577]: time="2025-05-16T16:43:02.310044347Z" level=info msg="StartContainer for \"c6cb9e3d155fda0e0c1b6e3ef17b50515af723897be62c2d1674f668518558f1\" returns successfully" May 16 16:43:03.090641 kubelet[2673]: I0516 16:43:03.090606 2673 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0 May 16 16:43:03.091741 kubelet[2673]: I0516 16:43:03.091722 2673 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock May 16 16:43:03.189061 kubelet[2673]: I0516 16:43:03.188992 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-bb6pt" podStartSLOduration=23.259775895 podStartE2EDuration="30.188976152s" podCreationTimestamp="2025-05-16 16:42:33 +0000 UTC" firstStartedPulling="2025-05-16 16:42:55.245117968 +0000 UTC m=+39.349003250" lastFinishedPulling="2025-05-16 
16:43:02.174318225 +0000 UTC m=+46.278203507" observedRunningTime="2025-05-16 16:43:03.188055966 +0000 UTC m=+47.291941278" watchObservedRunningTime="2025-05-16 16:43:03.188976152 +0000 UTC m=+47.292861434" May 16 16:43:05.315029 containerd[1577]: time="2025-05-16T16:43:05.314976527Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:43:05.316172 containerd[1577]: time="2025-05-16T16:43:05.316136182Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512" May 16 16:43:05.317474 containerd[1577]: time="2025-05-16T16:43:05.317433795Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:43:05.319530 containerd[1577]: time="2025-05-16T16:43:05.319491566Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 16 16:43:05.320200 containerd[1577]: time="2025-05-16T16:43:05.320165349Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 3.145659082s" May 16 16:43:05.320250 containerd[1577]: time="2025-05-16T16:43:05.320199633Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\"" May 16 16:43:05.321841 containerd[1577]: 
time="2025-05-16T16:43:05.321815414Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 16 16:43:05.349146 containerd[1577]: time="2025-05-16T16:43:05.349095290Z" level=info msg="CreateContainer within sandbox \"4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}" May 16 16:43:05.361532 containerd[1577]: time="2025-05-16T16:43:05.360700067Z" level=info msg="Container 15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa: CDI devices from CRI Config.CDIDevices: []" May 16 16:43:05.369828 containerd[1577]: time="2025-05-16T16:43:05.369784636Z" level=info msg="CreateContainer within sandbox \"4b218bf23fe2d78f74ea3f44ab200e379e2643fe0b29838700ac5eeffc3d5648\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa\"" May 16 16:43:05.370241 containerd[1577]: time="2025-05-16T16:43:05.370216586Z" level=info msg="StartContainer for \"15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa\"" May 16 16:43:05.371254 containerd[1577]: time="2025-05-16T16:43:05.371225498Z" level=info msg="connecting to shim 15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa" address="unix:///run/containerd/s/45c0d4eef3cda09c04640c43e0a4f0fb6def1f5b8a3819284feb547e43d335a7" protocol=ttrpc version=3 May 16 16:43:05.395691 systemd[1]: Started cri-containerd-15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa.scope - libcontainer container 15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa. 
May 16 16:43:05.450388 containerd[1577]: time="2025-05-16T16:43:05.450350917Z" level=info msg="StartContainer for \"15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa\" returns successfully" May 16 16:43:05.561364 systemd[1]: Started sshd@9-10.0.0.80:22-10.0.0.1:41560.service - OpenSSH per-connection server daemon (10.0.0.1:41560). May 16 16:43:05.638154 sshd[5128]: Accepted publickey for core from 10.0.0.1 port 41560 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:43:05.639838 sshd-session[5128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:05.646824 systemd-logind[1562]: New session 10 of user core. May 16 16:43:05.651749 systemd[1]: Started session-10.scope - Session 10 of User core. May 16 16:43:05.771852 containerd[1577]: time="2025-05-16T16:43:05.771798769Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:43:05.774448 containerd[1577]: time="2025-05-16T16:43:05.774419565Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 16 16:43:05.781643 containerd[1577]: time="2025-05-16T16:43:05.781601075Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:43:05.781921 kubelet[2673]: E0516 16:43:05.781872 2673 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 16:43:05.782359 kubelet[2673]: E0516 16:43:05.781930 2673 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 16 16:43:05.783302 kubelet[2673]: E0516 16:43:05.783165 2673 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7ztc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-fcfrt_calico-system(3208f219-fe99-4b68-b23c-e5f9b103f8b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:43:05.784584 kubelet[2673]: E0516 16:43:05.784526 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fcfrt" podUID="3208f219-fe99-4b68-b23c-e5f9b103f8b4" May 16 16:43:05.796852 sshd[5130]: Connection 
closed by 10.0.0.1 port 41560 May 16 16:43:05.797226 sshd-session[5128]: pam_unix(sshd:session): session closed for user core May 16 16:43:05.803021 systemd[1]: sshd@9-10.0.0.80:22-10.0.0.1:41560.service: Deactivated successfully. May 16 16:43:05.805084 systemd[1]: session-10.scope: Deactivated successfully. May 16 16:43:05.805900 systemd-logind[1562]: Session 10 logged out. Waiting for processes to exit. May 16 16:43:05.807069 systemd-logind[1562]: Removed session 10. May 16 16:43:06.204008 kubelet[2673]: E0516 16:43:06.203949 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fcfrt" podUID="3208f219-fe99-4b68-b23c-e5f9b103f8b4" May 16 16:43:06.220089 kubelet[2673]: I0516 16:43:06.220017 2673 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7685c7f4cf-98w45" podStartSLOduration=26.143716947 podStartE2EDuration="33.220000588s" podCreationTimestamp="2025-05-16 16:42:33 +0000 UTC" firstStartedPulling="2025-05-16 16:42:58.245374648 +0000 UTC m=+42.349259930" lastFinishedPulling="2025-05-16 16:43:05.321658289 +0000 UTC m=+49.425543571" observedRunningTime="2025-05-16 16:43:06.217851035 +0000 UTC m=+50.321736317" watchObservedRunningTime="2025-05-16 16:43:06.220000588 +0000 UTC m=+50.323885860" May 16 16:43:07.258656 containerd[1577]: time="2025-05-16T16:43:07.258417839Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa\" id:\"19161185632fafe237edab3de47376baaff36265a4626b0c1eaae48d257552c4\" pid:5164 exited_at:{seconds:1747413787 nanos:258033978}" May 16 16:43:08.001215 containerd[1577]: time="2025-05-16T16:43:08.001159558Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 16 16:43:08.611283 containerd[1577]: time="2025-05-16T16:43:08.611226887Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:43:08.612438 containerd[1577]: time="2025-05-16T16:43:08.612403343Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:43:08.612492 containerd[1577]: time="2025-05-16T16:43:08.612470860Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 16 16:43:08.612713 kubelet[2673]: E0516 16:43:08.612633 2673 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:43:08.612713 
kubelet[2673]: E0516 16:43:08.612697 2673 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 16 16:43:08.613061 kubelet[2673]: E0516 16:43:08.612815 2673 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:aec8e0cd5a7941d98f9fb853e513af3b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6lf7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},Startu
pProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5967895df7-wgjpx_calico-system(85410909-fb8d-4393-9be5-a24a0ba7b5ef): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:43:08.614858 containerd[1577]: time="2025-05-16T16:43:08.614818373Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 16 16:43:08.856722 containerd[1577]: time="2025-05-16T16:43:08.856664314Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 16 16:43:08.858366 containerd[1577]: time="2025-05-16T16:43:08.858328364Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 16 16:43:08.858436 containerd[1577]: time="2025-05-16T16:43:08.858407272Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 16 16:43:08.858676 kubelet[2673]: E0516 16:43:08.858620 2673 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:43:08.858783 kubelet[2673]: E0516 16:43:08.858688 2673 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 16 16:43:08.858908 kubelet[2673]: E0516 16:43:08.858857 2673 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lf7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5967895df7-wgjpx_calico-system(85410909-fb8d-4393-9be5-a24a0ba7b5ef): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 16 16:43:08.860062 kubelet[2673]: E0516 16:43:08.859999 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5967895df7-wgjpx" podUID="85410909-fb8d-4393-9be5-a24a0ba7b5ef" May 16 16:43:10.813146 systemd[1]: Started sshd@10-10.0.0.80:22-10.0.0.1:38796.service - OpenSSH per-connection server daemon (10.0.0.1:38796). May 16 16:43:10.994117 sshd[5183]: Accepted publickey for core from 10.0.0.1 port 38796 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:43:10.995598 sshd-session[5183]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:11.000019 systemd-logind[1562]: New session 11 of user core. May 16 16:43:11.010732 systemd[1]: Started session-11.scope - Session 11 of User core. 
May 16 16:43:11.123777 sshd[5185]: Connection closed by 10.0.0.1 port 38796 May 16 16:43:11.124235 sshd-session[5183]: pam_unix(sshd:session): session closed for user core May 16 16:43:11.134194 systemd[1]: sshd@10-10.0.0.80:22-10.0.0.1:38796.service: Deactivated successfully. May 16 16:43:11.136066 systemd[1]: session-11.scope: Deactivated successfully. May 16 16:43:11.136751 systemd-logind[1562]: Session 11 logged out. Waiting for processes to exit. May 16 16:43:11.140305 systemd[1]: Started sshd@11-10.0.0.80:22-10.0.0.1:38804.service - OpenSSH per-connection server daemon (10.0.0.1:38804). May 16 16:43:11.140888 systemd-logind[1562]: Removed session 11. May 16 16:43:11.193389 sshd[5199]: Accepted publickey for core from 10.0.0.1 port 38804 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:43:11.194824 sshd-session[5199]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:11.198858 systemd-logind[1562]: New session 12 of user core. May 16 16:43:11.207714 systemd[1]: Started session-12.scope - Session 12 of User core. May 16 16:43:11.344606 sshd[5201]: Connection closed by 10.0.0.1 port 38804 May 16 16:43:11.344006 sshd-session[5199]: pam_unix(sshd:session): session closed for user core May 16 16:43:11.354740 systemd[1]: sshd@11-10.0.0.80:22-10.0.0.1:38804.service: Deactivated successfully. May 16 16:43:11.358078 systemd[1]: session-12.scope: Deactivated successfully. May 16 16:43:11.359529 systemd-logind[1562]: Session 12 logged out. Waiting for processes to exit. May 16 16:43:11.364691 systemd[1]: Started sshd@12-10.0.0.80:22-10.0.0.1:38816.service - OpenSSH per-connection server daemon (10.0.0.1:38816). May 16 16:43:11.366090 systemd-logind[1562]: Removed session 12. 
May 16 16:43:11.417081 sshd[5213]: Accepted publickey for core from 10.0.0.1 port 38816 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:43:11.418709 sshd-session[5213]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:11.423426 systemd-logind[1562]: New session 13 of user core. May 16 16:43:11.436733 systemd[1]: Started session-13.scope - Session 13 of User core. May 16 16:43:11.554975 sshd[5215]: Connection closed by 10.0.0.1 port 38816 May 16 16:43:11.555302 sshd-session[5213]: pam_unix(sshd:session): session closed for user core May 16 16:43:11.560260 systemd[1]: sshd@12-10.0.0.80:22-10.0.0.1:38816.service: Deactivated successfully. May 16 16:43:11.562552 systemd[1]: session-13.scope: Deactivated successfully. May 16 16:43:11.563360 systemd-logind[1562]: Session 13 logged out. Waiting for processes to exit. May 16 16:43:11.564898 systemd-logind[1562]: Removed session 13. May 16 16:43:16.570643 systemd[1]: Started sshd@13-10.0.0.80:22-10.0.0.1:35144.service - OpenSSH per-connection server daemon (10.0.0.1:35144). May 16 16:43:16.609924 sshd[5235]: Accepted publickey for core from 10.0.0.1 port 35144 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM May 16 16:43:16.611281 sshd-session[5235]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 16 16:43:16.615847 systemd-logind[1562]: New session 14 of user core. May 16 16:43:16.626693 systemd[1]: Started session-14.scope - Session 14 of User core. May 16 16:43:16.737049 sshd[5237]: Connection closed by 10.0.0.1 port 35144 May 16 16:43:16.737332 sshd-session[5235]: pam_unix(sshd:session): session closed for user core May 16 16:43:16.741136 systemd[1]: sshd@13-10.0.0.80:22-10.0.0.1:35144.service: Deactivated successfully. May 16 16:43:16.743190 systemd[1]: session-14.scope: Deactivated successfully. May 16 16:43:16.744053 systemd-logind[1562]: Session 14 logged out. Waiting for processes to exit. 
May 16 16:43:16.745271 systemd-logind[1562]: Removed session 14.
May 16 16:43:18.001449 containerd[1577]: time="2025-05-16T16:43:18.001390489Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 16 16:43:18.298430 containerd[1577]: time="2025-05-16T16:43:18.298297344Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 16 16:43:18.299493 containerd[1577]: time="2025-05-16T16:43:18.299420931Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 16 16:43:18.299493 containerd[1577]: time="2025-05-16T16:43:18.299499829Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 16 16:43:18.299719 kubelet[2673]: E0516 16:43:18.299653 2673 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 16 16:43:18.299719 kubelet[2673]: E0516 16:43:18.299705 2673 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 16 16:43:18.300114 kubelet[2673]: E0516 16:43:18.299844 2673 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7ztc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-fcfrt_calico-system(3208f219-fe99-4b68-b23c-e5f9b103f8b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 16 16:43:18.301043 kubelet[2673]: E0516 16:43:18.301004 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fcfrt" podUID="3208f219-fe99-4b68-b23c-e5f9b103f8b4"
May 16 16:43:21.002147 kubelet[2673]: E0516 16:43:21.001999 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5967895df7-wgjpx" podUID="85410909-fb8d-4393-9be5-a24a0ba7b5ef"
May 16 16:43:21.753666 systemd[1]: Started sshd@14-10.0.0.80:22-10.0.0.1:35156.service - OpenSSH per-connection server daemon (10.0.0.1:35156).
May 16 16:43:21.801234 sshd[5258]: Accepted publickey for core from 10.0.0.1 port 35156 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:21.802903 sshd-session[5258]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:21.807491 systemd-logind[1562]: New session 15 of user core.
May 16 16:43:21.812724 systemd[1]: Started session-15.scope - Session 15 of User core.
May 16 16:43:21.932216 sshd[5260]: Connection closed by 10.0.0.1 port 35156
May 16 16:43:21.932594 sshd-session[5258]: pam_unix(sshd:session): session closed for user core
May 16 16:43:21.939402 systemd[1]: sshd@14-10.0.0.80:22-10.0.0.1:35156.service: Deactivated successfully.
May 16 16:43:21.942256 systemd[1]: session-15.scope: Deactivated successfully.
May 16 16:43:21.943166 systemd-logind[1562]: Session 15 logged out. Waiting for processes to exit.
May 16 16:43:21.944632 systemd-logind[1562]: Removed session 15.
May 16 16:43:23.207465 containerd[1577]: time="2025-05-16T16:43:23.207396203Z" level=info msg="TaskExit event in podsandbox handler container_id:\"4eb6703121c08a845b17c06894a489f9087012da87f936c1b20ef75fe218bcf8\" id:\"966efea4f5ff46571ab68cfd867f2ac22d16d233884e8f692753d3895a74e434\" pid:5284 exited_at:{seconds:1747413803 nanos:206994910}"
May 16 16:43:26.949814 systemd[1]: Started sshd@15-10.0.0.80:22-10.0.0.1:46100.service - OpenSSH per-connection server daemon (10.0.0.1:46100).
May 16 16:43:27.385134 sshd[5301]: Accepted publickey for core from 10.0.0.1 port 46100 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:27.386947 sshd-session[5301]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:27.391854 systemd-logind[1562]: New session 16 of user core.
May 16 16:43:27.400699 systemd[1]: Started session-16.scope - Session 16 of User core.
May 16 16:43:27.528016 sshd[5303]: Connection closed by 10.0.0.1 port 46100
May 16 16:43:27.528336 sshd-session[5301]: pam_unix(sshd:session): session closed for user core
May 16 16:43:27.532842 systemd[1]: sshd@15-10.0.0.80:22-10.0.0.1:46100.service: Deactivated successfully.
May 16 16:43:27.534917 systemd[1]: session-16.scope: Deactivated successfully.
May 16 16:43:27.535727 systemd-logind[1562]: Session 16 logged out. Waiting for processes to exit.
May 16 16:43:27.536907 systemd-logind[1562]: Removed session 16.
May 16 16:43:29.001319 kubelet[2673]: E0516 16:43:29.001261 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fcfrt" podUID="3208f219-fe99-4b68-b23c-e5f9b103f8b4"
May 16 16:43:32.545639 systemd[1]: Started sshd@16-10.0.0.80:22-10.0.0.1:46114.service - OpenSSH per-connection server daemon (10.0.0.1:46114).
May 16 16:43:32.586507 sshd[5317]: Accepted publickey for core from 10.0.0.1 port 46114 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:32.588046 sshd-session[5317]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:32.592849 systemd-logind[1562]: New session 17 of user core.
May 16 16:43:32.600791 systemd[1]: Started session-17.scope - Session 17 of User core.
May 16 16:43:32.721312 sshd[5319]: Connection closed by 10.0.0.1 port 46114
May 16 16:43:32.721698 sshd-session[5317]: pam_unix(sshd:session): session closed for user core
May 16 16:43:32.731638 systemd[1]: sshd@16-10.0.0.80:22-10.0.0.1:46114.service: Deactivated successfully.
May 16 16:43:32.733745 systemd[1]: session-17.scope: Deactivated successfully.
May 16 16:43:32.734561 systemd-logind[1562]: Session 17 logged out. Waiting for processes to exit.
May 16 16:43:32.738057 systemd[1]: Started sshd@17-10.0.0.80:22-10.0.0.1:46120.service - OpenSSH per-connection server daemon (10.0.0.1:46120).
May 16 16:43:32.738664 systemd-logind[1562]: Removed session 17.
May 16 16:43:32.791003 sshd[5332]: Accepted publickey for core from 10.0.0.1 port 46120 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:32.792617 sshd-session[5332]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:32.797040 systemd-logind[1562]: New session 18 of user core.
May 16 16:43:32.805706 systemd[1]: Started session-18.scope - Session 18 of User core.
May 16 16:43:33.142947 sshd[5334]: Connection closed by 10.0.0.1 port 46120
May 16 16:43:33.143596 sshd-session[5332]: pam_unix(sshd:session): session closed for user core
May 16 16:43:33.152517 systemd[1]: sshd@17-10.0.0.80:22-10.0.0.1:46120.service: Deactivated successfully.
May 16 16:43:33.154895 systemd[1]: session-18.scope: Deactivated successfully.
May 16 16:43:33.155834 systemd-logind[1562]: Session 18 logged out. Waiting for processes to exit.
May 16 16:43:33.159066 systemd[1]: Started sshd@18-10.0.0.80:22-10.0.0.1:46128.service - OpenSSH per-connection server daemon (10.0.0.1:46128).
May 16 16:43:33.159941 systemd-logind[1562]: Removed session 18.
May 16 16:43:33.208594 sshd[5345]: Accepted publickey for core from 10.0.0.1 port 46128 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:33.209987 sshd-session[5345]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:33.214787 systemd-logind[1562]: New session 19 of user core.
May 16 16:43:33.226742 systemd[1]: Started session-19.scope - Session 19 of User core.
May 16 16:43:34.128124 sshd[5347]: Connection closed by 10.0.0.1 port 46128
May 16 16:43:34.129317 sshd-session[5345]: pam_unix(sshd:session): session closed for user core
May 16 16:43:34.141515 systemd[1]: sshd@18-10.0.0.80:22-10.0.0.1:46128.service: Deactivated successfully.
May 16 16:43:34.145098 systemd[1]: session-19.scope: Deactivated successfully.
May 16 16:43:34.146101 systemd-logind[1562]: Session 19 logged out. Waiting for processes to exit.
May 16 16:43:34.150774 systemd[1]: Started sshd@19-10.0.0.80:22-10.0.0.1:46142.service - OpenSSH per-connection server daemon (10.0.0.1:46142).
May 16 16:43:34.151518 systemd-logind[1562]: Removed session 19.
May 16 16:43:34.203389 sshd[5366]: Accepted publickey for core from 10.0.0.1 port 46142 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:34.205096 sshd-session[5366]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:34.209895 systemd-logind[1562]: New session 20 of user core.
May 16 16:43:34.218724 systemd[1]: Started session-20.scope - Session 20 of User core.
May 16 16:43:34.505937 sshd[5368]: Connection closed by 10.0.0.1 port 46142
May 16 16:43:34.506660 sshd-session[5366]: pam_unix(sshd:session): session closed for user core
May 16 16:43:34.518678 systemd[1]: sshd@19-10.0.0.80:22-10.0.0.1:46142.service: Deactivated successfully.
May 16 16:43:34.524543 systemd[1]: session-20.scope: Deactivated successfully.
May 16 16:43:34.527621 systemd-logind[1562]: Session 20 logged out. Waiting for processes to exit.
May 16 16:43:34.530766 systemd[1]: Started sshd@20-10.0.0.80:22-10.0.0.1:46152.service - OpenSSH per-connection server daemon (10.0.0.1:46152).
May 16 16:43:34.533536 systemd-logind[1562]: Removed session 20.
May 16 16:43:34.581333 sshd[5380]: Accepted publickey for core from 10.0.0.1 port 46152 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:34.583064 sshd-session[5380]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:34.588407 systemd-logind[1562]: New session 21 of user core.
May 16 16:43:34.594715 systemd[1]: Started session-21.scope - Session 21 of User core.
May 16 16:43:34.760867 sshd[5382]: Connection closed by 10.0.0.1 port 46152
May 16 16:43:34.761612 sshd-session[5380]: pam_unix(sshd:session): session closed for user core
May 16 16:43:34.766204 systemd[1]: sshd@20-10.0.0.80:22-10.0.0.1:46152.service: Deactivated successfully.
May 16 16:43:34.768422 systemd[1]: session-21.scope: Deactivated successfully.
May 16 16:43:34.769744 systemd-logind[1562]: Session 21 logged out. Waiting for processes to exit.
May 16 16:43:34.771377 systemd-logind[1562]: Removed session 21.
May 16 16:43:36.002626 containerd[1577]: time="2025-05-16T16:43:36.002536442Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\""
May 16 16:43:36.303939 containerd[1577]: time="2025-05-16T16:43:36.303787224Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 16 16:43:36.322638 containerd[1577]: time="2025-05-16T16:43:36.322523846Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86"
May 16 16:43:36.322791 containerd[1577]: time="2025-05-16T16:43:36.322643575Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden"
May 16 16:43:36.323039 kubelet[2673]: E0516 16:43:36.322960 2673 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 16 16:43:36.323039 kubelet[2673]: E0516 16:43:36.323034 2673 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0"
May 16 16:43:36.323684 kubelet[2673]: E0516 16:43:36.323165 2673 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:aec8e0cd5a7941d98f9fb853e513af3b,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6lf7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5967895df7-wgjpx_calico-system(85410909-fb8d-4393-9be5-a24a0ba7b5ef): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 16 16:43:36.325504 containerd[1577]: time="2025-05-16T16:43:36.325462937Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\""
May 16 16:43:36.671070 containerd[1577]: time="2025-05-16T16:43:36.670999560Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 16 16:43:36.672158 containerd[1577]: time="2025-05-16T16:43:36.672102850Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden"
May 16 16:43:36.672158 containerd[1577]: time="2025-05-16T16:43:36.672143928Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86"
May 16 16:43:36.672421 kubelet[2673]: E0516 16:43:36.672372 2673 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 16 16:43:36.672488 kubelet[2673]: E0516 16:43:36.672433 2673 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0"
May 16 16:43:36.672623 kubelet[2673]: E0516 16:43:36.672561 2673 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6lf7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5967895df7-wgjpx_calico-system(85410909-fb8d-4393-9be5-a24a0ba7b5ef): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 16 16:43:36.673881 kubelet[2673]: E0516 16:43:36.673812 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5967895df7-wgjpx" podUID="85410909-fb8d-4393-9be5-a24a0ba7b5ef"
May 16 16:43:37.254227 containerd[1577]: time="2025-05-16T16:43:37.254186223Z" level=info msg="TaskExit event in podsandbox handler container_id:\"15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa\" id:\"ea38c08f11c22e4442c9aae6e6a6422aed42b411e07292c826efaed0bf18c846\" pid:5407 exited_at:{seconds:1747413817 nanos:253910666}"
May 16 16:43:39.781476 systemd[1]: Started sshd@21-10.0.0.80:22-10.0.0.1:58748.service - OpenSSH per-connection server daemon (10.0.0.1:58748).
May 16 16:43:39.837693 sshd[5428]: Accepted publickey for core from 10.0.0.1 port 58748 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:39.839013 sshd-session[5428]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:39.844221 systemd-logind[1562]: New session 22 of user core.
May 16 16:43:39.854691 systemd[1]: Started session-22.scope - Session 22 of User core.
May 16 16:43:39.973978 sshd[5430]: Connection closed by 10.0.0.1 port 58748
May 16 16:43:39.974296 sshd-session[5428]: pam_unix(sshd:session): session closed for user core
May 16 16:43:39.977888 systemd[1]: sshd@21-10.0.0.80:22-10.0.0.1:58748.service: Deactivated successfully.
May 16 16:43:39.979805 systemd[1]: session-22.scope: Deactivated successfully.
May 16 16:43:39.980723 systemd-logind[1562]: Session 22 logged out. Waiting for processes to exit.
May 16 16:43:39.982208 systemd-logind[1562]: Removed session 22.
May 16 16:43:41.900338 containerd[1577]: time="2025-05-16T16:43:41.900277655Z" level=info msg="TaskExit event in podsandbox handler container_id:\"15eec7d91c2cbb16c602e5d2039e1f1e0a82dd7e1156255a670e1f2aba3b9afa\" id:\"033f23869648b30a61680a2a76d894a059c8be1101c0e7eaf24a6d54bfc02539\" pid:5455 exited_at:{seconds:1747413821 nanos:900036485}"
May 16 16:43:43.001676 containerd[1577]: time="2025-05-16T16:43:43.001606722Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 16 16:43:43.435434 containerd[1577]: time="2025-05-16T16:43:43.435365639Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 16 16:43:43.436628 containerd[1577]: time="2025-05-16T16:43:43.436549015Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 16 16:43:43.436725 containerd[1577]: time="2025-05-16T16:43:43.436609410Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 16 16:43:43.436833 kubelet[2673]: E0516 16:43:43.436784 2673 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 16 16:43:43.437151 kubelet[2673]: E0516 16:43:43.436841 2673 kuberuntime_image.go:42] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 16 16:43:43.437151 kubelet[2673]: E0516 16:43:43.437011 2673 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q7ztc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-fcfrt_calico-system(3208f219-fe99-4b68-b23c-e5f9b103f8b4): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 16 16:43:43.438263 kubelet[2673]: E0516 16:43:43.438211 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-fcfrt" podUID="3208f219-fe99-4b68-b23c-e5f9b103f8b4"
May 16 16:43:44.987036 systemd[1]: Started sshd@22-10.0.0.80:22-10.0.0.1:58756.service - OpenSSH per-connection server daemon (10.0.0.1:58756).
May 16 16:43:45.000768 kubelet[2673]: E0516 16:43:45.000723 2673 dns.go:153] "Nameserver limits exceeded" err="Nameserver limits were exceeded, some nameservers have been omitted, the applied nameserver line is: 1.1.1.1 1.0.0.1 8.8.8.8"
May 16 16:43:45.044506 sshd[5468]: Accepted publickey for core from 10.0.0.1 port 58756 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:45.046145 sshd-session[5468]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:45.051225 systemd-logind[1562]: New session 23 of user core.
May 16 16:43:45.065852 systemd[1]: Started session-23.scope - Session 23 of User core.
May 16 16:43:45.197143 sshd[5470]: Connection closed by 10.0.0.1 port 58756
May 16 16:43:45.197498 sshd-session[5468]: pam_unix(sshd:session): session closed for user core
May 16 16:43:45.201677 systemd[1]: sshd@22-10.0.0.80:22-10.0.0.1:58756.service: Deactivated successfully.
May 16 16:43:45.203880 systemd[1]: session-23.scope: Deactivated successfully.
May 16 16:43:45.204698 systemd-logind[1562]: Session 23 logged out. Waiting for processes to exit.
May 16 16:43:45.206069 systemd-logind[1562]: Removed session 23.
May 16 16:43:50.218166 systemd[1]: Started sshd@23-10.0.0.80:22-10.0.0.1:40210.service - OpenSSH per-connection server daemon (10.0.0.1:40210).
May 16 16:43:50.286456 sshd[5485]: Accepted publickey for core from 10.0.0.1 port 40210 ssh2: RSA SHA256:xtDF+SM00BVA4NOIUT0zDz1Cb4IyRmiUgC3yMm9bHIM
May 16 16:43:50.288639 sshd-session[5485]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 16 16:43:50.293685 systemd-logind[1562]: New session 24 of user core.
May 16 16:43:50.307786 systemd[1]: Started session-24.scope - Session 24 of User core.
May 16 16:43:50.428737 sshd[5487]: Connection closed by 10.0.0.1 port 40210 May 16 16:43:50.429326 sshd-session[5485]: pam_unix(sshd:session): session closed for user core May 16 16:43:50.434242 systemd[1]: sshd@23-10.0.0.80:22-10.0.0.1:40210.service: Deactivated successfully. May 16 16:43:50.436262 systemd[1]: session-24.scope: Deactivated successfully. May 16 16:43:50.437896 systemd-logind[1562]: Session 24 logged out. Waiting for processes to exit. May 16 16:43:50.439404 systemd-logind[1562]: Removed session 24. May 16 16:43:51.002133 kubelet[2673]: E0516 16:43:51.002075 2673 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-5967895df7-wgjpx" podUID="85410909-fb8d-4393-9be5-a24a0ba7b5ef"