May 27 03:21:36.921103 kernel: Linux version 6.12.30-flatcar (build@pony-truck.infra.kinvolk.io) (x86_64-cros-linux-gnu-gcc (Gentoo Hardened 14.2.1_p20241221 p7) 14.2.1 20241221, GNU ld (Gentoo 2.44 p1) 2.44.0) #1 SMP PREEMPT_DYNAMIC Tue May 27 01:09:43 -00 2025
May 27 03:21:36.921126 kernel: Command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=f6c186658a19d5a08471ef76df75f82494b37b46908f9237b2c3cf497da860c6
May 27 03:21:36.921135 kernel: BIOS-provided physical RAM map:
May 27 03:21:36.921141 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009ffff] usable
May 27 03:21:36.921147 kernel: BIOS-e820: [mem 0x00000000000c0000-0x00000000000fffff] reserved
May 27 03:21:36.921152 kernel: BIOS-e820: [mem 0x0000000000100000-0x000000003ff40fff] usable
May 27 03:21:36.921161 kernel: BIOS-e820: [mem 0x000000003ff41000-0x000000003ffc4fff] reserved
May 27 03:21:36.921166 kernel: BIOS-e820: [mem 0x000000003ffc5000-0x000000003ffd0fff] usable
May 27 03:21:36.921172 kernel: BIOS-e820: [mem 0x000000003ffd1000-0x000000003fffafff] ACPI data
May 27 03:21:36.921178 kernel: BIOS-e820: [mem 0x000000003fffb000-0x000000003fffefff] ACPI NVS
May 27 03:21:36.921184 kernel: BIOS-e820: [mem 0x000000003ffff000-0x000000003fffffff] usable
May 27 03:21:36.921189 kernel: BIOS-e820: [mem 0x0000000100000000-0x00000002bfffffff] usable
May 27 03:21:36.921195 kernel: printk: legacy bootconsole [earlyser0] enabled
May 27 03:21:36.921201 kernel: NX (Execute Disable) protection: active
May 27 03:21:36.921210 kernel: APIC: Static calls initialized
May 27 03:21:36.921216 kernel: efi: EFI v2.7 by Microsoft
May 27 03:21:36.921222 kernel: efi: ACPI=0x3fffa000 ACPI 2.0=0x3fffa014 SMBIOS=0x3ff85000 SMBIOS 3.0=0x3ff83000 MEMATTR=0x3ebb9a98 RNG=0x3ffd2018
May 27 03:21:36.921229 kernel: random: crng init done
May 27 03:21:36.921235 kernel: secureboot: Secure boot disabled
May 27 03:21:36.921241 kernel: SMBIOS 3.1.0 present.
May 27 03:21:36.921247 kernel: DMI: Microsoft Corporation Virtual Machine/Virtual Machine, BIOS Hyper-V UEFI Release v4.1 11/21/2024
May 27 03:21:36.921253 kernel: DMI: Memory slots populated: 2/2
May 27 03:21:36.921260 kernel: Hypervisor detected: Microsoft Hyper-V
May 27 03:21:36.921267 kernel: Hyper-V: privilege flags low 0xae7f, high 0x3b8030, hints 0x9e4e24, misc 0xe0bed7b2
May 27 03:21:36.921273 kernel: Hyper-V: Nested features: 0x3e0101
May 27 03:21:36.921279 kernel: Hyper-V: LAPIC Timer Frequency: 0x30d40
May 27 03:21:36.921285 kernel: Hyper-V: Using hypercall for remote TLB flush
May 27 03:21:36.921291 kernel: clocksource: hyperv_clocksource_tsc_page: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
May 27 03:21:36.921298 kernel: clocksource: hyperv_clocksource_msr: mask: 0xffffffffffffffff max_cycles: 0x24e6a1710, max_idle_ns: 440795202120 ns
May 27 03:21:36.921304 kernel: tsc: Detected 2300.000 MHz processor
May 27 03:21:36.921310 kernel: e820: update [mem 0x00000000-0x00000fff] usable ==> reserved
May 27 03:21:36.921317 kernel: e820: remove [mem 0x000a0000-0x000fffff] usable
May 27 03:21:36.921324 kernel: last_pfn = 0x2c0000 max_arch_pfn = 0x10000000000
May 27 03:21:36.921332 kernel: MTRR map: 4 entries (2 fixed + 2 variable; max 18), built from 8 variable MTRRs
May 27 03:21:36.921338 kernel: x86/PAT: Configuration [0-7]: WB WC UC- UC WB WP UC- WT
May 27 03:21:36.921345 kernel: e820: update [mem 0x48000000-0xffffffff] usable ==> reserved
May 27 03:21:36.921351 kernel: last_pfn = 0x40000 max_arch_pfn = 0x10000000000
May 27 03:21:36.921358 kernel: Using GB pages for direct mapping
May 27 03:21:36.921364 kernel: ACPI: Early table checksum verification disabled
May 27 03:21:36.921371 kernel: ACPI: RSDP 0x000000003FFFA014 000024 (v02 VRTUAL)
May 27 03:21:36.921380 kernel: ACPI: XSDT 0x000000003FFF90E8 00005C (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 03:21:36.921388 kernel: ACPI: FACP 0x000000003FFF8000 000114 (v06 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 03:21:36.921394 kernel: ACPI: DSDT 0x000000003FFD6000 01E11C (v02 MSFTVM DSDT01 00000001 INTL 20230628)
May 27 03:21:36.921400 kernel: ACPI: FACS 0x000000003FFFE000 000040
May 27 03:21:36.921407 kernel: ACPI: OEM0 0x000000003FFF7000 000064 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 03:21:36.921414 kernel: ACPI: SPCR 0x000000003FFF6000 000050 (v02 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 03:21:36.921422 kernel: ACPI: WAET 0x000000003FFF5000 000028 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 03:21:36.921429 kernel: ACPI: APIC 0x000000003FFD5000 000052 (v05 HVLITE HVLITETB 00000000 MSHV 00000000)
May 27 03:21:36.921436 kernel: ACPI: SRAT 0x000000003FFD4000 0000A0 (v03 HVLITE HVLITETB 00000000 MSHV 00000000)
May 27 03:21:36.921443 kernel: ACPI: BGRT 0x000000003FFD3000 000038 (v01 VRTUAL MICROSFT 00000001 MSFT 00000001)
May 27 03:21:36.921449 kernel: ACPI: Reserving FACP table memory at [mem 0x3fff8000-0x3fff8113]
May 27 03:21:36.921456 kernel: ACPI: Reserving DSDT table memory at [mem 0x3ffd6000-0x3fff411b]
May 27 03:21:36.921463 kernel: ACPI: Reserving FACS table memory at [mem 0x3fffe000-0x3fffe03f]
May 27 03:21:36.921469 kernel: ACPI: Reserving OEM0 table memory at [mem 0x3fff7000-0x3fff7063]
May 27 03:21:36.921476 kernel: ACPI: Reserving SPCR table memory at [mem 0x3fff6000-0x3fff604f]
May 27 03:21:36.921484 kernel: ACPI: Reserving WAET table memory at [mem 0x3fff5000-0x3fff5027]
May 27 03:21:36.921491 kernel: ACPI: Reserving APIC table memory at [mem 0x3ffd5000-0x3ffd5051]
May 27 03:21:36.921497 kernel: ACPI: Reserving SRAT table memory at [mem 0x3ffd4000-0x3ffd409f]
May 27 03:21:36.921504 kernel: ACPI: Reserving BGRT table memory at [mem 0x3ffd3000-0x3ffd3037]
May 27 03:21:36.921511 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x00000000-0x3fffffff]
May 27 03:21:36.921518 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x100000000-0x2bfffffff]
May 27 03:21:36.921525 kernel: NUMA: Node 0 [mem 0x00001000-0x3fffffff] + [mem 0x100000000-0x2bfffffff] -> [mem 0x00001000-0x2bfffffff]
May 27 03:21:36.921531 kernel: NODE_DATA(0) allocated [mem 0x2bfff8dc0-0x2bfffffff]
May 27 03:21:36.921538 kernel: Zone ranges:
May 27 03:21:36.921546 kernel: DMA [mem 0x0000000000001000-0x0000000000ffffff]
May 27 03:21:36.921552 kernel: DMA32 [mem 0x0000000001000000-0x00000000ffffffff]
May 27 03:21:36.921558 kernel: Normal [mem 0x0000000100000000-0x00000002bfffffff]
May 27 03:21:36.921565 kernel: Device empty
May 27 03:21:36.921585 kernel: Movable zone start for each node
May 27 03:21:36.921593 kernel: Early memory node ranges
May 27 03:21:36.921599 kernel: node 0: [mem 0x0000000000001000-0x000000000009ffff]
May 27 03:21:36.921606 kernel: node 0: [mem 0x0000000000100000-0x000000003ff40fff]
May 27 03:21:36.921613 kernel: node 0: [mem 0x000000003ffc5000-0x000000003ffd0fff]
May 27 03:21:36.921621 kernel: node 0: [mem 0x000000003ffff000-0x000000003fffffff]
May 27 03:21:36.921632 kernel: node 0: [mem 0x0000000100000000-0x00000002bfffffff]
May 27 03:21:36.921639 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x00000002bfffffff]
May 27 03:21:36.921646 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
May 27 03:21:36.921653 kernel: On node 0, zone DMA: 96 pages in unavailable ranges
May 27 03:21:36.921659 kernel: On node 0, zone DMA32: 132 pages in unavailable ranges
May 27 03:21:36.921666 kernel: On node 0, zone DMA32: 46 pages in unavailable ranges
May 27 03:21:36.921673 kernel: ACPI: PM-Timer IO Port: 0x408
May 27 03:21:36.921680 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
May 27 03:21:36.921687 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
May 27 03:21:36.921693 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
May 27 03:21:36.921699 kernel: ACPI: SPCR: console: uart,io,0x3f8,115200
May 27 03:21:36.921706 kernel: TSC deadline timer available
May 27 03:21:36.921712 kernel: CPU topo: Max. logical packages: 1
May 27 03:21:36.921718 kernel: CPU topo: Max. logical dies: 1
May 27 03:21:36.921724 kernel: CPU topo: Max. dies per package: 1
May 27 03:21:36.921730 kernel: CPU topo: Max. threads per core: 2
May 27 03:21:36.921736 kernel: CPU topo: Num. cores per package: 1
May 27 03:21:36.921744 kernel: CPU topo: Num. threads per package: 2
May 27 03:21:36.921751 kernel: CPU topo: Allowing 2 present CPUs plus 0 hotplug CPUs
May 27 03:21:36.921758 kernel: [mem 0x40000000-0xffffffff] available for PCI devices
May 27 03:21:36.921765 kernel: Booting paravirtualized kernel on Hyper-V
May 27 03:21:36.921772 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
May 27 03:21:36.921779 kernel: setup_percpu: NR_CPUS:512 nr_cpumask_bits:2 nr_cpu_ids:2 nr_node_ids:1
May 27 03:21:36.921786 kernel: percpu: Embedded 60 pages/cpu s207832 r8192 d29736 u1048576
May 27 03:21:36.921793 kernel: pcpu-alloc: s207832 r8192 d29736 u1048576 alloc=1*2097152
May 27 03:21:36.921799 kernel: pcpu-alloc: [0] 0 1
May 27 03:21:36.921808 kernel: Hyper-V: PV spinlocks enabled
May 27 03:21:36.921814 kernel: PV qspinlock hash table entries: 256 (order: 0, 4096 bytes, linear)
May 27 03:21:36.921822 kernel: Kernel command line: rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=f6c186658a19d5a08471ef76df75f82494b37b46908f9237b2c3cf497da860c6
May 27 03:21:36.921830 kernel: Unknown kernel command line parameters "BOOT_IMAGE=/flatcar/vmlinuz-a", will be passed to user space.
May 27 03:21:36.921836 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
May 27 03:21:36.921843 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
May 27 03:21:36.921850 kernel: Fallback order for Node 0: 0
May 27 03:21:36.921857 kernel: Built 1 zonelists, mobility grouping on. Total pages: 2096877
May 27 03:21:36.921865 kernel: Policy zone: Normal
May 27 03:21:36.921872 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
May 27 03:21:36.921879 kernel: software IO TLB: area num 2.
May 27 03:21:36.921886 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1
May 27 03:21:36.921892 kernel: ftrace: allocating 40081 entries in 157 pages
May 27 03:21:36.921898 kernel: ftrace: allocated 157 pages with 5 groups
May 27 03:21:36.921904 kernel: Dynamic Preempt: voluntary
May 27 03:21:36.921911 kernel: rcu: Preemptible hierarchical RCU implementation.
May 27 03:21:36.921918 kernel: rcu: RCU event tracing is enabled.
May 27 03:21:36.921927 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2.
May 27 03:21:36.921939 kernel: Trampoline variant of Tasks RCU enabled.
May 27 03:21:36.921946 kernel: Rude variant of Tasks RCU enabled.
May 27 03:21:36.921955 kernel: Tracing variant of Tasks RCU enabled.
May 27 03:21:36.921962 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
May 27 03:21:36.921970 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2
May 27 03:21:36.921978 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 27 03:21:36.921986 kernel: RCU Tasks Rude: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 27 03:21:36.921994 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2.
May 27 03:21:36.922001 kernel: Using NULL legacy PIC
May 27 03:21:36.922009 kernel: NR_IRQS: 33024, nr_irqs: 440, preallocated irqs: 0
May 27 03:21:36.922018 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
May 27 03:21:36.922026 kernel: Console: colour dummy device 80x25
May 27 03:21:36.922034 kernel: printk: legacy console [tty1] enabled
May 27 03:21:36.922042 kernel: printk: legacy console [ttyS0] enabled
May 27 03:21:36.922049 kernel: printk: legacy bootconsole [earlyser0] disabled
May 27 03:21:36.922056 kernel: ACPI: Core revision 20240827
May 27 03:21:36.922064 kernel: Failed to register legacy timer interrupt
May 27 03:21:36.922071 kernel: APIC: Switch to symmetric I/O mode setup
May 27 03:21:36.922079 kernel: x2apic enabled
May 27 03:21:36.922086 kernel: APIC: Switched APIC routing to: physical x2apic
May 27 03:21:36.922093 kernel: Hyper-V: Host Build 10.0.26100.1221-1-0
May 27 03:21:36.922100 kernel: Hyper-V: enabling crash_kexec_post_notifiers
May 27 03:21:36.922108 kernel: Hyper-V: Disabling IBT because of Hyper-V bug
May 27 03:21:36.922116 kernel: Hyper-V: Using IPI hypercalls
May 27 03:21:36.922124 kernel: APIC: send_IPI() replaced with hv_send_ipi()
May 27 03:21:36.922133 kernel: APIC: send_IPI_mask() replaced with hv_send_ipi_mask()
May 27 03:21:36.922140 kernel: APIC: send_IPI_mask_allbutself() replaced with hv_send_ipi_mask_allbutself()
May 27 03:21:36.922148 kernel: APIC: send_IPI_allbutself() replaced with hv_send_ipi_allbutself()
May 27 03:21:36.922156 kernel: APIC: send_IPI_all() replaced with hv_send_ipi_all()
May 27 03:21:36.922163 kernel: APIC: send_IPI_self() replaced with hv_send_ipi_self()
May 27 03:21:36.922171 kernel: clocksource: tsc-early: mask: 0xffffffffffffffff max_cycles: 0x212735223b2, max_idle_ns: 440795277976 ns
May 27 03:21:36.922178 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 4600.00 BogoMIPS (lpj=2300000)
May 27 03:21:36.922186 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
May 27 03:21:36.922193 kernel: Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0
May 27 03:21:36.922201 kernel: Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0, 1GB 0
May 27 03:21:36.922209 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
May 27 03:21:36.922216 kernel: Spectre V2 : Mitigation: Retpolines
May 27 03:21:36.922223 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
May 27 03:21:36.922231 kernel: RETBleed: WARNING: Spectre v2 mitigation leaves CPU vulnerable to RETBleed attacks, data leaks possible!
May 27 03:21:36.922238 kernel: RETBleed: Vulnerable
May 27 03:21:36.922246 kernel: Speculative Store Bypass: Vulnerable
May 27 03:21:36.922253 kernel: ITS: Mitigation: Aligned branch/return thunks
May 27 03:21:36.922260 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
May 27 03:21:36.922268 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
May 27 03:21:36.922275 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
May 27 03:21:36.922284 kernel: x86/fpu: Supporting XSAVE feature 0x020: 'AVX-512 opmask'
May 27 03:21:36.922291 kernel: x86/fpu: Supporting XSAVE feature 0x040: 'AVX-512 Hi256'
May 27 03:21:36.922299 kernel: x86/fpu: Supporting XSAVE feature 0x080: 'AVX-512 ZMM_Hi256'
May 27 03:21:36.922306 kernel: x86/fpu: Supporting XSAVE feature 0x800: 'Control-flow User registers'
May 27 03:21:36.922314 kernel: x86/fpu: Supporting XSAVE feature 0x20000: 'AMX Tile config'
May 27 03:21:36.922322 kernel: x86/fpu: Supporting XSAVE feature 0x40000: 'AMX Tile data'
May 27 03:21:36.922329 kernel: x86/fpu: xstate_offset[2]: 576, xstate_sizes[2]: 256
May 27 03:21:36.922336 kernel: x86/fpu: xstate_offset[5]: 832, xstate_sizes[5]: 64
May 27 03:21:36.922344 kernel: x86/fpu: xstate_offset[6]: 896, xstate_sizes[6]: 512
May 27 03:21:36.922351 kernel: x86/fpu: xstate_offset[7]: 1408, xstate_sizes[7]: 1024
May 27 03:21:36.922360 kernel: x86/fpu: xstate_offset[11]: 2432, xstate_sizes[11]: 16
May 27 03:21:36.922368 kernel: x86/fpu: xstate_offset[17]: 2496, xstate_sizes[17]: 64
May 27 03:21:36.922375 kernel: x86/fpu: xstate_offset[18]: 2560, xstate_sizes[18]: 8192
May 27 03:21:36.922383 kernel: x86/fpu: Enabled xstate features 0x608e7, context size is 10752 bytes, using 'compacted' format.
May 27 03:21:36.922390 kernel: Freeing SMP alternatives memory: 32K
May 27 03:21:36.922398 kernel: pid_max: default: 32768 minimum: 301
May 27 03:21:36.922405 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
May 27 03:21:36.922412 kernel: landlock: Up and running.
May 27 03:21:36.922420 kernel: SELinux: Initializing.
May 27 03:21:36.922427 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
May 27 03:21:36.922435 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
May 27 03:21:36.922443 kernel: smpboot: CPU0: Intel INTEL(R) XEON(R) PLATINUM 8573C (family: 0x6, model: 0xcf, stepping: 0x2)
May 27 03:21:36.922452 kernel: Performance Events: unsupported p6 CPU model 207 no PMU driver, software events only.
May 27 03:21:36.922459 kernel: signal: max sigframe size: 11952
May 27 03:21:36.922467 kernel: rcu: Hierarchical SRCU implementation.
May 27 03:21:36.922474 kernel: rcu: Max phase no-delay instances is 400.
May 27 03:21:36.922481 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
May 27 03:21:36.922489 kernel: NMI watchdog: Perf NMI watchdog permanently disabled
May 27 03:21:36.922497 kernel: smp: Bringing up secondary CPUs ...
May 27 03:21:36.922504 kernel: smpboot: x86: Booting SMP configuration:
May 27 03:21:36.922512 kernel: .... node #0, CPUs: #1
May 27 03:21:36.922521 kernel: smp: Brought up 1 node, 2 CPUs
May 27 03:21:36.922529 kernel: smpboot: Total of 2 processors activated (9200.00 BogoMIPS)
May 27 03:21:36.922538 kernel: Memory: 8082312K/8387508K available (14336K kernel code, 2430K rwdata, 9952K rodata, 54416K init, 2552K bss, 299988K reserved, 0K cma-reserved)
May 27 03:21:36.922546 kernel: devtmpfs: initialized
May 27 03:21:36.922554 kernel: x86/mm: Memory block size: 128MB
May 27 03:21:36.922562 kernel: ACPI: PM: Registering ACPI NVS region [mem 0x3fffb000-0x3fffefff] (16384 bytes)
May 27 03:21:36.922570 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
May 27 03:21:36.923620 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear)
May 27 03:21:36.923631 kernel: pinctrl core: initialized pinctrl subsystem
May 27 03:21:36.923641 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
May 27 03:21:36.923648 kernel: audit: initializing netlink subsys (disabled)
May 27 03:21:36.923655 kernel: audit: type=2000 audit(1748316094.031:1): state=initialized audit_enabled=0 res=1
May 27 03:21:36.923663 kernel: thermal_sys: Registered thermal governor 'step_wise'
May 27 03:21:36.923671 kernel: thermal_sys: Registered thermal governor 'user_space'
May 27 03:21:36.923678 kernel: cpuidle: using governor menu
May 27 03:21:36.923685 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
May 27 03:21:36.923693 kernel: dca service started, version 1.12.1
May 27 03:21:36.923701 kernel: e820: reserve RAM buffer [mem 0x3ff41000-0x3fffffff]
May 27 03:21:36.923710 kernel: e820: reserve RAM buffer [mem 0x3ffd1000-0x3fffffff]
May 27 03:21:36.923717 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
May 27 03:21:36.923724 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
May 27 03:21:36.923732 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
May 27 03:21:36.923739 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
May 27 03:21:36.923747 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
May 27 03:21:36.923754 kernel: ACPI: Added _OSI(Module Device)
May 27 03:21:36.923761 kernel: ACPI: Added _OSI(Processor Device)
May 27 03:21:36.923769 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
May 27 03:21:36.923777 kernel: ACPI: Added _OSI(Processor Aggregator Device)
May 27 03:21:36.923784 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
May 27 03:21:36.923791 kernel: ACPI: Interpreter enabled
May 27 03:21:36.923799 kernel: ACPI: PM: (supports S0 S5)
May 27 03:21:36.923806 kernel: ACPI: Using IOAPIC for interrupt routing
May 27 03:21:36.923813 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
May 27 03:21:36.923821 kernel: PCI: Ignoring E820 reservations for host bridge windows
May 27 03:21:36.923828 kernel: ACPI: Enabled 1 GPEs in block 00 to 0F
May 27 03:21:36.923835 kernel: iommu: Default domain type: Translated
May 27 03:21:36.923844 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
May 27 03:21:36.923851 kernel: efivars: Registered efivars operations
May 27 03:21:36.923858 kernel: PCI: Using ACPI for IRQ routing
May 27 03:21:36.923865 kernel: PCI: System does not support PCI
May 27 03:21:36.923873 kernel: vgaarb: loaded
May 27 03:21:36.923880 kernel: clocksource: Switched to clocksource tsc-early
May 27 03:21:36.923887 kernel: VFS: Disk quotas dquot_6.6.0
May 27 03:21:36.923895 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
May 27 03:21:36.923902 kernel: pnp: PnP ACPI init
May 27 03:21:36.923911 kernel: pnp: PnP ACPI: found 3 devices
May 27 03:21:36.923918 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
May 27 03:21:36.923925 kernel: NET: Registered PF_INET protocol family
May 27 03:21:36.923933 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
May 27 03:21:36.923940 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
May 27 03:21:36.923948 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
May 27 03:21:36.923955 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
May 27 03:21:36.923963 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear)
May 27 03:21:36.923970 kernel: TCP: Hash tables configured (established 65536 bind 65536)
May 27 03:21:36.923978 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
May 27 03:21:36.923986 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
May 27 03:21:36.923993 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
May 27 03:21:36.924001 kernel: NET: Registered PF_XDP protocol family
May 27 03:21:36.924008 kernel: PCI: CLS 0 bytes, default 64
May 27 03:21:36.924015 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
May 27 03:21:36.924023 kernel: software IO TLB: mapped [mem 0x000000003aa59000-0x000000003ea59000] (64MB)
May 27 03:21:36.924030 kernel: RAPL PMU: API unit is 2^-32 Joules, 1 fixed counters, 10737418240 ms ovfl timer
May 27 03:21:36.924037 kernel: RAPL PMU: hw unit of domain psys 2^-0 Joules
May 27 03:21:36.924046 kernel: clocksource: tsc: mask: 0xffffffffffffffff max_cycles: 0x212735223b2, max_idle_ns: 440795277976 ns
May 27 03:21:36.924053 kernel: clocksource: Switched to clocksource tsc
May 27 03:21:36.924060 kernel: Initialise system trusted keyrings
May 27 03:21:36.924068 kernel: workingset: timestamp_bits=39 max_order=21 bucket_order=0
May 27 03:21:36.924075 kernel: Key type asymmetric registered
May 27 03:21:36.924082 kernel: Asymmetric key parser 'x509' registered
May 27 03:21:36.924090 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 250)
May 27 03:21:36.924097 kernel: io scheduler mq-deadline registered
May 27 03:21:36.924105 kernel: io scheduler kyber registered
May 27 03:21:36.924113 kernel: io scheduler bfq registered
May 27 03:21:36.924120 kernel: ioatdma: Intel(R) QuickData Technology Driver 5.00
May 27 03:21:36.924127 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
May 27 03:21:36.924135 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
May 27 03:21:36.924142 kernel: 00:01: ttyS1 at I/O 0x2f8 (irq = 3, base_baud = 115200) is a 16550A
May 27 03:21:36.924150 kernel: serial8250: ttyS2 at I/O 0x3e8 (irq = 4, base_baud = 115200) is a 16550A
May 27 03:21:36.924157 kernel: i8042: PNP: No PS/2 controller found.
May 27 03:21:36.924273 kernel: rtc_cmos 00:02: registered as rtc0
May 27 03:21:36.924339 kernel: rtc_cmos 00:02: setting system clock to 2025-05-27T03:21:36 UTC (1748316096)
May 27 03:21:36.924412 kernel: rtc_cmos 00:02: alarms up to one month, 114 bytes nvram
May 27 03:21:36.924427 kernel: intel_pstate: Intel P-state driver initializing
May 27 03:21:36.924437 kernel: efifb: probing for efifb
May 27 03:21:36.924445 kernel: efifb: framebuffer at 0x40000000, using 3072k, total 3072k
May 27 03:21:36.924452 kernel: efifb: mode is 1024x768x32, linelength=4096, pages=1
May 27 03:21:36.924460 kernel: efifb: scrolling: redraw
May 27 03:21:36.924468 kernel: efifb: Truecolor: size=8:8:8:8, shift=24:16:8:0
May 27 03:21:36.924479 kernel: Console: switching to colour frame buffer device 128x48
May 27 03:21:36.924488 kernel: fb0: EFI VGA frame buffer device
May 27 03:21:36.924495 kernel: pstore: Using crash dump compression: deflate
May 27 03:21:36.924504 kernel: pstore: Registered efi_pstore as persistent store backend
May 27 03:21:36.924514 kernel: NET: Registered PF_INET6 protocol family
May 27 03:21:36.924522 kernel: Segment Routing with IPv6
May 27 03:21:36.924532 kernel: In-situ OAM (IOAM) with IPv6
May 27 03:21:36.924541 kernel: NET: Registered PF_PACKET protocol family
May 27 03:21:36.924550 kernel: Key type dns_resolver registered
May 27 03:21:36.924561 kernel: IPI shorthand broadcast: enabled
May 27 03:21:36.924569 kernel: sched_clock: Marking stable (2680003407, 82557920)->(3037829067, -275267740)
May 27 03:21:36.924593 kernel: registered taskstats version 1
May 27 03:21:36.924602 kernel: Loading compiled-in X.509 certificates
May 27 03:21:36.924610 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.30-flatcar: ba9eddccb334a70147f3ddfe4fbde029feaa991d'
May 27 03:21:36.924617 kernel: Demotion targets for Node 0: null
May 27 03:21:36.924624 kernel: Key type .fscrypt registered
May 27 03:21:36.924631 kernel: Key type fscrypt-provisioning registered
May 27 03:21:36.924638 kernel: ima: No TPM chip found, activating TPM-bypass!
May 27 03:21:36.924649 kernel: ima: Allocated hash algorithm: sha1
May 27 03:21:36.924657 kernel: ima: No architecture policies found
May 27 03:21:36.924665 kernel: clk: Disabling unused clocks
May 27 03:21:36.924674 kernel: Warning: unable to open an initial console.
May 27 03:21:36.924682 kernel: Freeing unused kernel image (initmem) memory: 54416K
May 27 03:21:36.924690 kernel: Write protecting the kernel read-only data: 24576k
May 27 03:21:36.924699 kernel: Freeing unused kernel image (rodata/data gap) memory: 288K
May 27 03:21:36.924706 kernel: Run /init as init process
May 27 03:21:36.924714 kernel: with arguments:
May 27 03:21:36.924727 kernel: /init
May 27 03:21:36.924734 kernel: with environment:
May 27 03:21:36.924742 kernel: HOME=/
May 27 03:21:36.924749 kernel: TERM=linux
May 27 03:21:36.924757 kernel: BOOT_IMAGE=/flatcar/vmlinuz-a
May 27 03:21:36.924767 systemd[1]: Successfully made /usr/ read-only.
May 27 03:21:36.924781 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 03:21:36.924791 systemd[1]: Detected virtualization microsoft.
May 27 03:21:36.924824 systemd[1]: Detected architecture x86-64.
May 27 03:21:36.924833 systemd[1]: Running in initrd.
May 27 03:21:36.924842 systemd[1]: No hostname configured, using default hostname.
May 27 03:21:36.924851 systemd[1]: Hostname set to .
May 27 03:21:36.924858 systemd[1]: Initializing machine ID from random generator.
May 27 03:21:36.924866 systemd[1]: Queued start job for default target initrd.target.
May 27 03:21:36.924874 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 03:21:36.924882 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 03:21:36.924893 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
May 27 03:21:36.924902 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 03:21:36.924910 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
May 27 03:21:36.924919 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
May 27 03:21:36.924928 systemd[1]: Expecting device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - /dev/disk/by-partuuid/7130c94a-213a-4e5a-8e26-6cce9662f132...
May 27 03:21:36.924936 systemd[1]: Expecting device dev-mapper-usr.device - /dev/mapper/usr...
May 27 03:21:36.924946 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 03:21:36.924954 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 03:21:36.924963 systemd[1]: Reached target paths.target - Path Units.
May 27 03:21:36.924971 systemd[1]: Reached target slices.target - Slice Units.
May 27 03:21:36.924979 systemd[1]: Reached target swap.target - Swaps.
May 27 03:21:36.924987 systemd[1]: Reached target timers.target - Timer Units.
May 27 03:21:36.924995 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
May 27 03:21:36.925003 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 03:21:36.925011 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
May 27 03:21:36.925021 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
May 27 03:21:36.925028 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 03:21:36.925036 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 03:21:36.925044 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 03:21:36.925052 systemd[1]: Reached target sockets.target - Socket Units.
May 27 03:21:36.925061 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
May 27 03:21:36.925069 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 03:21:36.925078 systemd[1]: Finished network-cleanup.service - Network Cleanup.
May 27 03:21:36.925086 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
May 27 03:21:36.925096 systemd[1]: Starting systemd-fsck-usr.service...
May 27 03:21:36.925105 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 03:21:36.925113 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 03:21:36.925130 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 03:21:36.925157 systemd-journald[205]: Collecting audit messages is disabled.
May 27 03:21:36.925181 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
May 27 03:21:36.925192 systemd-journald[205]: Journal started
May 27 03:21:36.925213 systemd-journald[205]: Runtime Journal (/run/log/journal/96ffff2814424f27afc27d70fdcb1f0a) is 8M, max 159M, 151M free.
May 27 03:21:36.931462 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 03:21:36.932603 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 03:21:36.937356 systemd[1]: Finished systemd-fsck-usr.service.
May 27 03:21:36.941008 systemd-modules-load[207]: Inserted module 'overlay'
May 27 03:21:36.944833 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
May 27 03:21:36.951666 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 03:21:36.966068 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 03:21:36.975300 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
May 27 03:21:36.972802 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
May 27 03:21:36.973843 systemd-tmpfiles[218]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
May 27 03:21:36.983556 kernel: Bridge firewalling registered
May 27 03:21:36.978505 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
May 27 03:21:36.981692 systemd-modules-load[207]: Inserted module 'br_netfilter' May 27 03:21:36.986667 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... May 27 03:21:36.991383 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. May 27 03:21:37.003672 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. May 27 03:21:37.004850 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... May 27 03:21:37.016272 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. May 27 03:21:37.021038 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. May 27 03:21:37.022978 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. May 27 03:21:37.025682 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... May 27 03:21:37.026617 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... May 27 03:21:37.043291 dracut-cmdline[243]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 rootflags=rw mount.usrflags=ro BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=tty1 console=ttyS0,115200n8 earlyprintk=ttyS0,115200 flatcar.first_boot=detected flatcar.oem.id=azure flatcar.autologin verity.usrhash=f6c186658a19d5a08471ef76df75f82494b37b46908f9237b2c3cf497da860c6 May 27 03:21:37.073906 systemd-resolved[244]: Positive Trust Anchors: May 27 03:21:37.075208 systemd-resolved[244]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d May 27 03:21:37.075290 systemd-resolved[244]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test May 27 03:21:37.090371 systemd-resolved[244]: Defaulting to hostname 'linux'. May 27 03:21:37.092750 systemd[1]: Started systemd-resolved.service - Network Name Resolution. May 27 03:21:37.096695 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. May 27 03:21:37.108590 kernel: SCSI subsystem initialized May 27 03:21:37.114592 kernel: Loading iSCSI transport class v2.0-870. May 27 03:21:37.122589 kernel: iscsi: registered transport (tcp) May 27 03:21:37.137877 kernel: iscsi: registered transport (qla4xxx) May 27 03:21:37.137911 kernel: QLogic iSCSI HBA Driver May 27 03:21:37.148519 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... May 27 03:21:37.158182 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. May 27 03:21:37.158659 systemd[1]: Reached target network-pre.target - Preparation for Network. May 27 03:21:37.186243 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. May 27 03:21:37.189375 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
May 27 03:21:37.227593 kernel: raid6: avx512x4 gen() 46442 MB/s May 27 03:21:37.244586 kernel: raid6: avx512x2 gen() 45408 MB/s May 27 03:21:37.261584 kernel: raid6: avx512x1 gen() 29963 MB/s May 27 03:21:37.279584 kernel: raid6: avx2x4 gen() 41687 MB/s May 27 03:21:37.296583 kernel: raid6: avx2x2 gen() 43672 MB/s May 27 03:21:37.314120 kernel: raid6: avx2x1 gen() 31552 MB/s May 27 03:21:37.314200 kernel: raid6: using algorithm avx512x4 gen() 46442 MB/s May 27 03:21:37.332891 kernel: raid6: .... xor() 8373 MB/s, rmw enabled May 27 03:21:37.332914 kernel: raid6: using avx512x2 recovery algorithm May 27 03:21:37.348589 kernel: xor: automatically using best checksumming function avx May 27 03:21:37.449593 kernel: Btrfs loaded, zoned=no, fsverity=no May 27 03:21:37.452929 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. May 27 03:21:37.457417 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... May 27 03:21:37.476924 systemd-udevd[454]: Using default interface naming scheme 'v255'. May 27 03:21:37.480832 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. May 27 03:21:37.483669 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... May 27 03:21:37.498091 dracut-pre-trigger[456]: rd.md=0: removing MD RAID activation May 27 03:21:37.513479 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. May 27 03:21:37.518550 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... May 27 03:21:37.547758 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. May 27 03:21:37.550675 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... 
May 27 03:21:37.589591 kernel: cryptd: max_cpu_qlen set to 1000 May 27 03:21:37.597594 kernel: AES CTR mode by8 optimization enabled May 27 03:21:37.628588 kernel: hv_vmbus: Vmbus version:5.3 May 27 03:21:37.636406 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 03:21:37.638135 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:21:37.646543 kernel: pps_core: LinuxPPS API ver. 1 registered May 27 03:21:37.646560 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti May 27 03:21:37.643491 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... May 27 03:21:37.649808 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... May 27 03:21:37.659584 kernel: hv_vmbus: registering driver hv_storvsc May 27 03:21:37.660593 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. May 27 03:21:37.666460 kernel: hv_vmbus: registering driver hyperv_keyboard May 27 03:21:37.666511 kernel: input: AT Translated Set 2 keyboard as /devices/LNXSYSTM:00/LNXSYBUS:00/ACPI0004:00/MSFT1000:00/d34b2567-b9b6-42b9-8778-0a4ec0b955bf/serio0/input/input0 May 27 03:21:37.663092 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:21:37.675964 kernel: scsi host0: storvsc_host_t May 27 03:21:37.676182 kernel: PTP clock support registered May 27 03:21:37.676193 kernel: scsi 0:0:0:2: CD-ROM Msft Virtual DVD-ROM 1.0 PQ: 0 ANSI: 5 May 27 03:21:37.671094 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
May 27 03:21:37.681619 kernel: hv_vmbus: registering driver hv_netvsc May 27 03:21:37.681635 kernel: hv_vmbus: registering driver hv_pci May 27 03:21:37.733611 kernel: hid: raw HID events driver (C) Jiri Kosina May 27 03:21:37.733653 kernel: hv_utils: Registering HyperV Utility Driver May 27 03:21:37.733668 kernel: hv_vmbus: registering driver hv_utils May 27 03:21:37.745354 kernel: hv_utils: Shutdown IC version 3.2 May 27 03:21:37.745393 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI VMBus probing: Using version 0x10004 May 27 03:21:37.745546 kernel: hv_utils: Heartbeat IC version 3.0 May 27 03:21:37.748041 kernel: hv_utils: TimeSync IC version 4.0 May 27 03:21:37.738844 systemd-resolved[244]: Clock change detected. Flushing caches. May 27 03:21:37.744864 systemd-journald[205]: Time jumped backwards, rotating. May 27 03:21:37.742479 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. May 27 03:21:37.759859 kernel: hv_netvsc f8615163-0000-1000-2000-0022489d724e (unnamed net_device) (uninitialized): VF slot 1 added May 27 03:21:37.759986 kernel: hv_vmbus: registering driver hid_hyperv May 27 03:21:37.759995 kernel: sr 0:0:0:2: [sr0] scsi-1 drive May 27 03:21:37.760090 kernel: input: Microsoft Vmbus HID-compliant Mouse as /devices/0006:045E:0621.0001/input/input1 May 27 03:21:37.760099 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 May 27 03:21:37.760106 kernel: hid-hyperv 0006:045E:0621.0001: input: VIRTUAL HID v0.01 Mouse [Microsoft Vmbus HID-compliant Mouse] on May 27 03:21:37.765270 kernel: hv_pci 7ad35d50-c05b-47ab-b3a0-56a9a845852b: PCI host bridge to bus c05b:00 May 27 03:21:37.768895 kernel: pci_bus c05b:00: root bus resource [mem 0xfc0000000-0xfc007ffff window] May 27 03:21:37.769031 kernel: pci_bus c05b:00: No busn resource found for root bus, will use [bus 00-ff] May 27 03:21:37.770450 kernel: sr 0:0:0:2: Attached scsi CD-ROM sr0 May 27 03:21:37.772497 kernel: pci c05b:00:00.0: [1414:00a9] type 00 class 0x010802 PCIe Endpoint May 27 03:21:37.775567 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit] May 27 03:21:37.788446 kernel: pci c05b:00:00.0: 32.000 Gb/s available PCIe bandwidth, limited by 2.5 GT/s PCIe x16 link at c05b:00:00.0 (capable of 1024.000 Gb/s with 64.0 GT/s PCIe x16 link) May 27 03:21:37.792601 kernel: pci_bus c05b:00: busn_res: [bus 00-ff] end is updated to 00 May 27 03:21:37.792747 kernel: pci c05b:00:00.0: BAR 0 [mem 0xfc0000000-0xfc007ffff 64bit]: assigned May 27 03:21:37.799506 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#5 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 May 27 03:21:37.809946 kernel: nvme nvme0: pci function c05b:00:00.0 May 27 03:21:37.810083 kernel: nvme c05b:00:00.0: enabling device (0000 -> 0002) May 27 03:21:37.820455 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#25 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001 May 27 03:21:38.035474 kernel: nvme nvme0: 2/0/0 default/read/poll queues May 27 03:21:38.040448 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 03:21:38.101455 kernel: nvme nvme0: using unchecked data buffer May 27 03:21:38.152434 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - MSFT NVMe Accelerator v1.0 EFI-SYSTEM. May 27 03:21:38.164090 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. May 27 03:21:38.179125 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM. May 27 03:21:38.187248 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - MSFT NVMe Accelerator v1.0 ROOT. May 27 03:21:38.194830 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - MSFT NVMe Accelerator v1.0 USR-A. May 27 03:21:38.195541 systemd[1]: Found device dev-disk-by\x2dpartuuid-7130c94a\x2d213a\x2d4e5a\x2d8e26\x2d6cce9662f132.device - MSFT NVMe Accelerator v1.0 USR-A. May 27 03:21:38.200571 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
May 27 03:21:38.203804 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. May 27 03:21:38.207470 systemd[1]: Reached target remote-fs.target - Remote File Systems. May 27 03:21:38.212771 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... May 27 03:21:38.220537 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... May 27 03:21:38.229582 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. May 27 03:21:38.234462 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 03:21:38.769036 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI VMBus probing: Using version 0x10004 May 27 03:21:38.769181 kernel: hv_pci 00000001-7870-47b5-b203-907d12ca697e: PCI host bridge to bus 7870:00 May 27 03:21:38.771514 kernel: pci_bus 7870:00: root bus resource [mem 0xfc2000000-0xfc4007fff window] May 27 03:21:38.773017 kernel: pci_bus 7870:00: No busn resource found for root bus, will use [bus 00-ff] May 27 03:21:38.776570 kernel: pci 7870:00:00.0: [1414:00ba] type 00 class 0x020000 PCIe Endpoint May 27 03:21:38.780498 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref] May 27 03:21:38.784582 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref] May 27 03:21:38.784606 kernel: pci 7870:00:00.0: enabling Extended Tags May 27 03:21:38.798901 kernel: pci_bus 7870:00: busn_res: [bus 00-ff] end is updated to 00 May 27 03:21:38.799095 kernel: pci 7870:00:00.0: BAR 0 [mem 0xfc2000000-0xfc3ffffff 64bit pref]: assigned May 27 03:21:38.801616 kernel: pci 7870:00:00.0: BAR 4 [mem 0xfc4000000-0xfc4007fff 64bit pref]: assigned May 27 03:21:38.804667 kernel: mana 7870:00:00.0: enabling device (0000 -> 0002) May 27 03:21:39.243053 disk-uuid[677]: The operation has completed successfully. 
May 27 03:21:39.245734 kernel: nvme0n1: p1 p2 p3 p4 p6 p7 p9 May 27 03:22:39.607967 systemd-udevd[454]: 7870:00:00.0: Worker [512] processing SEQNUM=1160 is taking a long time May 27 03:22:40.891459 kernel: mana 7870:00:00.0: Failed to establish HWC: -110 May 27 03:22:40.904450 kernel: mana 7870:00:00.0: gdma probe failed: err = -110 May 27 03:22:40.904636 kernel: mana 7870:00:00.0: probe with driver mana failed with error -110 May 27 03:22:40.907071 systemd[1]: disk-uuid.service: Deactivated successfully. May 27 03:22:40.907158 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. May 27 03:22:40.914159 systemd[1]: Starting verity-setup.service - Verity Setup for /dev/mapper/usr... May 27 03:22:40.927647 sh[717]: Success May 27 03:22:40.949884 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. May 27 03:22:40.949923 kernel: device-mapper: uevent: version 1.0.3 May 27 03:22:40.950996 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev May 27 03:22:40.959455 kernel: device-mapper: verity: sha256 using shash "sha256-ni" May 27 03:22:41.028218 systemd[1]: Found device dev-mapper-usr.device - /dev/mapper/usr. May 27 03:22:41.031605 systemd[1]: Mounting sysusr-usr.mount - /sysusr/usr... May 27 03:22:41.039240 systemd[1]: Finished verity-setup.service - Verity Setup for /dev/mapper/usr. 
May 27 03:22:41.053462 kernel: BTRFS info: 'norecovery' is for compatibility only, recommended to use 'rescue=nologreplay' May 27 03:22:41.053495 kernel: BTRFS: device fsid f0f66fe8-3990-49eb-980e-559a3dfd3522 devid 1 transid 40 /dev/mapper/usr (254:0) scanned by mount (730) May 27 03:22:41.054777 kernel: BTRFS info (device dm-0): first mount of filesystem f0f66fe8-3990-49eb-980e-559a3dfd3522 May 27 03:22:41.055982 kernel: BTRFS info (device dm-0): using crc32c (crc32c-intel) checksum algorithm May 27 03:22:41.056982 kernel: BTRFS info (device dm-0): using free-space-tree May 27 03:22:41.119558 systemd[1]: Mounted sysusr-usr.mount - /sysusr/usr. May 27 03:22:41.122709 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. May 27 03:22:41.125277 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. May 27 03:22:41.125763 systemd[1]: Starting ignition-setup.service - Ignition (setup)... May 27 03:22:41.130484 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... May 27 03:22:41.164159 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/nvme0n1p6 (259:12) scanned by mount (763) May 27 03:22:41.164204 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05 May 27 03:22:41.166302 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm May 27 03:22:41.167777 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 03:22:41.187657 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05 May 27 03:22:41.187661 systemd[1]: Finished ignition-setup.service - Ignition (setup). May 27 03:22:41.193813 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
May 27 03:22:41.207537 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. May 27 03:22:41.210901 systemd[1]: Starting systemd-networkd.service - Network Configuration... May 27 03:22:41.243587 systemd-networkd[899]: lo: Link UP May 27 03:22:41.243593 systemd-networkd[899]: lo: Gained carrier May 27 03:22:41.244248 systemd-networkd[899]: Enumeration completed May 27 03:22:41.244303 systemd[1]: Started systemd-networkd.service - Network Configuration. May 27 03:22:41.245155 systemd-networkd[899]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 03:22:41.245157 systemd-networkd[899]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 03:22:41.245771 systemd-networkd[899]: eth0: Link UP May 27 03:22:41.245887 systemd-networkd[899]: eth0: Gained carrier May 27 03:22:41.245896 systemd-networkd[899]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 03:22:41.249552 systemd[1]: Reached target network.target - Network. May 27 03:22:41.260966 systemd-networkd[899]: eth0: DHCPv4 address 10.200.8.20/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 27 03:22:41.410210 ignition[870]: Ignition 2.21.0 May 27 03:22:41.410389 ignition[870]: Stage: fetch-offline May 27 03:22:41.412027 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). May 27 03:22:41.410496 ignition[870]: no configs at "/usr/lib/ignition/base.d" May 27 03:22:41.412918 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... 
May 27 03:22:41.410503 ignition[870]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 27 03:22:41.410595 ignition[870]: parsed url from cmdline: "" May 27 03:22:41.410597 ignition[870]: no config URL provided May 27 03:22:41.410601 ignition[870]: reading system config file "/usr/lib/ignition/user.ign" May 27 03:22:41.410606 ignition[870]: no config at "/usr/lib/ignition/user.ign" May 27 03:22:41.410610 ignition[870]: failed to fetch config: resource requires networking May 27 03:22:41.410772 ignition[870]: Ignition finished successfully May 27 03:22:41.430685 ignition[908]: Ignition 2.21.0 May 27 03:22:41.430691 ignition[908]: Stage: fetch May 27 03:22:41.430887 ignition[908]: no configs at "/usr/lib/ignition/base.d" May 27 03:22:41.430894 ignition[908]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 27 03:22:41.430955 ignition[908]: parsed url from cmdline: "" May 27 03:22:41.430958 ignition[908]: no config URL provided May 27 03:22:41.430961 ignition[908]: reading system config file "/usr/lib/ignition/user.ign" May 27 03:22:41.430966 ignition[908]: no config at "/usr/lib/ignition/user.ign" May 27 03:22:41.430994 ignition[908]: GET http://169.254.169.254/metadata/instance/compute/userData?api-version=2021-01-01&format=text: attempt #1 May 27 03:22:41.481290 ignition[908]: GET result: OK May 27 03:22:41.481350 ignition[908]: config has been read from IMDS userdata May 27 03:22:41.481376 ignition[908]: parsing config with SHA512: baf1122330db92c22f46a6923c99d6ad1b8e50f8296f0a8ec51f25b8403fe20f8e284a3635df87f0a18376fa3e3da4b77abded13328f981ea6d852628708c590 May 27 03:22:41.488311 unknown[908]: fetched base config from "system" May 27 03:22:41.488318 unknown[908]: fetched base config from "system" May 27 03:22:41.488646 ignition[908]: fetch: fetch complete May 27 03:22:41.488322 unknown[908]: fetched user config from "azure" May 27 03:22:41.488650 ignition[908]: fetch: fetch passed May 27 03:22:41.490358 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). May 27 03:22:41.488679 ignition[908]: Ignition finished successfully May 27 03:22:41.494058 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... May 27 03:22:41.511912 ignition[914]: Ignition 2.21.0 May 27 03:22:41.511920 ignition[914]: Stage: kargs May 27 03:22:41.512067 ignition[914]: no configs at "/usr/lib/ignition/base.d" May 27 03:22:41.513898 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). May 27 03:22:41.512074 ignition[914]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 27 03:22:41.515320 systemd[1]: Starting ignition-disks.service - Ignition (disks)... May 27 03:22:41.512714 ignition[914]: kargs: kargs passed May 27 03:22:41.512742 ignition[914]: Ignition finished successfully May 27 03:22:41.538033 ignition[920]: Ignition 2.21.0 May 27 03:22:41.538041 ignition[920]: Stage: disks May 27 03:22:41.538177 ignition[920]: no configs at "/usr/lib/ignition/base.d" May 27 03:22:41.540166 systemd[1]: Finished ignition-disks.service - Ignition (disks). May 27 03:22:41.538183 ignition[920]: no config dir at "/usr/lib/ignition/base.platform.d/azure" May 27 03:22:41.542757 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. May 27 03:22:41.539431 ignition[920]: disks: disks passed May 27 03:22:41.547292 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. May 27 03:22:41.539480 ignition[920]: Ignition finished successfully May 27 03:22:41.551477 systemd[1]: Reached target local-fs.target - Local File Systems. May 27 03:22:41.553337 systemd[1]: Reached target sysinit.target - System Initialization. May 27 03:22:41.554602 systemd[1]: Reached target basic.target - Basic System. May 27 03:22:41.559539 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
May 27 03:22:41.605842 systemd-fsck[929]: ROOT: clean, 15/7326000 files, 477845/7359488 blocks May 27 03:22:41.609061 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. May 27 03:22:41.613324 systemd[1]: Mounting sysroot.mount - /sysroot... May 27 03:22:41.735305 systemd[1]: Mounted sysroot.mount - /sysroot. May 27 03:22:41.738523 kernel: EXT4-fs (nvme0n1p9): mounted filesystem 18301365-b380-45d7-9677-e42472a122bc r/w with ordered data mode. Quota mode: none. May 27 03:22:41.737165 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. May 27 03:22:41.744361 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 03:22:41.748514 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... May 27 03:22:41.749259 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... May 27 03:22:41.749510 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). May 27 03:22:41.749529 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. May 27 03:22:41.761322 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. May 27 03:22:41.765741 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... May 27 03:22:41.773797 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/nvme0n1p6 (259:12) scanned by mount (938) May 27 03:22:41.774045 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05 May 27 03:22:41.775449 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm May 27 03:22:41.777841 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 03:22:41.783951 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 03:22:41.884180 coreos-metadata[940]: May 27 03:22:41.884 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1 May 27 03:22:41.887697 coreos-metadata[940]: May 27 03:22:41.887 INFO Fetch successful May 27 03:22:41.887697 coreos-metadata[940]: May 27 03:22:41.887 INFO Fetching http://169.254.169.254/metadata/instance/compute/name?api-version=2017-08-01&format=text: Attempt #1 May 27 03:22:41.895927 coreos-metadata[940]: May 27 03:22:41.895 INFO Fetch successful May 27 03:22:41.899135 coreos-metadata[940]: May 27 03:22:41.899 INFO wrote hostname ci-4344.0.0-a-c2c0d8ddb2 to /sysroot/etc/hostname May 27 03:22:41.900650 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. May 27 03:22:41.918641 initrd-setup-root[968]: cut: /sysroot/etc/passwd: No such file or directory May 27 03:22:41.929519 initrd-setup-root[975]: cut: /sysroot/etc/group: No such file or directory May 27 03:22:41.936700 initrd-setup-root[982]: cut: /sysroot/etc/shadow: No such file or directory May 27 03:22:41.940612 initrd-setup-root[989]: cut: /sysroot/etc/gshadow: No such file or directory May 27 03:22:42.149361 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. May 27 03:22:42.151925 systemd[1]: Starting ignition-mount.service - Ignition (mount)... May 27 03:22:42.156246 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... May 27 03:22:42.166532 systemd[1]: sysroot-oem.mount: Deactivated successfully. May 27 03:22:42.168096 kernel: BTRFS info (device nvme0n1p6): last unmount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05 May 27 03:22:42.184141 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. May 27 03:22:42.186506 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
May 27 03:22:42.191519 ignition[1061]: INFO : Ignition 2.21.0 May 27 03:22:42.191519 ignition[1061]: INFO : Stage: mount May 27 03:22:42.191519 ignition[1061]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 03:22:42.191519 ignition[1061]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 27 03:22:42.191519 ignition[1061]: INFO : mount: mount passed May 27 03:22:42.191519 ignition[1061]: INFO : Ignition finished successfully May 27 03:22:42.188516 systemd[1]: Starting ignition-files.service - Ignition (files)... May 27 03:22:42.207812 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... May 27 03:22:42.220863 kernel: BTRFS: device label OEM devid 1 transid 16 /dev/nvme0n1p6 (259:12) scanned by mount (1073) May 27 03:22:42.220899 kernel: BTRFS info (device nvme0n1p6): first mount of filesystem fd7bb961-7a0f-4c90-a609-3bffeb956d05 May 27 03:22:42.221748 kernel: BTRFS info (device nvme0n1p6): using crc32c (crc32c-intel) checksum algorithm May 27 03:22:42.222613 kernel: BTRFS info (device nvme0n1p6): using free-space-tree May 27 03:22:42.227022 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
May 27 03:22:42.245110 ignition[1090]: INFO : Ignition 2.21.0 May 27 03:22:42.245110 ignition[1090]: INFO : Stage: files May 27 03:22:42.247383 ignition[1090]: INFO : no configs at "/usr/lib/ignition/base.d" May 27 03:22:42.247383 ignition[1090]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure" May 27 03:22:42.247383 ignition[1090]: DEBUG : files: compiled without relabeling support, skipping May 27 03:22:42.247383 ignition[1090]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" May 27 03:22:42.247383 ignition[1090]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" May 27 03:22:42.259504 ignition[1090]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" May 27 03:22:42.259504 ignition[1090]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" May 27 03:22:42.259504 ignition[1090]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" May 27 03:22:42.259504 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" May 27 03:22:42.259504 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-amd64.tar.gz: attempt #1 May 27 03:22:42.254470 unknown[1090]: wrote ssh authorized keys file for user: core May 27 03:22:42.549565 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK May 27 03:22:42.736646 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-amd64.tar.gz" May 27 03:22:42.738797 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" May 27 03:22:42.738797 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" 
May 27 03:22:42.738797 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" May 27 03:22:42.738797 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" May 27 03:22:42.738797 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 03:22:42.738797 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" May 27 03:22:42.738797 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 03:22:42.738797 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" May 27 03:22:42.759349 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" May 27 03:22:42.759349 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" May 27 03:22:42.759349 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" May 27 03:22:42.759349 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" May 27 03:22:42.759349 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" May 27 03:22:42.759349 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-x86-64.raw: attempt #1 May 27 03:22:42.936601 systemd-networkd[899]: eth0: Gained IPv6LL May 27 03:22:43.583723 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK May 27 03:22:43.908006 ignition[1090]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-x86-64.raw" May 27 03:22:43.908006 ignition[1090]: INFO : files: op(b): [started] processing unit "prepare-helm.service" May 27 03:22:43.920565 ignition[1090]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 03:22:43.927402 ignition[1090]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" May 27 03:22:43.927402 ignition[1090]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" May 27 03:22:43.927402 ignition[1090]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" May 27 03:22:43.932002 ignition[1090]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" May 27 03:22:43.932002 ignition[1090]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" May 27 03:22:43.932002 ignition[1090]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" May 27 03:22:43.932002 ignition[1090]: INFO : files: files passed May 27 03:22:43.932002 ignition[1090]: INFO : Ignition finished successfully May 27 03:22:43.930502 systemd[1]: Finished ignition-files.service - Ignition (files). May 27 03:22:43.945223 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
May 27 03:22:43.947095 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
May 27 03:22:43.960905 systemd[1]: ignition-quench.service: Deactivated successfully.
May 27 03:22:43.960974 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
May 27 03:22:43.968349 initrd-setup-root-after-ignition[1120]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 03:22:43.968349 initrd-setup-root-after-ignition[1120]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
May 27 03:22:43.977490 initrd-setup-root-after-ignition[1124]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
May 27 03:22:43.971418 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 03:22:43.973480 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
May 27 03:22:43.979960 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
May 27 03:22:44.013522 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
May 27 03:22:44.013591 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
May 27 03:22:44.017778 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
May 27 03:22:44.018329 systemd[1]: Reached target initrd.target - Initrd Default Target.
May 27 03:22:44.018419 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
May 27 03:22:44.020539 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
May 27 03:22:44.053148 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 03:22:44.055554 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
May 27 03:22:44.069908 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
May 27 03:22:44.070053 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 03:22:44.070216 systemd[1]: Stopped target timers.target - Timer Units.
May 27 03:22:44.070429 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
May 27 03:22:44.070583 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
May 27 03:22:44.077523 systemd[1]: Stopped target initrd.target - Initrd Default Target.
May 27 03:22:44.081560 systemd[1]: Stopped target basic.target - Basic System.
May 27 03:22:44.088575 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
May 27 03:22:44.089514 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
May 27 03:22:44.093888 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
May 27 03:22:44.100555 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
May 27 03:22:44.104019 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
May 27 03:22:44.105261 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
May 27 03:22:44.105496 systemd[1]: Stopped target sysinit.target - System Initialization.
May 27 03:22:44.105749 systemd[1]: Stopped target local-fs.target - Local File Systems.
May 27 03:22:44.106177 systemd[1]: Stopped target swap.target - Swaps.
May 27 03:22:44.110811 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
May 27 03:22:44.110919 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
May 27 03:22:44.121522 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
May 27 03:22:44.124579 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 03:22:44.128524 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
May 27 03:22:44.128718 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 03:22:44.128780 systemd[1]: dracut-initqueue.service: Deactivated successfully.
May 27 03:22:44.128866 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
May 27 03:22:44.133787 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
May 27 03:22:44.133890 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
May 27 03:22:44.137947 systemd[1]: ignition-files.service: Deactivated successfully.
May 27 03:22:44.138030 systemd[1]: Stopped ignition-files.service - Ignition (files).
May 27 03:22:44.147070 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully.
May 27 03:22:44.147183 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent.
May 27 03:22:44.149533 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
May 27 03:22:44.155492 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
May 27 03:22:44.155638 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 03:22:44.161410 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
May 27 03:22:44.166289 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
May 27 03:22:44.166419 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 03:22:44.172559 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
May 27 03:22:44.174328 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
May 27 03:22:44.178553 ignition[1144]: INFO : Ignition 2.21.0
May 27 03:22:44.178553 ignition[1144]: INFO : Stage: umount
May 27 03:22:44.178553 ignition[1144]: INFO : no configs at "/usr/lib/ignition/base.d"
May 27 03:22:44.178553 ignition[1144]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/azure"
May 27 03:22:44.178553 ignition[1144]: INFO : umount: umount passed
May 27 03:22:44.178553 ignition[1144]: INFO : Ignition finished successfully
May 27 03:22:44.184591 systemd[1]: ignition-mount.service: Deactivated successfully.
May 27 03:22:44.184671 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
May 27 03:22:44.188587 systemd[1]: initrd-cleanup.service: Deactivated successfully.
May 27 03:22:44.188656 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
May 27 03:22:44.190173 systemd[1]: ignition-disks.service: Deactivated successfully.
May 27 03:22:44.190213 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
May 27 03:22:44.190396 systemd[1]: ignition-kargs.service: Deactivated successfully.
May 27 03:22:44.190422 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
May 27 03:22:44.190640 systemd[1]: ignition-fetch.service: Deactivated successfully.
May 27 03:22:44.190663 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
May 27 03:22:44.190893 systemd[1]: Stopped target network.target - Network.
May 27 03:22:44.190920 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
May 27 03:22:44.190944 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
May 27 03:22:44.191148 systemd[1]: Stopped target paths.target - Path Units.
May 27 03:22:44.191167 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
May 27 03:22:44.197114 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 03:22:44.207484 systemd[1]: Stopped target slices.target - Slice Units.
May 27 03:22:44.207971 systemd[1]: Stopped target sockets.target - Socket Units.
May 27 03:22:44.208014 systemd[1]: iscsid.socket: Deactivated successfully.
May 27 03:22:44.208043 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
May 27 03:22:44.208243 systemd[1]: iscsiuio.socket: Deactivated successfully.
May 27 03:22:44.208264 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
May 27 03:22:44.208295 systemd[1]: ignition-setup.service: Deactivated successfully.
May 27 03:22:44.208325 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
May 27 03:22:44.208570 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
May 27 03:22:44.208594 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
May 27 03:22:44.208927 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
May 27 03:22:44.209231 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
May 27 03:22:44.217590 systemd[1]: systemd-resolved.service: Deactivated successfully.
May 27 03:22:44.217659 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
May 27 03:22:44.228877 systemd[1]: run-credentials-systemd\x2dresolved.service.mount: Deactivated successfully.
May 27 03:22:44.229053 systemd[1]: systemd-networkd.service: Deactivated successfully.
May 27 03:22:44.229125 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
May 27 03:22:44.232912 systemd[1]: run-credentials-systemd\x2dnetworkd.service.mount: Deactivated successfully.
May 27 03:22:44.233222 systemd[1]: Stopped target network-pre.target - Preparation for Network.
May 27 03:22:44.235884 systemd[1]: systemd-networkd.socket: Deactivated successfully.
May 27 03:22:44.235909 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
May 27 03:22:44.242485 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
May 27 03:22:44.246481 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
May 27 03:22:44.246525 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
May 27 03:22:44.250550 systemd[1]: systemd-sysctl.service: Deactivated successfully.
May 27 03:22:44.250592 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
May 27 03:22:44.254426 systemd[1]: systemd-modules-load.service: Deactivated successfully.
May 27 03:22:44.254477 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
May 27 03:22:44.257982 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
May 27 03:22:44.258945 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 03:22:44.267719 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 03:22:44.271268 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
May 27 03:22:44.271322 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
May 27 03:22:44.281883 systemd[1]: network-cleanup.service: Deactivated successfully.
May 27 03:22:44.281968 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
May 27 03:22:44.287695 systemd[1]: systemd-udevd.service: Deactivated successfully.
May 27 03:22:44.287809 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 03:22:44.291596 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
May 27 03:22:44.291653 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
May 27 03:22:44.295998 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
May 27 03:22:44.296027 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 03:22:44.296199 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
May 27 03:22:44.296230 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
May 27 03:22:44.296518 systemd[1]: dracut-cmdline.service: Deactivated successfully.
May 27 03:22:44.296544 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
May 27 03:22:44.296747 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
May 27 03:22:44.296771 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
May 27 03:22:44.300532 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
May 27 03:22:44.300626 systemd[1]: systemd-network-generator.service: Deactivated successfully.
May 27 03:22:44.300666 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
May 27 03:22:44.305009 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
May 27 03:22:44.305051 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 03:22:44.311198 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 03:22:44.311236 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 03:22:44.313715 systemd[1]: sysroot-boot.mount: Deactivated successfully.
May 27 03:22:44.313777 systemd[1]: run-credentials-systemd\x2dnetwork\x2dgenerator.service.mount: Deactivated successfully.
May 27 03:22:44.313807 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
May 27 03:22:44.313835 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 27 03:22:44.314218 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
May 27 03:22:44.314278 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
May 27 03:22:44.646916 systemd[1]: sysroot-boot.service: Deactivated successfully.
May 27 03:22:44.646998 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
May 27 03:22:44.651620 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
May 27 03:22:44.651906 systemd[1]: initrd-setup-root.service: Deactivated successfully.
May 27 03:22:44.651944 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
May 27 03:22:44.657916 systemd[1]: Starting initrd-switch-root.service - Switch Root...
May 27 03:22:44.673942 systemd[1]: Switching root.
May 27 03:22:44.710420 systemd-journald[205]: Journal stopped
May 27 03:22:46.274263 systemd-journald[205]: Received SIGTERM from PID 1 (systemd).
May 27 03:22:46.274289 kernel: SELinux: policy capability network_peer_controls=1
May 27 03:22:46.274299 kernel: SELinux: policy capability open_perms=1
May 27 03:22:46.274307 kernel: SELinux: policy capability extended_socket_class=1
May 27 03:22:46.274314 kernel: SELinux: policy capability always_check_network=0
May 27 03:22:46.274321 kernel: SELinux: policy capability cgroup_seclabel=1
May 27 03:22:46.274331 kernel: SELinux: policy capability nnp_nosuid_transition=1
May 27 03:22:46.274339 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
May 27 03:22:46.274347 kernel: SELinux: policy capability ioctl_skip_cloexec=0
May 27 03:22:46.274354 kernel: SELinux: policy capability userspace_initial_context=0
May 27 03:22:46.274361 kernel: audit: type=1403 audit(1748316165.216:2): auid=4294967295 ses=4294967295 lsm=selinux res=1
May 27 03:22:46.274370 systemd[1]: Successfully loaded SELinux policy in 51.913ms.
May 27 03:22:46.274379 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.561ms.
May 27 03:22:46.274390 systemd[1]: systemd 256.8 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
May 27 03:22:46.274400 systemd[1]: Detected virtualization microsoft.
May 27 03:22:46.274408 systemd[1]: Detected architecture x86-64.
May 27 03:22:46.274417 systemd[1]: Detected first boot.
May 27 03:22:46.274426 systemd[1]: Hostname set to .
May 27 03:22:46.274445 systemd[1]: Initializing machine ID from random generator.
May 27 03:22:46.274454 zram_generator::config[1187]: No configuration found.
May 27 03:22:46.274464 kernel: Guest personality initialized and is inactive
May 27 03:22:46.274471 kernel: VMCI host device registered (name=vmci, major=10, minor=124)
May 27 03:22:46.274479 kernel: Initialized host personality
May 27 03:22:46.274487 kernel: NET: Registered PF_VSOCK protocol family
May 27 03:22:46.274495 systemd[1]: Populated /etc with preset unit settings.
May 27 03:22:46.274505 systemd[1]: run-credentials-systemd\x2djournald.service.mount: Deactivated successfully.
May 27 03:22:46.274513 systemd[1]: initrd-switch-root.service: Deactivated successfully.
May 27 03:22:46.274522 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
May 27 03:22:46.274530 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
May 27 03:22:46.274539 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
May 27 03:22:46.274548 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
May 27 03:22:46.274556 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
May 27 03:22:46.274566 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
May 27 03:22:46.274574 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
May 27 03:22:46.274583 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
May 27 03:22:46.274591 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
May 27 03:22:46.274600 systemd[1]: Created slice user.slice - User and Session Slice.
May 27 03:22:46.274608 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
May 27 03:22:46.274617 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
May 27 03:22:46.274625 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
May 27 03:22:46.274636 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
May 27 03:22:46.274647 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
May 27 03:22:46.274656 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
May 27 03:22:46.274664 systemd[1]: Expecting device dev-ttyS0.device - /dev/ttyS0...
May 27 03:22:46.274673 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
May 27 03:22:46.274682 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
May 27 03:22:46.274690 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
May 27 03:22:46.274698 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
May 27 03:22:46.274708 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
May 27 03:22:46.274717 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
May 27 03:22:46.274726 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
May 27 03:22:46.274736 systemd[1]: Reached target remote-fs.target - Remote File Systems.
May 27 03:22:46.274745 systemd[1]: Reached target slices.target - Slice Units.
May 27 03:22:46.274754 systemd[1]: Reached target swap.target - Swaps.
May 27 03:22:46.274762 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
May 27 03:22:46.274770 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
May 27 03:22:46.274781 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
May 27 03:22:46.274790 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
May 27 03:22:46.274799 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
May 27 03:22:46.274807 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
May 27 03:22:46.274816 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
May 27 03:22:46.274827 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
May 27 03:22:46.274835 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
May 27 03:22:46.274843 systemd[1]: Mounting media.mount - External Media Directory...
May 27 03:22:46.274852 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:22:46.274861 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
May 27 03:22:46.274870 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
May 27 03:22:46.274878 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
May 27 03:22:46.274888 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
May 27 03:22:46.274898 systemd[1]: Reached target machines.target - Containers.
May 27 03:22:46.274906 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
May 27 03:22:46.274915 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 03:22:46.274925 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
May 27 03:22:46.274933 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
May 27 03:22:46.274942 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 03:22:46.274951 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 03:22:46.274959 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 03:22:46.274970 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
May 27 03:22:46.274978 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 03:22:46.274987 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
May 27 03:22:46.274995 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
May 27 03:22:46.275004 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
May 27 03:22:46.275013 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
May 27 03:22:46.275022 systemd[1]: Stopped systemd-fsck-usr.service.
May 27 03:22:46.275031 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 03:22:46.275041 systemd[1]: Starting systemd-journald.service - Journal Service...
May 27 03:22:46.275049 kernel: loop: module loaded
May 27 03:22:46.275057 kernel: fuse: init (API version 7.41)
May 27 03:22:46.275065 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
May 27 03:22:46.275074 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
May 27 03:22:46.275083 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
May 27 03:22:46.275091 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
May 27 03:22:46.275100 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
May 27 03:22:46.275109 systemd[1]: verity-setup.service: Deactivated successfully.
May 27 03:22:46.275119 systemd[1]: Stopped verity-setup.service.
May 27 03:22:46.275127 kernel: ACPI: bus type drm_connector registered
May 27 03:22:46.275136 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:22:46.275145 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
May 27 03:22:46.275154 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
May 27 03:22:46.275163 systemd[1]: Mounted media.mount - External Media Directory.
May 27 03:22:46.275183 systemd-journald[1294]: Collecting audit messages is disabled.
May 27 03:22:46.275205 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
May 27 03:22:46.275214 systemd-journald[1294]: Journal started
May 27 03:22:46.275235 systemd-journald[1294]: Runtime Journal (/run/log/journal/721b7af7877e4c0fa168a00139bb990e) is 8M, max 159M, 151M free.
May 27 03:22:45.913499 systemd[1]: Queued start job for default target multi-user.target.
May 27 03:22:45.920737 systemd[1]: Unnecessary job was removed for dev-nvme0n1p6.device - /dev/nvme0n1p6.
May 27 03:22:45.920980 systemd[1]: systemd-journald.service: Deactivated successfully.
May 27 03:22:46.278390 systemd[1]: Started systemd-journald.service - Journal Service.
May 27 03:22:46.280637 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
May 27 03:22:46.283082 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
May 27 03:22:46.286730 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
May 27 03:22:46.289163 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
May 27 03:22:46.292092 systemd[1]: modprobe@configfs.service: Deactivated successfully.
May 27 03:22:46.292278 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
May 27 03:22:46.294963 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 03:22:46.295157 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 03:22:46.297971 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 03:22:46.298155 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 03:22:46.300713 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 03:22:46.300901 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 03:22:46.303813 systemd[1]: modprobe@fuse.service: Deactivated successfully.
May 27 03:22:46.304005 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
May 27 03:22:46.306417 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 03:22:46.306609 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 03:22:46.309028 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
May 27 03:22:46.311668 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
May 27 03:22:46.314729 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
May 27 03:22:46.317610 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
May 27 03:22:46.324958 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
May 27 03:22:46.331804 systemd[1]: Reached target network-pre.target - Preparation for Network.
May 27 03:22:46.333769 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
May 27 03:22:46.344247 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
May 27 03:22:46.346589 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
May 27 03:22:46.346615 systemd[1]: Reached target local-fs.target - Local File Systems.
May 27 03:22:46.348724 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
May 27 03:22:46.350806 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
May 27 03:22:46.353108 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 03:22:46.354371 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
May 27 03:22:46.357315 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
May 27 03:22:46.361527 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 03:22:46.362543 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
May 27 03:22:46.364551 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 03:22:46.366018 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
May 27 03:22:46.372184 systemd-journald[1294]: Time spent on flushing to /var/log/journal/721b7af7877e4c0fa168a00139bb990e is 46.171ms for 974 entries.
May 27 03:22:46.372184 systemd-journald[1294]: System Journal (/var/log/journal/721b7af7877e4c0fa168a00139bb990e) is 11.8M, max 2.6G, 2.6G free.
May 27 03:22:46.466396 systemd-journald[1294]: Received client request to flush runtime journal.
May 27 03:22:46.466430 kernel: loop0: detected capacity change from 0 to 146240
May 27 03:22:46.466464 systemd-journald[1294]: /var/log/journal/721b7af7877e4c0fa168a00139bb990e/system.journal: Realtime clock jumped backwards relative to last journal entry, rotating.
May 27 03:22:46.466485 systemd-journald[1294]: Rotating system journal.
May 27 03:22:46.375153 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
May 27 03:22:46.378544 systemd[1]: Starting systemd-sysusers.service - Create System Users...
May 27 03:22:46.381685 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
May 27 03:22:46.384809 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
May 27 03:22:46.395026 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
May 27 03:22:46.397103 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
May 27 03:22:46.401807 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
May 27 03:22:46.409783 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
May 27 03:22:46.445766 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
May 27 03:22:46.468512 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
May 27 03:22:46.477656 systemd[1]: Finished systemd-sysusers.service - Create System Users.
May 27 03:22:46.481526 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
May 27 03:22:46.499456 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
May 27 03:22:46.510361 systemd-tmpfiles[1347]: ACLs are not supported, ignoring.
May 27 03:22:46.510375 systemd-tmpfiles[1347]: ACLs are not supported, ignoring.
May 27 03:22:46.512762 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
May 27 03:22:46.517464 kernel: loop1: detected capacity change from 0 to 224512
May 27 03:22:46.547451 kernel: loop2: detected capacity change from 0 to 28536
May 27 03:22:46.634459 kernel: loop3: detected capacity change from 0 to 113872
May 27 03:22:46.719461 kernel: loop4: detected capacity change from 0 to 146240
May 27 03:22:46.734312 kernel: loop5: detected capacity change from 0 to 224512
May 27 03:22:46.753942 kernel: loop6: detected capacity change from 0 to 28536
May 27 03:22:46.766463 kernel: loop7: detected capacity change from 0 to 113872
May 27 03:22:46.775466 (sd-merge)[1353]: Using extensions 'containerd-flatcar', 'docker-flatcar', 'kubernetes', 'oem-azure'.
May 27 03:22:46.775793 (sd-merge)[1353]: Merged extensions into '/usr'.
May 27 03:22:46.785093 systemd[1]: Reload requested from client PID 1329 ('systemd-sysext') (unit systemd-sysext.service)...
May 27 03:22:46.785163 systemd[1]: Reloading...
May 27 03:22:46.840460 zram_generator::config[1375]: No configuration found.
May 27 03:22:46.970623 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 03:22:47.039888 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
May 27 03:22:47.040012 systemd[1]: Reloading finished in 254 ms.
May 27 03:22:47.053717 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
May 27 03:22:47.055126 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
May 27 03:22:47.061125 systemd[1]: Starting ensure-sysext.service...
May 27 03:22:47.063342 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
May 27 03:22:47.075551 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
May 27 03:22:47.085358 systemd[1]: Reload requested from client PID 1438 ('systemctl') (unit ensure-sysext.service)...
May 27 03:22:47.085421 systemd[1]: Reloading...
May 27 03:22:47.097302 systemd-tmpfiles[1439]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
May 27 03:22:47.097697 systemd-tmpfiles[1439]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
May 27 03:22:47.097969 systemd-tmpfiles[1439]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
May 27 03:22:47.098200 systemd-tmpfiles[1439]: /usr/lib/tmpfiles.d/systemd-flatcar.conf:6: Duplicate line for path "/var/log/journal", ignoring.
May 27 03:22:47.098847 systemd-tmpfiles[1439]: /usr/lib/tmpfiles.d/systemd.conf:29: Duplicate line for path "/var/lib/systemd", ignoring.
May 27 03:22:47.099091 systemd-tmpfiles[1439]: ACLs are not supported, ignoring.
May 27 03:22:47.099163 systemd-tmpfiles[1439]: ACLs are not supported, ignoring.
May 27 03:22:47.110308 systemd-tmpfiles[1439]: Detected autofs mount point /boot during canonicalization of boot.
May 27 03:22:47.111079 systemd-tmpfiles[1439]: Skipping /boot
May 27 03:22:47.118264 systemd-udevd[1440]: Using default interface naming scheme 'v255'.
May 27 03:22:47.121500 systemd-tmpfiles[1439]: Detected autofs mount point /boot during canonicalization of boot.
May 27 03:22:47.121512 systemd-tmpfiles[1439]: Skipping /boot
May 27 03:22:47.147466 zram_generator::config[1470]: No configuration found.
May 27 03:22:47.309861 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 03:22:47.412453 kernel: hv_vmbus: registering driver hyperv_fb
May 27 03:22:47.412486 kernel: mousedev: PS/2 mouse device common for all mice
May 27 03:22:47.422471 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#36 cmd 0x85 status: scsi 0x2 srb 0x6 hv 0xc0000001
May 27 03:22:47.429455 kernel: hv_vmbus: registering driver hv_balloon
May 27 03:22:47.431459 kernel: hv_balloon: Using Dynamic Memory protocol version 2.0
May 27 03:22:47.437974 systemd[1]: Condition check resulted in dev-ttyS0.device - /dev/ttyS0 being skipped.
May 27 03:22:47.438322 systemd[1]: Reloading finished in 352 ms.
May 27 03:22:47.446649 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
May 27 03:22:47.449114 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
May 27 03:22:47.464880 kernel: hyperv_fb: Synthvid Version major 3, minor 5
May 27 03:22:47.469469 kernel: hyperv_fb: Screen resolution: 1024x768, Color depth: 32, Frame buffer size: 8388608
May 27 03:22:47.471256 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 27 03:22:47.476149 kernel: Console: switching to colour dummy device 80x25
May 27 03:22:47.483963 kernel: Console: switching to colour frame buffer device 128x48
May 27 03:22:47.477989 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
May 27 03:22:47.484427 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
May 27 03:22:47.493139 systemd[1]: Starting systemd-networkd.service - Network Configuration...
May 27 03:22:47.500535 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
May 27 03:22:47.509321 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
May 27 03:22:47.516751 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:22:47.516960 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 03:22:47.520664 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
May 27 03:22:47.526830 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
May 27 03:22:47.531080 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
May 27 03:22:47.534562 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 03:22:47.534668 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 03:22:47.534746 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:22:47.542727 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
May 27 03:22:47.549497 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:22:47.549667 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 03:22:47.549799 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 03:22:47.549870 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 03:22:47.549946 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:22:47.552867 systemd[1]: Expecting device dev-ptp_hyperv.device - /dev/ptp_hyperv...
May 27 03:22:47.554949 systemd[1]: proc-xen.mount - /proc/xen was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:22:47.555170 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
May 27 03:22:47.561645 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
May 27 03:22:47.563716 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
May 27 03:22:47.563822 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
May 27 03:22:47.563969 systemd[1]: Reached target time-set.target - System Time Set.
May 27 03:22:47.566578 systemd[1]: xenserver-pv-version.service - Set fake PV driver version for XenServer was skipped because of an unmet condition check (ConditionVirtualization=xen).
May 27 03:22:47.567226 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
May 27 03:22:47.570004 systemd[1]: Condition check resulted in dev-ptp_hyperv.device - /dev/ptp_hyperv being skipped.
May 27 03:22:47.571920 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
May 27 03:22:47.581738 systemd[1]: Finished ensure-sysext.service.
May 27 03:22:47.590766 systemd[1]: modprobe@loop.service: Deactivated successfully.
May 27 03:22:47.590925 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
May 27 03:22:47.594672 systemd[1]: modprobe@drm.service: Deactivated successfully.
May 27 03:22:47.595320 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
May 27 03:22:47.600221 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
May 27 03:22:47.601070 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
May 27 03:22:47.604410 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
May 27 03:22:47.604561 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
May 27 03:22:47.608305 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
May 27 03:22:47.608367 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
May 27 03:22:47.649083 augenrules[1630]: No rules
May 27 03:22:47.649948 systemd[1]: audit-rules.service: Deactivated successfully.
May 27 03:22:47.650642 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 27 03:22:47.682873 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 03:22:47.692057 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 03:22:47.692374 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 03:22:47.696906 ldconfig[1324]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
May 27 03:22:47.697542 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 03:22:47.707061 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
May 27 03:22:47.712838 systemd[1]: Starting systemd-update-done.service - Update is Completed...
May 27 03:22:47.714980 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
May 27 03:22:47.717939 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
May 27 03:22:47.739190 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - MSFT NVMe Accelerator v1.0 OEM.
May 27 03:22:47.747762 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
May 27 03:22:47.766570 systemd[1]: Finished systemd-update-done.service - Update is Completed.
May 27 03:22:47.771749 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
May 27 03:22:47.772523 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
May 27 03:22:47.785256 systemd[1]: run-credentials-systemd\x2dvconsole\x2dsetup.service.mount: Deactivated successfully.
May 27 03:22:47.791659 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
May 27 03:22:47.816136 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
May 27 03:22:47.818335 systemd[1]: Started systemd-userdbd.service - User Database Manager.
May 27 03:22:47.861471 kernel: kvm_intel: Using Hyper-V Enlightened VMCS
May 27 03:22:47.921376 systemd-networkd[1584]: lo: Link UP
May 27 03:22:47.921387 systemd-networkd[1584]: lo: Gained carrier
May 27 03:22:47.922300 systemd-networkd[1584]: Enumeration completed
May 27 03:22:47.922367 systemd[1]: Started systemd-networkd.service - Network Configuration.
May 27 03:22:47.925601 systemd-networkd[1584]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 27 03:22:47.925608 systemd-networkd[1584]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
May 27 03:22:47.926008 systemd-networkd[1584]: eth0: Link UP
May 27 03:22:47.926012 systemd-networkd[1584]: eth0: Gained carrier
May 27 03:22:47.926027 systemd-networkd[1584]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name.
May 27 03:22:47.926937 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
May 27 03:22:47.928647 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
May 27 03:22:47.934782 systemd-resolved[1585]: Positive Trust Anchors:
May 27 03:22:47.934793 systemd-resolved[1585]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
May 27 03:22:47.934824 systemd-resolved[1585]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
May 27 03:22:47.939157 systemd-resolved[1585]: Using system hostname 'ci-4344.0.0-a-c2c0d8ddb2'.
May 27 03:22:47.939513 systemd-networkd[1584]: eth0: DHCPv4 address 10.200.8.20/24, gateway 10.200.8.1 acquired from 168.63.129.16
May 27 03:22:47.940150 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
May 27 03:22:47.943172 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
May 27 03:22:47.946580 systemd[1]: Reached target network.target - Network.
May 27 03:22:47.947793 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
May 27 03:22:47.950521 systemd[1]: Reached target sysinit.target - System Initialization.
May 27 03:22:47.952680 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
May 27 03:22:47.954498 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
May 27 03:22:47.957480 systemd[1]: Started google-oslogin-cache.timer - NSS cache refresh timer.
May 27 03:22:47.960560 systemd[1]: Started logrotate.timer - Daily rotation of log files.
May 27 03:22:47.961714 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
May 27 03:22:47.964478 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
May 27 03:22:47.967471 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
May 27 03:22:47.967489 systemd[1]: Reached target paths.target - Path Units.
May 27 03:22:47.970472 systemd[1]: Reached target timers.target - Timer Units.
May 27 03:22:47.973152 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
May 27 03:22:47.976133 systemd[1]: Starting docker.socket - Docker Socket for the API...
May 27 03:22:47.980020 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
May 27 03:22:47.982583 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
May 27 03:22:47.985490 systemd[1]: Reached target ssh-access.target - SSH Access Available.
May 27 03:22:47.994472 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
May 27 03:22:47.996270 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
May 27 03:22:48.000010 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
May 27 03:22:48.002598 systemd[1]: Listening on docker.socket - Docker Socket for the API.
May 27 03:22:48.004836 systemd[1]: Reached target sockets.target - Socket Units.
May 27 03:22:48.007512 systemd[1]: Reached target basic.target - Basic System.
May 27 03:22:48.010529 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
May 27 03:22:48.010553 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
May 27 03:22:48.012403 systemd[1]: Starting chronyd.service - NTP client/server...
May 27 03:22:48.015590 systemd[1]: Starting containerd.service - containerd container runtime...
May 27 03:22:48.024037 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
May 27 03:22:48.028037 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
May 27 03:22:48.035922 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
May 27 03:22:48.039927 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
May 27 03:22:48.048590 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
May 27 03:22:48.050731 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
May 27 03:22:48.052535 systemd[1]: Starting google-oslogin-cache.service - NSS cache refresh...
May 27 03:22:48.055859 jq[1679]: false
May 27 03:22:48.056559 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
May 27 03:22:48.063586 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
May 27 03:22:48.066485 (chronyd)[1671]: chronyd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
May 27 03:22:48.075079 chronyd[1688]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +ASYNCDNS +NTS +SECHASH +IPV6 -DEBUG)
May 27 03:22:48.075596 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
May 27 03:22:48.079521 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
May 27 03:22:48.084480 google_oslogin_nss_cache[1681]: oslogin_cache_refresh[1681]: Refreshing passwd entry cache
May 27 03:22:48.084026 oslogin_cache_refresh[1681]: Refreshing passwd entry cache
May 27 03:22:48.086109 chronyd[1688]: Timezone right/UTC failed leap second check, ignoring
May 27 03:22:48.086232 chronyd[1688]: Loaded seccomp filter (level 2)
May 27 03:22:48.091600 systemd[1]: Starting systemd-logind.service - User Login Management...
May 27 03:22:48.095171 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
May 27 03:22:48.095540 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
May 27 03:22:48.097721 systemd[1]: Starting update-engine.service - Update Engine...
May 27 03:22:48.098711 extend-filesystems[1680]: Found loop4
May 27 03:22:48.098711 extend-filesystems[1680]: Found loop5
May 27 03:22:48.098711 extend-filesystems[1680]: Found loop6
May 27 03:22:48.098711 extend-filesystems[1680]: Found loop7
May 27 03:22:48.098711 extend-filesystems[1680]: Found sr0
May 27 03:22:48.098711 extend-filesystems[1680]: Found nvme0n1
May 27 03:22:48.098711 extend-filesystems[1680]: Found nvme0n1p1
May 27 03:22:48.098711 extend-filesystems[1680]: Found nvme0n1p2
May 27 03:22:48.098711 extend-filesystems[1680]: Found nvme0n1p3
May 27 03:22:48.098711 extend-filesystems[1680]: Found usr
May 27 03:22:48.098711 extend-filesystems[1680]: Found nvme0n1p4
May 27 03:22:48.098711 extend-filesystems[1680]: Found nvme0n1p6
May 27 03:22:48.098711 extend-filesystems[1680]: Found nvme0n1p7
May 27 03:22:48.098711 extend-filesystems[1680]: Found nvme0n1p9
May 27 03:22:48.098711 extend-filesystems[1680]: Checking size of /dev/nvme0n1p9
May 27 03:22:48.101232 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
May 27 03:22:48.114538 google_oslogin_nss_cache[1681]: oslogin_cache_refresh[1681]: Failure getting users, quitting
May 27 03:22:48.114538 google_oslogin_nss_cache[1681]: oslogin_cache_refresh[1681]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 27 03:22:48.114538 google_oslogin_nss_cache[1681]: oslogin_cache_refresh[1681]: Refreshing group entry cache
May 27 03:22:48.102310 oslogin_cache_refresh[1681]: Failure getting users, quitting
May 27 03:22:48.107259 systemd[1]: Started chronyd.service - NTP client/server.
May 27 03:22:48.102322 oslogin_cache_refresh[1681]: Produced empty passwd cache file, removing /etc/oslogin_passwd.cache.bak.
May 27 03:22:48.112246 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
May 27 03:22:48.102354 oslogin_cache_refresh[1681]: Refreshing group entry cache
May 27 03:22:48.112502 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
May 27 03:22:48.112632 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
May 27 03:22:48.126728 extend-filesystems[1680]: Old size kept for /dev/nvme0n1p9
May 27 03:22:48.127304 systemd[1]: motdgen.service: Deactivated successfully.
May 27 03:22:48.131484 google_oslogin_nss_cache[1681]: oslogin_cache_refresh[1681]: Failure getting groups, quitting
May 27 03:22:48.131484 google_oslogin_nss_cache[1681]: oslogin_cache_refresh[1681]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 27 03:22:48.128850 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
May 27 03:22:48.128478 oslogin_cache_refresh[1681]: Failure getting groups, quitting
May 27 03:22:48.129118 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
May 27 03:22:48.128487 oslogin_cache_refresh[1681]: Produced empty group cache file, removing /etc/oslogin_group.cache.bak.
May 27 03:22:48.129253 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
May 27 03:22:48.133254 systemd[1]: extend-filesystems.service: Deactivated successfully.
May 27 03:22:48.135166 systemd[1]: Finished extend-filesystems.service - Extend Filesystems.
May 27 03:22:48.137406 systemd[1]: google-oslogin-cache.service: Deactivated successfully.
May 27 03:22:48.137575 systemd[1]: Finished google-oslogin-cache.service - NSS cache refresh.
May 27 03:22:48.144319 jq[1698]: true
May 27 03:22:48.153339 (ntainerd)[1708]: containerd.service: Referenced but unset environment variable evaluates to an empty string: TORCX_IMAGEDIR, TORCX_UNPACKDIR
May 27 03:22:48.165811 jq[1713]: true
May 27 03:22:48.166404 tar[1702]: linux-amd64/LICENSE
May 27 03:22:48.167456 tar[1702]: linux-amd64/helm
May 27 03:22:48.174466 update_engine[1696]: I20250527 03:22:48.172338 1696 main.cc:92] Flatcar Update Engine starting
May 27 03:22:48.228017 dbus-daemon[1674]: [system] SELinux support is enabled
May 27 03:22:48.228737 systemd[1]: Started dbus.service - D-Bus System Message Bus.
May 27 03:22:48.234885 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
May 27 03:22:48.234912 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
May 27 03:22:48.238546 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
May 27 03:22:48.238564 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
May 27 03:22:48.244534 update_engine[1696]: I20250527 03:22:48.244498 1696 update_check_scheduler.cc:74] Next update check in 9m19s
May 27 03:22:48.247610 systemd[1]: Started update-engine.service - Update Engine.
May 27 03:22:48.264372 systemd[1]: Started locksmithd.service - Cluster reboot manager.
May 27 03:22:48.266574 systemd-logind[1695]: New seat seat0.
May 27 03:22:48.277927 systemd-logind[1695]: Watching system buttons on /dev/input/event0 (AT Translated Set 2 keyboard)
May 27 03:22:48.279934 bash[1742]: Updated "/home/core/.ssh/authorized_keys"
May 27 03:22:48.280417 systemd[1]: Started systemd-logind.service - User Login Management.
May 27 03:22:48.283610 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
May 27 03:22:48.303425 systemd[1]: sshkeys.service was skipped because no trigger condition checks were met.
May 27 03:22:48.359398 coreos-metadata[1673]: May 27 03:22:48.359 INFO Fetching http://168.63.129.16/?comp=versions: Attempt #1
May 27 03:22:48.361898 coreos-metadata[1673]: May 27 03:22:48.361 INFO Fetch successful
May 27 03:22:48.361898 coreos-metadata[1673]: May 27 03:22:48.361 INFO Fetching http://168.63.129.16/machine/?comp=goalstate: Attempt #1
May 27 03:22:48.365669 coreos-metadata[1673]: May 27 03:22:48.365 INFO Fetch successful
May 27 03:22:48.365972 coreos-metadata[1673]: May 27 03:22:48.365 INFO Fetching http://168.63.129.16/machine/825faabc-e329-4b72-b36e-9f20a3e0d2c5/b97c1420%2D4bc6%2D455b%2Db65f%2Db9310c30879a.%5Fci%2D4344.0.0%2Da%2Dc2c0d8ddb2?comp=config&type=sharedConfig&incarnation=1: Attempt #1
May 27 03:22:48.366959 coreos-metadata[1673]: May 27 03:22:48.366 INFO Fetch successful
May 27 03:22:48.367750 coreos-metadata[1673]: May 27 03:22:48.367 INFO Fetching http://169.254.169.254/metadata/instance/compute/vmSize?api-version=2017-08-01&format=text: Attempt #1
May 27 03:22:48.378826 coreos-metadata[1673]: May 27 03:22:48.378 INFO Fetch successful
May 27 03:22:48.428414 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent.
May 27 03:22:48.430473 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met.
May 27 03:22:48.462376 sshd_keygen[1706]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
May 27 03:22:48.471694 locksmithd[1746]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
May 27 03:22:48.487339 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
May 27 03:22:48.492481 systemd[1]: Starting issuegen.service - Generate /run/issue...
May 27 03:22:48.509045 systemd[1]: issuegen.service: Deactivated successfully.
May 27 03:22:48.509214 systemd[1]: Finished issuegen.service - Generate /run/issue.
May 27 03:22:48.512420 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
May 27 03:22:48.535244 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions.
May 27 03:22:48.539012 systemd[1]: Started getty@tty1.service - Getty on tty1.
May 27 03:22:48.542960 systemd[1]: Started serial-getty@ttyS0.service - Serial Getty on ttyS0.
May 27 03:22:48.545654 systemd[1]: Reached target getty.target - Login Prompts.
May 27 03:22:48.595176 containerd[1708]: time="2025-05-27T03:22:48Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
May 27 03:22:48.596039 containerd[1708]: time="2025-05-27T03:22:48.596013486Z" level=info msg="starting containerd" revision=06b99ca80cdbfbc6cc8bd567021738c9af2b36ce version=v2.0.4
May 27 03:22:48.607379 containerd[1708]: time="2025-05-27T03:22:48.607354162Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="7.568µs"
May 27 03:22:48.607458 containerd[1708]: time="2025-05-27T03:22:48.607447243Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
May 27 03:22:48.607498 containerd[1708]: time="2025-05-27T03:22:48.607490876Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
May 27 03:22:48.607636 containerd[1708]: time="2025-05-27T03:22:48.607628190Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
May 27 03:22:48.607669 containerd[1708]: time="2025-05-27T03:22:48.607663245Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
May 27 03:22:48.607710 containerd[1708]: time="2025-05-27T03:22:48.607702855Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 27 03:22:48.607769 containerd[1708]: time="2025-05-27T03:22:48.607761063Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
May 27 03:22:48.607797 containerd[1708]: time="2025-05-27T03:22:48.607790732Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 27 03:22:48.607975 containerd[1708]: time="2025-05-27T03:22:48.607965212Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
May 27 03:22:48.608006 containerd[1708]: time="2025-05-27T03:22:48.607999748Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 27 03:22:48.608036 containerd[1708]: time="2025-05-27T03:22:48.608029174Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
May 27 03:22:48.608069 containerd[1708]: time="2025-05-27T03:22:48.608063065Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
May 27 03:22:48.608139 containerd[1708]: time="2025-05-27T03:22:48.608133128Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
May 27 03:22:48.608280 containerd[1708]: time="2025-05-27T03:22:48.608272462Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 27 03:22:48.608323 containerd[1708]: time="2025-05-27T03:22:48.608314945Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
May 27 03:22:48.608355 containerd[1708]: time="2025-05-27T03:22:48.608348693Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
May 27 03:22:48.608413 containerd[1708]: time="2025-05-27T03:22:48.608405348Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
May 27 03:22:48.608652 containerd[1708]: time="2025-05-27T03:22:48.608642953Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
May 27 03:22:48.608735 containerd[1708]: time="2025-05-27T03:22:48.608728288Z" level=info msg="metadata content store policy set" policy=shared
May 27 03:22:48.618530 containerd[1708]: time="2025-05-27T03:22:48.618479320Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1
May 27 03:22:48.618530 containerd[1708]: time="2025-05-27T03:22:48.618522072Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1
May 27 03:22:48.618602 containerd[1708]: time="2025-05-27T03:22:48.618536186Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1
May 27 03:22:48.618602 containerd[1708]: time="2025-05-27T03:22:48.618545622Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1
May 27 03:22:48.618602 containerd[1708]: time="2025-05-27T03:22:48.618556658Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1
May 27 03:22:48.618602 containerd[1708]: time="2025-05-27T03:22:48.618570421Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1
May 27 03:22:48.618602 containerd[1708]: time="2025-05-27T03:22:48.618582098Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1
May 27 03:22:48.618602 containerd[1708]: time="2025-05-27T03:22:48.618592147Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1
May 27 03:22:48.618602 containerd[1708]: time="2025-05-27T03:22:48.618601531Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1
May 27 03:22:48.618710 containerd[1708]: time="2025-05-27T03:22:48.618610520Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1
May 27 03:22:48.618710 containerd[1708]: time="2025-05-27T03:22:48.618618482Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1
May 27 03:22:48.618710 containerd[1708]: time="2025-05-27T03:22:48.618634566Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2
May 27 03:22:48.618755 containerd[1708]: time="2025-05-27T03:22:48.618723627Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1
May 27 03:22:48.618755 containerd[1708]: time="2025-05-27T03:22:48.618738026Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1
May 27 03:22:48.618755 containerd[1708]: time="2025-05-27T03:22:48.618749905Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1
May 27 03:22:48.618803 containerd[1708]: time="2025-05-27T03:22:48.618759026Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1
May 27 03:22:48.618803 containerd[1708]: time="2025-05-27T03:22:48.618768161Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1
May 27 03:22:48.618803 containerd[1708]: time="2025-05-27T03:22:48.618776791Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1
May 27 03:22:48.618803 containerd[1708]: time="2025-05-27T03:22:48.618785514Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1
May 27 03:22:48.618803 containerd[1708]: time="2025-05-27T03:22:48.618793831Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1
May 27 03:22:48.618882 containerd[1708]: time="2025-05-27T03:22:48.618803061Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1
May 27 03:22:48.618882 containerd[1708]: time="2025-05-27T03:22:48.618811481Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1
May 27 03:22:48.618882 containerd[1708]: time="2025-05-27T03:22:48.618820052Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1
May 27 03:22:48.618882 containerd[1708]: time="2025-05-27T03:22:48.618866921Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\""
May 27 03:22:48.618882 containerd[1708]: time="2025-05-27T03:22:48.618877317Z" level=info msg="Start snapshots syncer"
May 27 03:22:48.618959 containerd[1708]: time="2025-05-27T03:22:48.618893378Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1
May 27 03:22:48.619502 containerd[1708]: time="2025-05-27T03:22:48.619116672Z" level=info msg="starting cri plugin"
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"/opt/cni/bin\",\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" May 27 03:22:48.619502 containerd[1708]: time="2025-05-27T03:22:48.619154475Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.619812952Z" level=info 
msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.619913204Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.619933669Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.619944446Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.619953291Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.619964509Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.619973155Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.619982324Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.620211406Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.620231269Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.620252660Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.620285762Z" level=info msg="loading plugin" 
id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.620302962Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 May 27 03:22:48.620523 containerd[1708]: time="2025-05-27T03:22:48.620314922Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 03:22:48.620787 containerd[1708]: time="2025-05-27T03:22:48.620324628Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 May 27 03:22:48.620787 containerd[1708]: time="2025-05-27T03:22:48.620335088Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 May 27 03:22:48.620787 containerd[1708]: time="2025-05-27T03:22:48.620347348Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 May 27 03:22:48.620787 containerd[1708]: time="2025-05-27T03:22:48.620384254Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 May 27 03:22:48.620787 containerd[1708]: time="2025-05-27T03:22:48.620396878Z" level=info msg="runtime interface created" May 27 03:22:48.620787 containerd[1708]: time="2025-05-27T03:22:48.620401971Z" level=info msg="created NRI interface" May 27 03:22:48.620787 containerd[1708]: time="2025-05-27T03:22:48.620412514Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 May 27 03:22:48.620787 containerd[1708]: time="2025-05-27T03:22:48.620423265Z" level=info msg="Connect containerd service" May 27 03:22:48.621526 containerd[1708]: time="2025-05-27T03:22:48.621513094Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" May 27 03:22:48.622855 
containerd[1708]: time="2025-05-27T03:22:48.622796656Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 03:22:48.777657 tar[1702]: linux-amd64/README.md May 27 03:22:48.789334 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. May 27 03:22:48.834972 containerd[1708]: time="2025-05-27T03:22:48.834905825Z" level=info msg="Start subscribing containerd event" May 27 03:22:48.834972 containerd[1708]: time="2025-05-27T03:22:48.834940103Z" level=info msg="Start recovering state" May 27 03:22:48.835102 containerd[1708]: time="2025-05-27T03:22:48.835095006Z" level=info msg="Start event monitor" May 27 03:22:48.835145 containerd[1708]: time="2025-05-27T03:22:48.835139158Z" level=info msg="Start cni network conf syncer for default" May 27 03:22:48.835207 containerd[1708]: time="2025-05-27T03:22:48.835171910Z" level=info msg="Start streaming server" May 27 03:22:48.835207 containerd[1708]: time="2025-05-27T03:22:48.835179831Z" level=info msg="Registered namespace \"k8s.io\" with NRI" May 27 03:22:48.835207 containerd[1708]: time="2025-05-27T03:22:48.835186114Z" level=info msg="runtime interface starting up..." May 27 03:22:48.835207 containerd[1708]: time="2025-05-27T03:22:48.835191654Z" level=info msg="starting plugins..." May 27 03:22:48.835344 containerd[1708]: time="2025-05-27T03:22:48.835200157Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" May 27 03:22:48.835695 containerd[1708]: time="2025-05-27T03:22:48.835677573Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc May 27 03:22:48.835767 containerd[1708]: time="2025-05-27T03:22:48.835760211Z" level=info msg=serving... address=/run/containerd/containerd.sock May 27 03:22:48.835997 systemd[1]: Started containerd.service - containerd container runtime. 
May 27 03:22:48.836238 containerd[1708]: time="2025-05-27T03:22:48.836143404Z" level=info msg="containerd successfully booted in 0.241259s" May 27 03:22:49.208605 systemd-networkd[1584]: eth0: Gained IPv6LL May 27 03:22:49.210835 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. May 27 03:22:49.213167 systemd[1]: Reached target network-online.target - Network is Online. May 27 03:22:49.216387 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:22:49.219586 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... May 27 03:22:49.225836 systemd[1]: Starting waagent.service - Microsoft Azure Linux Agent... May 27 03:22:49.249005 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. May 27 03:22:49.252601 systemd[1]: Started waagent.service - Microsoft Azure Linux Agent. May 27 03:22:49.780609 waagent[1817]: 2025-05-27T03:22:49.780566Z INFO Daemon Daemon Azure Linux Agent Version: 2.12.0.4 May 27 03:22:49.782628 waagent[1817]: 2025-05-27T03:22:49.782584Z INFO Daemon Daemon OS: flatcar 4344.0.0 May 27 03:22:49.783942 waagent[1817]: 2025-05-27T03:22:49.783910Z INFO Daemon Daemon Python: 3.11.12 May 27 03:22:49.786680 waagent[1817]: 2025-05-27T03:22:49.786634Z INFO Daemon Daemon Run daemon May 27 03:22:49.789599 waagent[1817]: 2025-05-27T03:22:49.789563Z INFO Daemon Daemon No RDMA handler exists for distro='Flatcar Container Linux by Kinvolk' version='4344.0.0' May 27 03:22:49.792081 waagent[1817]: 2025-05-27T03:22:49.792037Z INFO Daemon Daemon Using waagent for provisioning May 27 03:22:49.793791 waagent[1817]: 2025-05-27T03:22:49.793753Z INFO Daemon Daemon Activate resource disk May 27 03:22:49.795511 waagent[1817]: 2025-05-27T03:22:49.795466Z INFO Daemon Daemon Searching gen1 prefix 00000000-0001 or gen2 f8b3781a-1e82-4818-a1c3-63d806ec15bb May 27 03:22:49.799798 waagent[1817]: 2025-05-27T03:22:49.798957Z INFO Daemon Daemon Found device: None May 27 03:22:49.800839 
waagent[1817]: 2025-05-27T03:22:49.800532Z ERROR Daemon Daemon Failed to mount resource disk [ResourceDiskError] unable to detect disk topology May 27 03:22:49.803237 waagent[1817]: 2025-05-27T03:22:49.802731Z ERROR Daemon Daemon Event: name=WALinuxAgent, op=ActivateResourceDisk, message=[ResourceDiskError] unable to detect disk topology, duration=0 May 27 03:22:49.806214 waagent[1817]: 2025-05-27T03:22:49.806174Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 27 03:22:49.808388 waagent[1817]: 2025-05-27T03:22:49.807795Z INFO Daemon Daemon Running default provisioning handler May 27 03:22:49.814711 waagent[1817]: 2025-05-27T03:22:49.814677Z INFO Daemon Daemon Unable to get cloud-init enabled status from systemctl: Command '['systemctl', 'is-enabled', 'cloud-init-local.service']' returned non-zero exit status 4. May 27 03:22:49.819910 waagent[1817]: 2025-05-27T03:22:49.819870Z INFO Daemon Daemon Unable to get cloud-init enabled status from service: [Errno 2] No such file or directory: 'service' May 27 03:22:49.822971 waagent[1817]: 2025-05-27T03:22:49.822793Z INFO Daemon Daemon cloud-init is enabled: False May 27 03:22:49.824225 waagent[1817]: 2025-05-27T03:22:49.824191Z INFO Daemon Daemon Copying ovf-env.xml May 27 03:22:49.864302 waagent[1817]: 2025-05-27T03:22:49.862254Z INFO Daemon Daemon Successfully mounted dvd May 27 03:22:49.879027 systemd[1]: mnt-cdrom-secure.mount: Deactivated successfully. May 27 03:22:49.881164 waagent[1817]: 2025-05-27T03:22:49.881126Z INFO Daemon Daemon Detect protocol endpoint May 27 03:22:49.882520 waagent[1817]: 2025-05-27T03:22:49.882229Z INFO Daemon Daemon Clean protocol and wireserver endpoint May 27 03:22:49.884822 waagent[1817]: 2025-05-27T03:22:49.884538Z INFO Daemon Daemon WireServer endpoint is not found. 
Rerun dhcp handler May 27 03:22:49.886730 waagent[1817]: 2025-05-27T03:22:49.886559Z INFO Daemon Daemon Test for route to 168.63.129.16 May 27 03:22:49.889653 waagent[1817]: 2025-05-27T03:22:49.889614Z INFO Daemon Daemon Route to 168.63.129.16 exists May 27 03:22:49.892517 waagent[1817]: 2025-05-27T03:22:49.892478Z INFO Daemon Daemon Wire server endpoint:168.63.129.16 May 27 03:22:49.902605 waagent[1817]: 2025-05-27T03:22:49.902578Z INFO Daemon Daemon Fabric preferred wire protocol version:2015-04-05 May 27 03:22:49.904065 waagent[1817]: 2025-05-27T03:22:49.904046Z INFO Daemon Daemon Wire protocol version:2012-11-30 May 27 03:22:49.906580 waagent[1817]: 2025-05-27T03:22:49.906499Z INFO Daemon Daemon Server preferred version:2015-04-05 May 27 03:22:50.019329 waagent[1817]: 2025-05-27T03:22:50.019159Z INFO Daemon Daemon Initializing goal state during protocol detection May 27 03:22:50.021518 waagent[1817]: 2025-05-27T03:22:50.021478Z INFO Daemon Daemon Forcing an update of the goal state. May 27 03:22:50.025712 waagent[1817]: 2025-05-27T03:22:50.025671Z INFO Daemon Fetched a new incarnation for the WireServer goal state [incarnation 1] May 27 03:22:50.039425 waagent[1817]: 2025-05-27T03:22:50.039396Z INFO Daemon Daemon HostGAPlugin version: 1.0.8.164 May 27 03:22:50.040851 waagent[1817]: 2025-05-27T03:22:50.040822Z INFO Daemon May 27 03:22:50.042785 waagent[1817]: 2025-05-27T03:22:50.042499Z INFO Daemon Fetched new vmSettings [HostGAPlugin correlation ID: b27d7f28-9fa9-483b-b236-157214315d58 eTag: 12084408785590914154 source: Fabric] May 27 03:22:50.047874 waagent[1817]: 2025-05-27T03:22:50.046752Z INFO Daemon The vmSettings originated via Fabric; will ignore them. 
May 27 03:22:50.049319 waagent[1817]: 2025-05-27T03:22:50.048964Z INFO Daemon May 27 03:22:50.051546 waagent[1817]: 2025-05-27T03:22:50.051505Z INFO Daemon Fetching full goal state from the WireServer [incarnation 1] May 27 03:22:50.062213 waagent[1817]: 2025-05-27T03:22:50.062184Z INFO Daemon Daemon Downloading artifacts profile blob May 27 03:22:50.065541 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:22:50.069038 systemd[1]: Reached target multi-user.target - Multi-User System. May 27 03:22:50.070618 systemd[1]: Startup finished in 2.810s (kernel) + 1min 8.481s (initrd) + 4.903s (userspace) = 1min 16.195s. May 27 03:22:50.074665 (kubelet)[1832]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS May 27 03:22:50.134584 login[1781]: pam_lastlog(login:session): file /var/log/lastlog is locked/write, retrying May 27 03:22:50.135108 login[1780]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 27 03:22:50.141782 systemd[1]: Created slice user-500.slice - User Slice of UID 500. May 27 03:22:50.146622 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... May 27 03:22:50.162270 waagent[1817]: 2025-05-27T03:22:50.158936Z INFO Daemon Downloaded certificate {'thumbprint': '70052F7A2CFDF10023730DF259CE4E5474B1D050', 'hasPrivateKey': True} May 27 03:22:50.162270 waagent[1817]: 2025-05-27T03:22:50.159307Z INFO Daemon Fetch goal state completed May 27 03:22:50.161314 systemd-logind[1695]: New session 1 of user core. May 27 03:22:50.165280 waagent[1817]: 2025-05-27T03:22:50.164742Z INFO Daemon Daemon Starting provisioning May 27 03:22:50.165280 waagent[1817]: 2025-05-27T03:22:50.164860Z INFO Daemon Daemon Handle ovf-env.xml. 
May 27 03:22:50.165280 waagent[1817]: 2025-05-27T03:22:50.165024Z INFO Daemon Daemon Set hostname [ci-4344.0.0-a-c2c0d8ddb2] May 27 03:22:50.172897 waagent[1817]: 2025-05-27T03:22:50.170752Z INFO Daemon Daemon Publish hostname [ci-4344.0.0-a-c2c0d8ddb2] May 27 03:22:50.173263 waagent[1817]: 2025-05-27T03:22:50.173221Z INFO Daemon Daemon Examine /proc/net/route for primary interface May 27 03:22:50.174198 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. May 27 03:22:50.176564 waagent[1817]: 2025-05-27T03:22:50.176525Z INFO Daemon Daemon Primary interface is [eth0] May 27 03:22:50.178588 systemd[1]: Starting user@500.service - User Manager for UID 500... May 27 03:22:50.184840 systemd-networkd[1584]: eth0: found matching network '/usr/lib/systemd/network/zz-default.network', based on potentially unpredictable interface name. May 27 03:22:50.184846 systemd-networkd[1584]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. May 27 03:22:50.184897 systemd-networkd[1584]: eth0: DHCP lease lost May 27 03:22:50.188375 waagent[1817]: 2025-05-27T03:22:50.186255Z INFO Daemon Daemon Create user account if not exists May 27 03:22:50.188375 waagent[1817]: 2025-05-27T03:22:50.186427Z INFO Daemon Daemon User core already exists, skip useradd May 27 03:22:50.188375 waagent[1817]: 2025-05-27T03:22:50.187103Z INFO Daemon Daemon Configure sudoer May 27 03:22:50.189847 (systemd)[1846]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) May 27 03:22:50.192476 systemd-logind[1695]: New session c1 of user core. May 27 03:22:50.193303 waagent[1817]: 2025-05-27T03:22:50.193228Z INFO Daemon Daemon Configure sshd May 27 03:22:50.197492 waagent[1817]: 2025-05-27T03:22:50.197421Z INFO Daemon Daemon Added a configuration snippet disabling SSH password-based authentication methods. It also configures SSH client probing to keep connections alive. 
May 27 03:22:50.202995 waagent[1817]: 2025-05-27T03:22:50.202927Z INFO Daemon Daemon Deploy ssh public key. May 27 03:22:50.209569 systemd-networkd[1584]: eth0: DHCPv4 address 10.200.8.20/24, gateway 10.200.8.1 acquired from 168.63.129.16 May 27 03:22:50.337913 systemd[1846]: Queued start job for default target default.target. May 27 03:22:50.343076 systemd[1846]: Created slice app.slice - User Application Slice. May 27 03:22:50.343100 systemd[1846]: Reached target paths.target - Paths. May 27 03:22:50.343274 systemd[1846]: Reached target timers.target - Timers. May 27 03:22:50.344623 systemd[1846]: Starting dbus.socket - D-Bus User Message Bus Socket... May 27 03:22:50.352999 systemd[1846]: Listening on dbus.socket - D-Bus User Message Bus Socket. May 27 03:22:50.353047 systemd[1846]: Reached target sockets.target - Sockets. May 27 03:22:50.353076 systemd[1846]: Reached target basic.target - Basic System. May 27 03:22:50.353099 systemd[1846]: Reached target default.target - Main User Target. May 27 03:22:50.353118 systemd[1846]: Startup finished in 153ms. May 27 03:22:50.353186 systemd[1]: Started user@500.service - User Manager for UID 500. May 27 03:22:50.354622 systemd[1]: Started session-1.scope - Session 1 of User core. May 27 03:22:50.596900 kubelet[1832]: E0527 03:22:50.596826 1832 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" May 27 03:22:50.598171 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE May 27 03:22:50.598285 systemd[1]: kubelet.service: Failed with result 'exit-code'. May 27 03:22:50.598542 systemd[1]: kubelet.service: Consumed 832ms CPU time, 265.2M memory peak. 
May 27 03:22:51.136601 login[1781]: pam_unix(login:session): session opened for user core(uid=500) by LOGIN(uid=0) May 27 03:22:51.140477 systemd-logind[1695]: New session 2 of user core. May 27 03:22:51.146555 systemd[1]: Started session-2.scope - Session 2 of User core. May 27 03:22:51.252883 waagent[1817]: 2025-05-27T03:22:51.252822Z INFO Daemon Daemon Provisioning complete May 27 03:22:51.263341 waagent[1817]: 2025-05-27T03:22:51.263312Z INFO Daemon Daemon RDMA capabilities are not enabled, skipping May 27 03:22:51.264622 waagent[1817]: 2025-05-27T03:22:51.264597Z INFO Daemon Daemon End of log to /dev/console. The agent will now check for updates and then will process extensions. May 27 03:22:51.266462 waagent[1817]: 2025-05-27T03:22:51.266397Z INFO Daemon Daemon Installed Agent WALinuxAgent-2.12.0.4 is the most current agent May 27 03:22:51.352692 waagent[1886]: 2025-05-27T03:22:51.352639Z INFO ExtHandler ExtHandler Azure Linux Agent (Goal State Agent version 2.12.0.4) May 27 03:22:51.352896 waagent[1886]: 2025-05-27T03:22:51.352720Z INFO ExtHandler ExtHandler OS: flatcar 4344.0.0 May 27 03:22:51.352896 waagent[1886]: 2025-05-27T03:22:51.352755Z INFO ExtHandler ExtHandler Python: 3.11.12 May 27 03:22:51.352896 waagent[1886]: 2025-05-27T03:22:51.352790Z INFO ExtHandler ExtHandler CPU Arch: x86_64 May 27 03:22:51.361377 waagent[1886]: 2025-05-27T03:22:51.361338Z INFO ExtHandler ExtHandler Distro: flatcar-4344.0.0; OSUtil: FlatcarUtil; AgentService: waagent; Python: 3.11.12; Arch: x86_64; systemd: True; LISDrivers: Absent; logrotate: logrotate 3.22.0; May 27 03:22:51.361541 waagent[1886]: 2025-05-27T03:22:51.361521Z INFO ExtHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 27 03:22:51.361599 waagent[1886]: 2025-05-27T03:22:51.361567Z INFO ExtHandler ExtHandler Wire server endpoint:168.63.129.16 May 27 03:22:51.366242 waagent[1886]: 2025-05-27T03:22:51.366194Z INFO ExtHandler Fetched a new incarnation for the WireServer goal state 
[incarnation 1] May 27 03:22:51.373994 waagent[1886]: 2025-05-27T03:22:51.373965Z INFO ExtHandler ExtHandler HostGAPlugin version: 1.0.8.164 May 27 03:22:51.374294 waagent[1886]: 2025-05-27T03:22:51.374267Z INFO ExtHandler May 27 03:22:51.374341 waagent[1886]: 2025-05-27T03:22:51.374313Z INFO ExtHandler Fetched new vmSettings [HostGAPlugin correlation ID: 76a08e6d-01b9-4762-a265-f8d775ffc701 eTag: 12084408785590914154 source: Fabric] May 27 03:22:51.374529 waagent[1886]: 2025-05-27T03:22:51.374506Z INFO ExtHandler The vmSettings originated via Fabric; will ignore them. May 27 03:22:51.374813 waagent[1886]: 2025-05-27T03:22:51.374791Z INFO ExtHandler May 27 03:22:51.374844 waagent[1886]: 2025-05-27T03:22:51.374825Z INFO ExtHandler Fetching full goal state from the WireServer [incarnation 1] May 27 03:22:51.379215 waagent[1886]: 2025-05-27T03:22:51.379189Z INFO ExtHandler ExtHandler Downloading artifacts profile blob May 27 03:22:51.455516 waagent[1886]: 2025-05-27T03:22:51.455424Z INFO ExtHandler Downloaded certificate {'thumbprint': '70052F7A2CFDF10023730DF259CE4E5474B1D050', 'hasPrivateKey': True} May 27 03:22:51.455785 waagent[1886]: 2025-05-27T03:22:51.455759Z INFO ExtHandler Fetch goal state completed May 27 03:22:51.467683 waagent[1886]: 2025-05-27T03:22:51.467646Z INFO ExtHandler ExtHandler OpenSSL version: OpenSSL 3.3.3 11 Feb 2025 (Library: OpenSSL 3.3.3 11 Feb 2025) May 27 03:22:51.471314 waagent[1886]: 2025-05-27T03:22:51.471269Z INFO ExtHandler ExtHandler WALinuxAgent-2.12.0.4 running as process 1886 May 27 03:22:51.471405 waagent[1886]: 2025-05-27T03:22:51.471371Z INFO ExtHandler ExtHandler ******** AutoUpdate.Enabled is set to False, not processing the operation ******** May 27 03:22:51.471652 waagent[1886]: 2025-05-27T03:22:51.471634Z INFO ExtHandler ExtHandler ******** AutoUpdate.UpdateToLatestVersion is set to False, not processing the operation ******** May 27 03:22:51.472491 waagent[1886]: 2025-05-27T03:22:51.472430Z INFO ExtHandler ExtHandler 
[CGI] Cgroup monitoring is not supported on ['flatcar', '4344.0.0', '', 'Flatcar Container Linux by Kinvolk'] May 27 03:22:51.472724 waagent[1886]: 2025-05-27T03:22:51.472703Z INFO ExtHandler ExtHandler [CGI] Agent will reset the quotas in case distro: ['flatcar', '4344.0.0', '', 'Flatcar Container Linux by Kinvolk'] went from supported to unsupported May 27 03:22:51.472822 waagent[1886]: 2025-05-27T03:22:51.472805Z INFO ExtHandler ExtHandler [CGI] Agent cgroups enabled: False May 27 03:22:51.473148 waagent[1886]: 2025-05-27T03:22:51.473129Z INFO ExtHandler ExtHandler Starting setup for Persistent firewall rules May 27 03:22:51.478916 waagent[1886]: 2025-05-27T03:22:51.478896Z INFO ExtHandler ExtHandler Firewalld service not running/unavailable, trying to set up waagent-network-setup.service May 27 03:22:51.479018 waagent[1886]: 2025-05-27T03:22:51.479002Z INFO ExtHandler ExtHandler Successfully updated the Binary file /var/lib/waagent/waagent-network-setup.py for firewall setup May 27 03:22:51.483313 waagent[1886]: 2025-05-27T03:22:51.483118Z INFO ExtHandler ExtHandler Service: waagent-network-setup.service not enabled. Adding it now May 27 03:22:51.487457 systemd[1]: Reload requested from client PID 1901 ('systemctl') (unit waagent.service)... May 27 03:22:51.487479 systemd[1]: Reloading... May 27 03:22:51.562765 zram_generator::config[1941]: No configuration found. May 27 03:22:51.626739 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 03:22:51.706419 systemd[1]: Reloading finished in 218 ms. 
May 27 03:22:51.721031 waagent[1886]: 2025-05-27T03:22:51.720571Z INFO ExtHandler ExtHandler Successfully added and enabled the waagent-network-setup.service May 27 03:22:51.721031 waagent[1886]: 2025-05-27T03:22:51.720650Z INFO ExtHandler ExtHandler Persistent firewall rules setup successfully May 27 03:22:51.902519 kernel: hv_storvsc f8b3781a-1e82-4818-a1c3-63d806ec15bb: tag#63 cmd 0x4a status: scsi 0x0 srb 0x20 hv 0xc0000001 May 27 03:22:52.256407 waagent[1886]: 2025-05-27T03:22:52.256366Z INFO ExtHandler ExtHandler DROP rule is not available which implies no firewall rules are set yet. Environment thread will set it up. May 27 03:22:52.256608 waagent[1886]: 2025-05-27T03:22:52.256587Z INFO ExtHandler ExtHandler Checking if log collection is allowed at this time [False]. All three conditions must be met: 1. configuration enabled [True], 2. cgroups v1 enabled [False] OR cgroups v2 is in use and v2 resource limiting configuration enabled [False], 3. python supported: [True] May 27 03:22:52.257100 waagent[1886]: 2025-05-27T03:22:52.257071Z INFO ExtHandler ExtHandler Starting env monitor service. May 27 03:22:52.257566 waagent[1886]: 2025-05-27T03:22:52.257542Z INFO ExtHandler ExtHandler Start SendTelemetryHandler service. 
May 27 03:22:52.257601 waagent[1886]: 2025-05-27T03:22:52.257576Z INFO EnvHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 27 03:22:52.257644 waagent[1886]: 2025-05-27T03:22:52.257626Z INFO EnvHandler ExtHandler Wire server endpoint:168.63.129.16 May 27 03:22:52.257679 waagent[1886]: 2025-05-27T03:22:52.257660Z INFO MonitorHandler ExtHandler WireServer endpoint 168.63.129.16 read from file May 27 03:22:52.257941 waagent[1886]: 2025-05-27T03:22:52.257919Z INFO EnvHandler ExtHandler Configure routes May 27 03:22:52.258022 waagent[1886]: 2025-05-27T03:22:52.257984Z INFO SendTelemetryHandler ExtHandler Successfully started the SendTelemetryHandler thread May 27 03:22:52.258166 waagent[1886]: 2025-05-27T03:22:52.258147Z INFO ExtHandler ExtHandler Start Extension Telemetry service. May 27 03:22:52.258222 waagent[1886]: 2025-05-27T03:22:52.258191Z INFO MonitorHandler ExtHandler Wire server endpoint:168.63.129.16 May 27 03:22:52.258401 waagent[1886]: 2025-05-27T03:22:52.258384Z INFO MonitorHandler ExtHandler Monitor.NetworkConfigurationChanges is disabled. 
May 27 03:22:52.258688 waagent[1886]: 2025-05-27T03:22:52.258638Z INFO EnvHandler ExtHandler Gateway:None May 27 03:22:52.258882 waagent[1886]: 2025-05-27T03:22:52.258862Z INFO TelemetryEventsCollector ExtHandler Extension Telemetry pipeline enabled: True May 27 03:22:52.258943 waagent[1886]: 2025-05-27T03:22:52.258914Z INFO TelemetryEventsCollector ExtHandler Successfully started the TelemetryEventsCollector thread May 27 03:22:52.259090 waagent[1886]: 2025-05-27T03:22:52.259047Z INFO EnvHandler ExtHandler Routes:None May 27 03:22:52.260644 waagent[1886]: 2025-05-27T03:22:52.260615Z INFO MonitorHandler ExtHandler Routing table from /proc/net/route: May 27 03:22:52.260644 waagent[1886]: Iface Destination Gateway Flags RefCnt Use Metric Mask MTU Window IRTT May 27 03:22:52.260644 waagent[1886]: eth0 00000000 0108C80A 0003 0 0 1024 00000000 0 0 0 May 27 03:22:52.260644 waagent[1886]: eth0 0008C80A 00000000 0001 0 0 1024 00FFFFFF 0 0 0 May 27 03:22:52.260644 waagent[1886]: eth0 0108C80A 00000000 0005 0 0 1024 FFFFFFFF 0 0 0 May 27 03:22:52.260644 waagent[1886]: eth0 10813FA8 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 27 03:22:52.260644 waagent[1886]: eth0 FEA9FEA9 0108C80A 0007 0 0 1024 FFFFFFFF 0 0 0 May 27 03:22:52.260801 waagent[1886]: 2025-05-27T03:22:52.260732Z INFO ExtHandler ExtHandler Goal State Period: 6 sec. This indicates how often the agent checks for new goal states and reports status. May 27 03:22:52.268972 waagent[1886]: 2025-05-27T03:22:52.267239Z INFO ExtHandler ExtHandler May 27 03:22:52.268972 waagent[1886]: 2025-05-27T03:22:52.267292Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState started [incarnation_1 channel: WireServer source: Fabric activity: a8e9ab2f-720b-4889-93e7-3b1af5fe8e24 correlation cbcddb39-4d52-416c-8f28-2b85ca64bc39 created: 2025-05-27T03:21:20.776413Z] May 27 03:22:52.268972 waagent[1886]: 2025-05-27T03:22:52.267561Z INFO ExtHandler ExtHandler No extension handlers found, not processing anything. 
May 27 03:22:52.268972 waagent[1886]: 2025-05-27T03:22:52.267962Z INFO ExtHandler ExtHandler ProcessExtensionsGoalState completed [incarnation_1 0 ms]
May 27 03:22:52.273784 waagent[1886]: 2025-05-27T03:22:52.273749Z INFO MonitorHandler ExtHandler Network interfaces:
May 27 03:22:52.273784 waagent[1886]: Executing ['ip', '-a', '-o', 'link']:
May 27 03:22:52.273784 waagent[1886]: 1: lo: mtu 65536 qdisc noqueue state UNKNOWN mode DEFAULT group default qlen 1000\ link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
May 27 03:22:52.273784 waagent[1886]: 2: eth0: mtu 1500 qdisc mq state UP mode DEFAULT group default qlen 1000\ link/ether 00:22:48:9d:72:4e brd ff:ff:ff:ff:ff:ff\ alias Network Device
May 27 03:22:52.273784 waagent[1886]: Executing ['ip', '-4', '-a', '-o', 'address']:
May 27 03:22:52.273784 waagent[1886]: 1: lo inet 127.0.0.1/8 scope host lo\ valid_lft forever preferred_lft forever
May 27 03:22:52.273784 waagent[1886]: 2: eth0 inet 10.200.8.20/24 metric 1024 brd 10.200.8.255 scope global eth0\ valid_lft forever preferred_lft forever
May 27 03:22:52.273784 waagent[1886]: Executing ['ip', '-6', '-a', '-o', 'address']:
May 27 03:22:52.273784 waagent[1886]: 1: lo inet6 ::1/128 scope host noprefixroute \ valid_lft forever preferred_lft forever
May 27 03:22:52.273784 waagent[1886]: 2: eth0 inet6 fe80::222:48ff:fe9d:724e/64 scope link proto kernel_ll \ valid_lft forever preferred_lft forever
May 27 03:22:52.296455 waagent[1886]: 2025-05-27T03:22:52.295966Z WARNING ExtHandler ExtHandler Failed to get firewall packets: 'iptables -w -t security -L OUTPUT --zero OUTPUT -nxv' failed: 2 (iptables v1.8.11 (nf_tables): Illegal option `--numeric' with this command
May 27 03:22:52.296455 waagent[1886]: Try `iptables -h' or 'iptables --help' for more information.)
May 27 03:22:52.296455 waagent[1886]: 2025-05-27T03:22:52.296305Z INFO ExtHandler ExtHandler [HEARTBEAT] Agent WALinuxAgent-2.12.0.4 is running as the goal state agent [DEBUG HeartbeatCounter: 0;HeartbeatId: 91580665-360E-49B7-A0AC-3FD72EA621F4;DroppedPackets: -1;UpdateGSErrors: 0;AutoUpdate: 0;UpdateMode: SelfUpdate;]
May 27 03:22:52.306309 waagent[1886]: 2025-05-27T03:22:52.306268Z INFO EnvHandler ExtHandler Created firewall rules for the Azure Fabric:
May 27 03:22:52.306309 waagent[1886]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
May 27 03:22:52.306309 waagent[1886]: pkts bytes target prot opt in out source destination
May 27 03:22:52.306309 waagent[1886]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
May 27 03:22:52.306309 waagent[1886]: pkts bytes target prot opt in out source destination
May 27 03:22:52.306309 waagent[1886]: Chain OUTPUT (policy ACCEPT 2 packets, 289 bytes)
May 27 03:22:52.306309 waagent[1886]: pkts bytes target prot opt in out source destination
May 27 03:22:52.306309 waagent[1886]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
May 27 03:22:52.306309 waagent[1886]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
May 27 03:22:52.306309 waagent[1886]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
May 27 03:22:52.308546 waagent[1886]: 2025-05-27T03:22:52.308507Z INFO EnvHandler ExtHandler Current Firewall rules:
May 27 03:22:52.308546 waagent[1886]: Chain INPUT (policy ACCEPT 0 packets, 0 bytes)
May 27 03:22:52.308546 waagent[1886]: pkts bytes target prot opt in out source destination
May 27 03:22:52.308546 waagent[1886]: Chain FORWARD (policy ACCEPT 0 packets, 0 bytes)
May 27 03:22:52.308546 waagent[1886]: pkts bytes target prot opt in out source destination
May 27 03:22:52.308546 waagent[1886]: Chain OUTPUT (policy ACCEPT 2 packets, 289 bytes)
May 27 03:22:52.308546 waagent[1886]: pkts bytes target prot opt in out source destination
May 27 03:22:52.308546 waagent[1886]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 tcp dpt:53
May 27 03:22:52.308546 waagent[1886]: 0 0 ACCEPT tcp -- * * 0.0.0.0/0 168.63.129.16 owner UID match 0
May 27 03:22:52.308546 waagent[1886]: 0 0 DROP tcp -- * * 0.0.0.0/0 168.63.129.16 ctstate INVALID,NEW
May 27 03:23:00.608750 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1.
May 27 03:23:00.610413 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 03:23:01.121157 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 03:23:01.123783 (kubelet)[2037]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 03:23:01.153281 kubelet[2037]: E0527 03:23:01.153227 2037 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 03:23:01.155637 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 03:23:01.155751 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 03:23:01.156030 systemd[1]: kubelet.service: Consumed 117ms CPU time, 110.2M memory peak.
May 27 03:23:07.946295 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd.
May 27 03:23:07.947275 systemd[1]: Started sshd@0-10.200.8.20:22-10.200.16.10:51916.service - OpenSSH per-connection server daemon (10.200.16.10:51916).
May 27 03:23:08.609366 sshd[2045]: Accepted publickey for core from 10.200.16.10 port 51916 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU
May 27 03:23:08.610634 sshd-session[2045]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:23:08.615036 systemd-logind[1695]: New session 3 of user core.
May 27 03:23:08.620558 systemd[1]: Started session-3.scope - Session 3 of User core.
May 27 03:23:09.180348 systemd[1]: Started sshd@1-10.200.8.20:22-10.200.16.10:58612.service - OpenSSH per-connection server daemon (10.200.16.10:58612).
May 27 03:23:09.814801 sshd[2050]: Accepted publickey for core from 10.200.16.10 port 58612 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU
May 27 03:23:09.816008 sshd-session[2050]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:23:09.820022 systemd-logind[1695]: New session 4 of user core.
May 27 03:23:09.825550 systemd[1]: Started session-4.scope - Session 4 of User core.
May 27 03:23:10.262638 sshd[2052]: Connection closed by 10.200.16.10 port 58612
May 27 03:23:10.263114 sshd-session[2050]: pam_unix(sshd:session): session closed for user core
May 27 03:23:10.266073 systemd[1]: sshd@1-10.200.8.20:22-10.200.16.10:58612.service: Deactivated successfully.
May 27 03:23:10.267368 systemd[1]: session-4.scope: Deactivated successfully.
May 27 03:23:10.267931 systemd-logind[1695]: Session 4 logged out. Waiting for processes to exit.
May 27 03:23:10.268879 systemd-logind[1695]: Removed session 4.
May 27 03:23:10.373250 systemd[1]: Started sshd@2-10.200.8.20:22-10.200.16.10:58620.service - OpenSSH per-connection server daemon (10.200.16.10:58620).
May 27 03:23:11.008299 sshd[2058]: Accepted publickey for core from 10.200.16.10 port 58620 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU
May 27 03:23:11.009455 sshd-session[2058]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:23:11.013418 systemd-logind[1695]: New session 5 of user core.
May 27 03:23:11.020560 systemd[1]: Started session-5.scope - Session 5 of User core.
May 27 03:23:11.358418 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2.
May 27 03:23:11.359528 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 03:23:11.457805 sshd[2060]: Connection closed by 10.200.16.10 port 58620
May 27 03:23:11.458117 sshd-session[2058]: pam_unix(sshd:session): session closed for user core
May 27 03:23:11.460013 systemd[1]: sshd@2-10.200.8.20:22-10.200.16.10:58620.service: Deactivated successfully.
May 27 03:23:11.461066 systemd[1]: session-5.scope: Deactivated successfully.
May 27 03:23:11.462369 systemd-logind[1695]: Session 5 logged out. Waiting for processes to exit.
May 27 03:23:11.463023 systemd-logind[1695]: Removed session 5.
May 27 03:23:11.568285 systemd[1]: Started sshd@3-10.200.8.20:22-10.200.16.10:58624.service - OpenSSH per-connection server daemon (10.200.16.10:58624).
May 27 03:23:11.812187 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 03:23:11.817659 (kubelet)[2076]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 03:23:11.853124 kubelet[2076]: E0527 03:23:11.853094 2076 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 03:23:11.854607 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 03:23:11.854709 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 03:23:11.854984 systemd[1]: kubelet.service: Consumed 115ms CPU time, 108.5M memory peak.
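Editor's note: the repeated kubelet failures above are a crash loop on a missing /var/lib/kubelet/config.yaml; on a kubeadm-provisioned node that file is normally written by `kubeadm init` or `kubeadm join`, so these failures are expected until the node joins a cluster. A purely illustrative Python sketch (not part of kubelet) that recovers the offending path from such an error line:

```python
import re

# The err="..." field as it appears in the kubelet log entries above (abridged).
line = ('run.go:72] "command failed" err="failed to load kubelet config file, '
        'path: /var/lib/kubelet/config.yaml, error: ..."')

# Recover the config path from the "path: <value>," segment of the message.
match = re.search(r'path: (\S+?),', line)
print(match.group(1))  # /var/lib/kubelet/config.yaml
```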
May 27 03:23:11.867665 chronyd[1688]: Selected source PHC0
May 27 03:23:12.202410 sshd[2069]: Accepted publickey for core from 10.200.16.10 port 58624 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU
May 27 03:23:12.203574 sshd-session[2069]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:23:12.207489 systemd-logind[1695]: New session 6 of user core.
May 27 03:23:12.214576 systemd[1]: Started session-6.scope - Session 6 of User core.
May 27 03:23:12.646667 sshd[2083]: Connection closed by 10.200.16.10 port 58624
May 27 03:23:12.647103 sshd-session[2069]: pam_unix(sshd:session): session closed for user core
May 27 03:23:12.649611 systemd[1]: sshd@3-10.200.8.20:22-10.200.16.10:58624.service: Deactivated successfully.
May 27 03:23:12.650793 systemd[1]: session-6.scope: Deactivated successfully.
May 27 03:23:12.652081 systemd-logind[1695]: Session 6 logged out. Waiting for processes to exit.
May 27 03:23:12.652764 systemd-logind[1695]: Removed session 6.
May 27 03:23:12.761096 systemd[1]: Started sshd@4-10.200.8.20:22-10.200.16.10:58626.service - OpenSSH per-connection server daemon (10.200.16.10:58626).
May 27 03:23:13.395723 sshd[2089]: Accepted publickey for core from 10.200.16.10 port 58626 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU
May 27 03:23:13.396824 sshd-session[2089]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:23:13.400723 systemd-logind[1695]: New session 7 of user core.
May 27 03:23:13.409566 systemd[1]: Started session-7.scope - Session 7 of User core.
May 27 03:23:13.766039 sudo[2092]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1
May 27 03:23:13.766220 sudo[2092]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 03:23:13.778240 sudo[2092]: pam_unix(sudo:session): session closed for user root
May 27 03:23:13.879325 sshd[2091]: Connection closed by 10.200.16.10 port 58626
May 27 03:23:13.879933 sshd-session[2089]: pam_unix(sshd:session): session closed for user core
May 27 03:23:13.882812 systemd[1]: sshd@4-10.200.8.20:22-10.200.16.10:58626.service: Deactivated successfully.
May 27 03:23:13.883994 systemd[1]: session-7.scope: Deactivated successfully.
May 27 03:23:13.884986 systemd-logind[1695]: Session 7 logged out. Waiting for processes to exit.
May 27 03:23:13.885992 systemd-logind[1695]: Removed session 7.
May 27 03:23:13.995075 systemd[1]: Started sshd@5-10.200.8.20:22-10.200.16.10:58638.service - OpenSSH per-connection server daemon (10.200.16.10:58638).
May 27 03:23:14.632304 sshd[2098]: Accepted publickey for core from 10.200.16.10 port 58638 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU
May 27 03:23:14.633518 sshd-session[2098]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:23:14.637488 systemd-logind[1695]: New session 8 of user core.
May 27 03:23:14.643561 systemd[1]: Started session-8.scope - Session 8 of User core.
May 27 03:23:14.979235 sudo[2102]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules
May 27 03:23:14.979416 sudo[2102]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 03:23:14.984343 sudo[2102]: pam_unix(sudo:session): session closed for user root
May 27 03:23:14.987342 sudo[2101]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules
May 27 03:23:14.987570 sudo[2101]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 03:23:14.993464 systemd[1]: Starting audit-rules.service - Load Audit Rules...
May 27 03:23:15.019980 augenrules[2124]: No rules
May 27 03:23:15.020726 systemd[1]: audit-rules.service: Deactivated successfully.
May 27 03:23:15.020896 systemd[1]: Finished audit-rules.service - Load Audit Rules.
May 27 03:23:15.021546 sudo[2101]: pam_unix(sudo:session): session closed for user root
May 27 03:23:15.124787 sshd[2100]: Connection closed by 10.200.16.10 port 58638
May 27 03:23:15.125202 sshd-session[2098]: pam_unix(sshd:session): session closed for user core
May 27 03:23:15.128044 systemd[1]: sshd@5-10.200.8.20:22-10.200.16.10:58638.service: Deactivated successfully.
May 27 03:23:15.129147 systemd[1]: session-8.scope: Deactivated successfully.
May 27 03:23:15.129675 systemd-logind[1695]: Session 8 logged out. Waiting for processes to exit.
May 27 03:23:15.130559 systemd-logind[1695]: Removed session 8.
May 27 03:23:15.240076 systemd[1]: Started sshd@6-10.200.8.20:22-10.200.16.10:58652.service - OpenSSH per-connection server daemon (10.200.16.10:58652).
May 27 03:23:15.876088 sshd[2133]: Accepted publickey for core from 10.200.16.10 port 58652 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU
May 27 03:23:15.877194 sshd-session[2133]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
May 27 03:23:15.881133 systemd-logind[1695]: New session 9 of user core.
May 27 03:23:15.885569 systemd[1]: Started session-9.scope - Session 9 of User core.
May 27 03:23:16.222629 sudo[2136]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh
May 27 03:23:16.222805 sudo[2136]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500)
May 27 03:23:16.603720 systemd[1]: Starting docker.service - Docker Application Container Engine...
May 27 03:23:16.616691 (dockerd)[2154]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU
May 27 03:23:16.834118 dockerd[2154]: time="2025-05-27T03:23:16.834082149Z" level=info msg="Starting up"
May 27 03:23:16.835025 dockerd[2154]: time="2025-05-27T03:23:16.835004548Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider"
May 27 03:23:16.961201 dockerd[2154]: time="2025-05-27T03:23:16.961163096Z" level=info msg="Loading containers: start."
May 27 03:23:16.977464 kernel: Initializing XFRM netlink socket
May 27 03:23:17.157587 systemd-networkd[1584]: docker0: Link UP
May 27 03:23:17.169390 dockerd[2154]: time="2025-05-27T03:23:17.169356843Z" level=info msg="Loading containers: done."
May 27 03:23:17.186383 dockerd[2154]: time="2025-05-27T03:23:17.186360669Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2
May 27 03:23:17.186486 dockerd[2154]: time="2025-05-27T03:23:17.186409863Z" level=info msg="Docker daemon" commit=bbd0a17ccc67e48d4a69393287b7fcc4f0578683 containerd-snapshotter=false storage-driver=overlay2 version=28.0.1
May 27 03:23:17.186512 dockerd[2154]: time="2025-05-27T03:23:17.186490045Z" level=info msg="Initializing buildkit"
May 27 03:23:17.221048 dockerd[2154]: time="2025-05-27T03:23:17.220980647Z" level=info msg="Completed buildkit initialization"
May 27 03:23:17.225819 dockerd[2154]: time="2025-05-27T03:23:17.225782362Z" level=info msg="Daemon has completed initialization"
May 27 03:23:17.225959 dockerd[2154]: time="2025-05-27T03:23:17.225837942Z" level=info msg="API listen on /run/docker.sock"
May 27 03:23:17.226018 systemd[1]: Started docker.service - Docker Application Container Engine.
May 27 03:23:18.279513 containerd[1708]: time="2025-05-27T03:23:18.279487650Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\""
May 27 03:23:19.138179 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1886608474.mount: Deactivated successfully.
May 27 03:23:20.210367 containerd[1708]: time="2025-05-27T03:23:20.210338799Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:20.212299 containerd[1708]: time="2025-05-27T03:23:20.212270828Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.5: active requests=0, bytes read=28797819"
May 27 03:23:20.214699 containerd[1708]: time="2025-05-27T03:23:20.214667131Z" level=info msg="ImageCreate event name:\"sha256:495c5ce47cf7c8b58655ef50d0f0a9b43c5ae18492059dc9af4c9aacae82a5a4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:20.217674 containerd[1708]: time="2025-05-27T03:23:20.217632341Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:20.218214 containerd[1708]: time="2025-05-27T03:23:20.218119064Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.5\" with image id \"sha256:495c5ce47cf7c8b58655ef50d0f0a9b43c5ae18492059dc9af4c9aacae82a5a4\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.5\", repo digest \"registry.k8s.io/kube-apiserver@sha256:0bee1bf751fe06009678c0cde7545443ba3a8d2edf71cea4c69cbb5774b9bf47\", size \"28794611\" in 1.938601248s"
May 27 03:23:20.218214 containerd[1708]: time="2025-05-27T03:23:20.218144199Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.5\" returns image reference \"sha256:495c5ce47cf7c8b58655ef50d0f0a9b43c5ae18492059dc9af4c9aacae82a5a4\""
May 27 03:23:20.218573 containerd[1708]: time="2025-05-27T03:23:20.218558577Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\""
May 27 03:23:21.386305 containerd[1708]: time="2025-05-27T03:23:21.386281624Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:21.388240 containerd[1708]: time="2025-05-27T03:23:21.388211650Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.5: active requests=0, bytes read=24782531"
May 27 03:23:21.390450 containerd[1708]: time="2025-05-27T03:23:21.390407647Z" level=info msg="ImageCreate event name:\"sha256:85dcaf69f000132c34fa34452e0fd8444bdf360b593fe06b1103680f6ecc7e00\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:21.393463 containerd[1708]: time="2025-05-27T03:23:21.393410059Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:21.393970 containerd[1708]: time="2025-05-27T03:23:21.393895488Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.5\" with image id \"sha256:85dcaf69f000132c34fa34452e0fd8444bdf360b593fe06b1103680f6ecc7e00\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.5\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:79bcf2f5e614c336c02dcea9dfcdf485d7297aed6a21239a99c87f7164f9baca\", size \"26384363\" in 1.175182295s"
May 27 03:23:21.393970 containerd[1708]: time="2025-05-27T03:23:21.393916424Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.5\" returns image reference \"sha256:85dcaf69f000132c34fa34452e0fd8444bdf360b593fe06b1103680f6ecc7e00\""
May 27 03:23:21.394330 containerd[1708]: time="2025-05-27T03:23:21.394311522Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\""
May 27 03:23:21.858593 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3.
May 27 03:23:21.860199 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 03:23:22.422711 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
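Editor's note: the three "Scheduled restart job" entries above (counters 1, 2 and 3) show systemd restarting kubelet on a roughly ten-second cadence, consistent with a RestartSec of about 10s in the kubelet unit (an assumption; the unit file itself is not shown in this log). The spacing can be checked directly from the logged timestamps:

```python
from datetime import datetime

# Timestamps of the three "Scheduled restart job" entries logged above.
restarts = ["03:23:00.608750", "03:23:11.358418", "03:23:21.858593"]
times = [datetime.strptime(t, "%H:%M:%S.%f") for t in restarts]

# Seconds between consecutive restart attempts.
gaps = [(b - a).total_seconds() for a, b in zip(times, times[1:])]
print(gaps)  # each gap is a little over 10 seconds
```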
May 27 03:23:22.431662 (kubelet)[2424]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS
May 27 03:23:22.486968 kubelet[2424]: E0527 03:23:22.486923 2424 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory"
May 27 03:23:22.489756 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE
May 27 03:23:22.489867 systemd[1]: kubelet.service: Failed with result 'exit-code'.
May 27 03:23:22.490313 systemd[1]: kubelet.service: Consumed 124ms CPU time, 110.8M memory peak.
May 27 03:23:22.698833 containerd[1708]: time="2025-05-27T03:23:22.698777067Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:22.700720 containerd[1708]: time="2025-05-27T03:23:22.700687810Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.5: active requests=0, bytes read=19176071"
May 27 03:23:22.703741 containerd[1708]: time="2025-05-27T03:23:22.703708579Z" level=info msg="ImageCreate event name:\"sha256:2729fb488407e634105c62238a45a599db1692680526e20844060a7a8197b45a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:22.709151 containerd[1708]: time="2025-05-27T03:23:22.709118926Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:22.709762 containerd[1708]: time="2025-05-27T03:23:22.709553983Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.5\" with image id \"sha256:2729fb488407e634105c62238a45a599db1692680526e20844060a7a8197b45a\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.5\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f0f39d8b9808c407cacb3a46a5a9ce4d4a4a7cf3b674ba4bd221f5bc90051d2a\", size \"20777921\" in 1.315214869s"
May 27 03:23:22.709762 containerd[1708]: time="2025-05-27T03:23:22.709580337Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.5\" returns image reference \"sha256:2729fb488407e634105c62238a45a599db1692680526e20844060a7a8197b45a\""
May 27 03:23:22.710084 containerd[1708]: time="2025-05-27T03:23:22.710053104Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\""
May 27 03:23:23.535216 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1090243405.mount: Deactivated successfully.
May 27 03:23:23.847638 containerd[1708]: time="2025-05-27T03:23:23.847584899Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:23.849558 containerd[1708]: time="2025-05-27T03:23:23.849529328Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.5: active requests=0, bytes read=30892880"
May 27 03:23:23.851722 containerd[1708]: time="2025-05-27T03:23:23.851686009Z" level=info msg="ImageCreate event name:\"sha256:f532b7356fac4d7c4e4f6763bb5a15a43e3bb740c9fb26c85b906a4d971f2363\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:23.854445 containerd[1708]: time="2025-05-27T03:23:23.854388375Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:23.854802 containerd[1708]: time="2025-05-27T03:23:23.854781904Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.5\" with image id \"sha256:f532b7356fac4d7c4e4f6763bb5a15a43e3bb740c9fb26c85b906a4d971f2363\", repo tag \"registry.k8s.io/kube-proxy:v1.32.5\", repo digest \"registry.k8s.io/kube-proxy@sha256:9dc6553459c3319525ba4090a780db1a133d5dee68c08e07f9b9d6ba83b42a0b\", size \"30891891\" in 1.144705788s"
May 27 03:23:23.854840 containerd[1708]: time="2025-05-27T03:23:23.854807810Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.5\" returns image reference \"sha256:f532b7356fac4d7c4e4f6763bb5a15a43e3bb740c9fb26c85b906a4d971f2363\""
May 27 03:23:23.855323 containerd[1708]: time="2025-05-27T03:23:23.855300291Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\""
May 27 03:23:24.446733 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1176769749.mount: Deactivated successfully.
May 27 03:23:25.167101 containerd[1708]: time="2025-05-27T03:23:25.167073576Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:25.169073 containerd[1708]: time="2025-05-27T03:23:25.169046557Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=18565249"
May 27 03:23:25.171373 containerd[1708]: time="2025-05-27T03:23:25.171341801Z" level=info msg="ImageCreate event name:\"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:25.174287 containerd[1708]: time="2025-05-27T03:23:25.174244036Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:25.174848 containerd[1708]: time="2025-05-27T03:23:25.174749043Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"18562039\" in 1.319428728s"
May 27 03:23:25.174848 containerd[1708]: time="2025-05-27T03:23:25.174770452Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:c69fa2e9cbf5f42dc48af631e956d3f95724c13f91596bc567591790e5e36db6\""
May 27 03:23:25.175253 containerd[1708]: time="2025-05-27T03:23:25.175232345Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\""
May 27 03:23:25.664917 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount523992678.mount: Deactivated successfully.
May 27 03:23:25.678299 containerd[1708]: time="2025-05-27T03:23:25.678275733Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 27 03:23:25.680317 containerd[1708]: time="2025-05-27T03:23:25.680287897Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=321146"
May 27 03:23:25.682686 containerd[1708]: time="2025-05-27T03:23:25.682652646Z" level=info msg="ImageCreate event name:\"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 27 03:23:25.685625 containerd[1708]: time="2025-05-27T03:23:25.685593581Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
May 27 03:23:25.686267 containerd[1708]: time="2025-05-27T03:23:25.685994878Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"320368\" in 510.741004ms"
May 27 03:23:25.686267 containerd[1708]: time="2025-05-27T03:23:25.686015753Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:873ed75102791e5b0b8a7fcd41606c92fcec98d56d05ead4ac5131650004c136\""
May 27 03:23:25.686340 containerd[1708]: time="2025-05-27T03:23:25.686314140Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\""
May 27 03:23:26.207499 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2504158519.mount: Deactivated successfully.
May 27 03:23:27.685072 containerd[1708]: time="2025-05-27T03:23:27.685042727Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:27.686956 containerd[1708]: time="2025-05-27T03:23:27.686929609Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=57551368"
May 27 03:23:27.689273 containerd[1708]: time="2025-05-27T03:23:27.689214358Z" level=info msg="ImageCreate event name:\"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:27.692582 containerd[1708]: time="2025-05-27T03:23:27.692542941Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:23:27.693240 containerd[1708]: time="2025-05-27T03:23:27.693154780Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"57680541\" in 2.006823632s"
May 27 03:23:27.693240 containerd[1708]: time="2025-05-27T03:23:27.693176812Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:a9e7e6b294baf1695fccb862d956c5d3ad8510e1e4ca1535f35dc09f247abbfc\""
May 27 03:23:30.061573 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 03:23:30.061722 systemd[1]: kubelet.service: Consumed 124ms CPU time, 110.8M memory peak.
May 27 03:23:30.063635 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 03:23:30.083576 systemd[1]: Reload requested from client PID 2576 ('systemctl') (unit session-9.scope)...
May 27 03:23:30.083594 systemd[1]: Reloading...
May 27 03:23:30.166513 zram_generator::config[2621]: No configuration found.
May 27 03:23:30.240947 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly.
May 27 03:23:30.328488 systemd[1]: Reloading finished in 244 ms.
May 27 03:23:30.432776 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM
May 27 03:23:30.432838 systemd[1]: kubelet.service: Failed with result 'signal'.
May 27 03:23:30.433017 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 03:23:30.433049 systemd[1]: kubelet.service: Consumed 66ms CPU time, 77.9M memory peak.
May 27 03:23:30.434878 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent...
May 27 03:23:31.051245 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent.
May 27 03:23:31.056662 (kubelet)[2688]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 03:23:31.087203 kubelet[2688]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 03:23:31.087203 kubelet[2688]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 03:23:31.087203 kubelet[2688]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 03:23:31.087203 kubelet[2688]: I0527 03:23:31.086840 2688 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 03:23:31.255421 kubelet[2688]: I0527 03:23:31.255396 2688 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 27 03:23:31.255421 kubelet[2688]: I0527 03:23:31.255412 2688 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 03:23:31.255643 kubelet[2688]: I0527 03:23:31.255630 2688 server.go:954] "Client rotation is on, will bootstrap in background" May 27 03:23:31.280498 kubelet[2688]: E0527 03:23:31.280477 2688 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.200.8.20:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="UnhandledError" May 27 03:23:31.281301 kubelet[2688]: I0527 
03:23:31.281223 2688 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 03:23:31.289593 kubelet[2688]: I0527 03:23:31.289576 2688 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 03:23:31.291201 kubelet[2688]: I0527 03:23:31.291188 2688 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" May 27 03:23:31.292119 kubelet[2688]: I0527 03:23:31.292086 2688 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 03:23:31.292262 kubelet[2688]: I0527 03:23:31.292115 2688 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-a-c2c0d8ddb2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"Topolo
gyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 03:23:31.292364 kubelet[2688]: I0527 03:23:31.292269 2688 topology_manager.go:138] "Creating topology manager with none policy" May 27 03:23:31.292364 kubelet[2688]: I0527 03:23:31.292279 2688 container_manager_linux.go:304] "Creating device plugin manager" May 27 03:23:31.292404 kubelet[2688]: I0527 03:23:31.292368 2688 state_mem.go:36] "Initialized new in-memory state store" May 27 03:23:31.295624 kubelet[2688]: I0527 03:23:31.295613 2688 kubelet.go:446] "Attempting to sync node with API server" May 27 03:23:31.295679 kubelet[2688]: I0527 03:23:31.295634 2688 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 03:23:31.295679 kubelet[2688]: I0527 03:23:31.295653 2688 kubelet.go:352] "Adding apiserver pod source" May 27 03:23:31.295679 kubelet[2688]: I0527 03:23:31.295661 2688 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 03:23:31.299849 kubelet[2688]: W0527 03:23:31.299634 2688 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.200.8.20:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.200.8.20:6443: connect: connection refused May 27 03:23:31.299849 kubelet[2688]: E0527 03:23:31.299678 2688 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://10.200.8.20:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="UnhandledError" May 27 03:23:31.299849 kubelet[2688]: W0527 03:23:31.299730 
2688 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.200.8.20:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-c2c0d8ddb2&limit=500&resourceVersion=0": dial tcp 10.200.8.20:6443: connect: connection refused May 27 03:23:31.299849 kubelet[2688]: E0527 03:23:31.299755 2688 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.200.8.20:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4344.0.0-a-c2c0d8ddb2&limit=500&resourceVersion=0\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="UnhandledError" May 27 03:23:31.300449 kubelet[2688]: I0527 03:23:31.300010 2688 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 03:23:31.300449 kubelet[2688]: I0527 03:23:31.300332 2688 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 03:23:31.301302 kubelet[2688]: W0527 03:23:31.300969 2688 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
May 27 03:23:31.303752 kubelet[2688]: I0527 03:23:31.303471 2688 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 03:23:31.303752 kubelet[2688]: I0527 03:23:31.303499 2688 server.go:1287] "Started kubelet" May 27 03:23:31.305598 kubelet[2688]: I0527 03:23:31.305583 2688 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 03:23:31.310013 kubelet[2688]: I0527 03:23:31.309985 2688 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 27 03:23:31.310279 kubelet[2688]: I0527 03:23:31.310271 2688 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 03:23:31.310518 kubelet[2688]: E0527 03:23:31.310507 2688 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" May 27 03:23:31.310735 kubelet[2688]: I0527 03:23:31.310721 2688 server.go:479] "Adding debug handlers to kubelet server" May 27 03:23:31.312180 kubelet[2688]: I0527 03:23:31.312128 2688 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 03:23:31.312319 kubelet[2688]: I0527 03:23:31.312307 2688 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 03:23:31.312679 kubelet[2688]: I0527 03:23:31.312661 2688 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 03:23:31.312724 kubelet[2688]: I0527 03:23:31.312696 2688 reconciler.go:26] "Reconciler: start to sync state" May 27 03:23:31.316710 kubelet[2688]: I0527 03:23:31.316689 2688 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 03:23:31.316995 kubelet[2688]: E0527 03:23:31.316970 2688 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://10.200.8.20:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-c2c0d8ddb2?timeout=10s\": dial tcp 10.200.8.20:6443: connect: connection refused" interval="200ms" May 27 03:23:31.317174 kubelet[2688]: I0527 03:23:31.317158 2688 factory.go:221] Registration of the systemd container factory successfully May 27 03:23:31.317225 kubelet[2688]: I0527 03:23:31.317212 2688 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 03:23:31.318767 kubelet[2688]: E0527 03:23:31.317708 2688 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.200.8.20:6443/api/v1/namespaces/default/events\": dial tcp 10.200.8.20:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4344.0.0-a-c2c0d8ddb2.18434454c9cd0908 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4344.0.0-a-c2c0d8ddb2,UID:ci-4344.0.0-a-c2c0d8ddb2,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4344.0.0-a-c2c0d8ddb2,},FirstTimestamp:2025-05-27 03:23:31.303483656 +0000 UTC m=+0.243888597,LastTimestamp:2025-05-27 03:23:31.303483656 +0000 UTC m=+0.243888597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4344.0.0-a-c2c0d8ddb2,}" May 27 03:23:31.318945 kubelet[2688]: E0527 03:23:31.318929 2688 kubelet.go:1555] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 03:23:31.319312 kubelet[2688]: W0527 03:23:31.319269 2688 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://10.200.8.20:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.200.8.20:6443: connect: connection refused May 27 03:23:31.319391 kubelet[2688]: E0527 03:23:31.319314 2688 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.200.8.20:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="UnhandledError" May 27 03:23:31.319487 kubelet[2688]: I0527 03:23:31.319474 2688 factory.go:221] Registration of the containerd container factory successfully May 27 03:23:31.327547 kubelet[2688]: I0527 03:23:31.327517 2688 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 03:23:31.328259 kubelet[2688]: I0527 03:23:31.328234 2688 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 27 03:23:31.328259 kubelet[2688]: I0527 03:23:31.328253 2688 status_manager.go:227] "Starting to sync pod status with apiserver" May 27 03:23:31.328336 kubelet[2688]: I0527 03:23:31.328267 2688 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 27 03:23:31.328336 kubelet[2688]: I0527 03:23:31.328274 2688 kubelet.go:2382] "Starting kubelet main sync loop" May 27 03:23:31.328336 kubelet[2688]: E0527 03:23:31.328306 2688 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 03:23:31.334807 kubelet[2688]: W0527 03:23:31.334761 2688 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.200.8.20:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.200.8.20:6443: connect: connection refused May 27 03:23:31.334807 kubelet[2688]: E0527 03:23:31.334785 2688 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.200.8.20:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.200.8.20:6443: connect: connection refused" logger="UnhandledError" May 27 03:23:31.335848 kubelet[2688]: I0527 03:23:31.335833 2688 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 03:23:31.335848 kubelet[2688]: I0527 03:23:31.335844 2688 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 03:23:31.335928 kubelet[2688]: I0527 03:23:31.335857 2688 state_mem.go:36] "Initialized new in-memory state store" May 27 03:23:31.339975 kubelet[2688]: I0527 03:23:31.339848 2688 policy_none.go:49] "None policy: Start" May 27 03:23:31.339975 kubelet[2688]: I0527 03:23:31.339859 2688 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 03:23:31.339975 kubelet[2688]: I0527 03:23:31.339869 2688 state_mem.go:35] "Initializing new in-memory state store" May 27 03:23:31.345855 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. 
May 27 03:23:31.354827 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. May 27 03:23:31.369990 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. May 27 03:23:31.371066 kubelet[2688]: I0527 03:23:31.371052 2688 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 03:23:31.371409 kubelet[2688]: I0527 03:23:31.371173 2688 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 03:23:31.371409 kubelet[2688]: I0527 03:23:31.371180 2688 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 03:23:31.371409 kubelet[2688]: I0527 03:23:31.371314 2688 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 03:23:31.372071 kubelet[2688]: E0527 03:23:31.372047 2688 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 27 03:23:31.372128 kubelet[2688]: E0527 03:23:31.372083 2688 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" May 27 03:23:31.435898 systemd[1]: Created slice kubepods-burstable-podc7e844df2ffbfae7b77a378fc24740c0.slice - libcontainer container kubepods-burstable-podc7e844df2ffbfae7b77a378fc24740c0.slice. May 27 03:23:31.452278 kubelet[2688]: E0527 03:23:31.452262 2688 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.455198 systemd[1]: Created slice kubepods-burstable-pod6495289340d8c8c960b4c8d25060be23.slice - libcontainer container kubepods-burstable-pod6495289340d8c8c960b4c8d25060be23.slice. 
May 27 03:23:31.456609 kubelet[2688]: E0527 03:23:31.456596 2688 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.458050 systemd[1]: Created slice kubepods-burstable-pod19466318606f17977744b8a096a2c882.slice - libcontainer container kubepods-burstable-pod19466318606f17977744b8a096a2c882.slice. May 27 03:23:31.459278 kubelet[2688]: E0527 03:23:31.459255 2688 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.472172 kubelet[2688]: I0527 03:23:31.472159 2688 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.472450 kubelet[2688]: E0527 03:23:31.472426 2688 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.8.20:6443/api/v1/nodes\": dial tcp 10.200.8.20:6443: connect: connection refused" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.518093 kubelet[2688]: E0527 03:23:31.518051 2688 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.20:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-c2c0d8ddb2?timeout=10s\": dial tcp 10.200.8.20:6443: connect: connection refused" interval="400ms" May 27 03:23:31.614551 kubelet[2688]: I0527 03:23:31.614473 2688 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6495289340d8c8c960b4c8d25060be23-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"6495289340d8c8c960b4c8d25060be23\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.614551 kubelet[2688]: I0527 03:23:31.614539 2688 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.614646 kubelet[2688]: I0527 03:23:31.614558 2688 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6495289340d8c8c960b4c8d25060be23-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"6495289340d8c8c960b4c8d25060be23\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.614646 kubelet[2688]: I0527 03:23:31.614574 2688 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6495289340d8c8c960b4c8d25060be23-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"6495289340d8c8c960b4c8d25060be23\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.614646 kubelet[2688]: I0527 03:23:31.614597 2688 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.614646 kubelet[2688]: I0527 03:23:31.614613 2688 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " 
pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.614646 kubelet[2688]: I0527 03:23:31.614630 2688 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.614759 kubelet[2688]: I0527 03:23:31.614647 2688 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c7e844df2ffbfae7b77a378fc24740c0-kubeconfig\") pod \"kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"c7e844df2ffbfae7b77a378fc24740c0\") " pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.614759 kubelet[2688]: I0527 03:23:31.614665 2688 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.673746 kubelet[2688]: I0527 03:23:31.673708 2688 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.674095 kubelet[2688]: E0527 03:23:31.674056 2688 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.8.20:6443/api/v1/nodes\": dial tcp 10.200.8.20:6443: connect: connection refused" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:31.753865 containerd[1708]: time="2025-05-27T03:23:31.753813653Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2,Uid:c7e844df2ffbfae7b77a378fc24740c0,Namespace:kube-system,Attempt:0,}" May 27 03:23:31.757242 containerd[1708]: time="2025-05-27T03:23:31.757215401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2,Uid:6495289340d8c8c960b4c8d25060be23,Namespace:kube-system,Attempt:0,}" May 27 03:23:31.759784 containerd[1708]: time="2025-05-27T03:23:31.759762938Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2,Uid:19466318606f17977744b8a096a2c882,Namespace:kube-system,Attempt:0,}" May 27 03:23:31.824564 containerd[1708]: time="2025-05-27T03:23:31.824425094Z" level=info msg="connecting to shim 28c0f7081538d39918da6a3abeabbc2038b353111a8591c4f974d8bb57d89d6d" address="unix:///run/containerd/s/afd038e47f47377b3fb6077754331c5852802c6403df2d674d1523e780ef9c8f" namespace=k8s.io protocol=ttrpc version=3 May 27 03:23:31.826480 containerd[1708]: time="2025-05-27T03:23:31.826454431Z" level=info msg="connecting to shim 289a6d01190216781ffc2b49e21ecc7a01d5972286d4d6352b8472ecad8f3fb4" address="unix:///run/containerd/s/f2f20eaa210364adc39bb6b74bf5a8fe48c4498195574f1f19c99b1eadf811f2" namespace=k8s.io protocol=ttrpc version=3 May 27 03:23:31.845805 containerd[1708]: time="2025-05-27T03:23:31.845778033Z" level=info msg="connecting to shim 3c9147759c9be5d11f446cd5a813073a024c5ba2e0946d7dd5b9deb630369bbb" address="unix:///run/containerd/s/fe20da6da5948568b05bf4964ccb2840dc3f837f23e25fe3a79c0812dc4c79dd" namespace=k8s.io protocol=ttrpc version=3 May 27 03:23:31.857571 systemd[1]: Started cri-containerd-28c0f7081538d39918da6a3abeabbc2038b353111a8591c4f974d8bb57d89d6d.scope - libcontainer container 28c0f7081538d39918da6a3abeabbc2038b353111a8591c4f974d8bb57d89d6d. 
May 27 03:23:31.864987 systemd[1]: Started cri-containerd-289a6d01190216781ffc2b49e21ecc7a01d5972286d4d6352b8472ecad8f3fb4.scope - libcontainer container 289a6d01190216781ffc2b49e21ecc7a01d5972286d4d6352b8472ecad8f3fb4. May 27 03:23:31.880546 systemd[1]: Started cri-containerd-3c9147759c9be5d11f446cd5a813073a024c5ba2e0946d7dd5b9deb630369bbb.scope - libcontainer container 3c9147759c9be5d11f446cd5a813073a024c5ba2e0946d7dd5b9deb630369bbb. May 27 03:23:31.919098 kubelet[2688]: E0527 03:23:31.919031 2688 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.200.8.20:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4344.0.0-a-c2c0d8ddb2?timeout=10s\": dial tcp 10.200.8.20:6443: connect: connection refused" interval="800ms" May 27 03:23:31.930374 containerd[1708]: time="2025-05-27T03:23:31.930351903Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2,Uid:19466318606f17977744b8a096a2c882,Namespace:kube-system,Attempt:0,} returns sandbox id \"289a6d01190216781ffc2b49e21ecc7a01d5972286d4d6352b8472ecad8f3fb4\"" May 27 03:23:31.936889 containerd[1708]: time="2025-05-27T03:23:31.936747288Z" level=info msg="CreateContainer within sandbox \"289a6d01190216781ffc2b49e21ecc7a01d5972286d4d6352b8472ecad8f3fb4\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" May 27 03:23:31.940139 containerd[1708]: time="2025-05-27T03:23:31.940117432Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2,Uid:6495289340d8c8c960b4c8d25060be23,Namespace:kube-system,Attempt:0,} returns sandbox id \"3c9147759c9be5d11f446cd5a813073a024c5ba2e0946d7dd5b9deb630369bbb\"" May 27 03:23:31.941772 containerd[1708]: time="2025-05-27T03:23:31.941752734Z" level=info msg="CreateContainer within sandbox \"3c9147759c9be5d11f446cd5a813073a024c5ba2e0946d7dd5b9deb630369bbb\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" May 
27 03:23:31.949153 containerd[1708]: time="2025-05-27T03:23:31.949130723Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2,Uid:c7e844df2ffbfae7b77a378fc24740c0,Namespace:kube-system,Attempt:0,} returns sandbox id \"28c0f7081538d39918da6a3abeabbc2038b353111a8591c4f974d8bb57d89d6d\"" May 27 03:23:31.952074 containerd[1708]: time="2025-05-27T03:23:31.952039364Z" level=info msg="CreateContainer within sandbox \"28c0f7081538d39918da6a3abeabbc2038b353111a8591c4f974d8bb57d89d6d\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" May 27 03:23:31.958128 containerd[1708]: time="2025-05-27T03:23:31.958107860Z" level=info msg="Container e663447332f7376320b79a2c1dba47eef8da3529625911b3882cc6de740458fb: CDI devices from CRI Config.CDIDevices: []" May 27 03:23:31.974448 containerd[1708]: time="2025-05-27T03:23:31.974294991Z" level=info msg="CreateContainer within sandbox \"289a6d01190216781ffc2b49e21ecc7a01d5972286d4d6352b8472ecad8f3fb4\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"e663447332f7376320b79a2c1dba47eef8da3529625911b3882cc6de740458fb\"" May 27 03:23:31.975684 containerd[1708]: time="2025-05-27T03:23:31.975661489Z" level=info msg="StartContainer for \"e663447332f7376320b79a2c1dba47eef8da3529625911b3882cc6de740458fb\"" May 27 03:23:31.977323 containerd[1708]: time="2025-05-27T03:23:31.977293123Z" level=info msg="connecting to shim e663447332f7376320b79a2c1dba47eef8da3529625911b3882cc6de740458fb" address="unix:///run/containerd/s/f2f20eaa210364adc39bb6b74bf5a8fe48c4498195574f1f19c99b1eadf811f2" protocol=ttrpc version=3 May 27 03:23:31.979462 containerd[1708]: time="2025-05-27T03:23:31.978760515Z" level=info msg="Container 76d179f66870c0908a9b1de8e1d0869e6803363bab010f10b14100b662109521: CDI devices from CRI Config.CDIDevices: []" May 27 03:23:31.986108 containerd[1708]: time="2025-05-27T03:23:31.986072395Z" level=info msg="Container 
094a298e219febaabea0d417274477890005835e7039ec4611a8ef4e73b0f414: CDI devices from CRI Config.CDIDevices: []" May 27 03:23:31.992550 systemd[1]: Started cri-containerd-e663447332f7376320b79a2c1dba47eef8da3529625911b3882cc6de740458fb.scope - libcontainer container e663447332f7376320b79a2c1dba47eef8da3529625911b3882cc6de740458fb. May 27 03:23:31.998729 containerd[1708]: time="2025-05-27T03:23:31.998700357Z" level=info msg="CreateContainer within sandbox \"3c9147759c9be5d11f446cd5a813073a024c5ba2e0946d7dd5b9deb630369bbb\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"76d179f66870c0908a9b1de8e1d0869e6803363bab010f10b14100b662109521\"" May 27 03:23:31.998994 containerd[1708]: time="2025-05-27T03:23:31.998964797Z" level=info msg="StartContainer for \"76d179f66870c0908a9b1de8e1d0869e6803363bab010f10b14100b662109521\"" May 27 03:23:32.000313 containerd[1708]: time="2025-05-27T03:23:32.000268693Z" level=info msg="connecting to shim 76d179f66870c0908a9b1de8e1d0869e6803363bab010f10b14100b662109521" address="unix:///run/containerd/s/fe20da6da5948568b05bf4964ccb2840dc3f837f23e25fe3a79c0812dc4c79dd" protocol=ttrpc version=3 May 27 03:23:32.004323 containerd[1708]: time="2025-05-27T03:23:32.004298288Z" level=info msg="CreateContainer within sandbox \"28c0f7081538d39918da6a3abeabbc2038b353111a8591c4f974d8bb57d89d6d\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"094a298e219febaabea0d417274477890005835e7039ec4611a8ef4e73b0f414\"" May 27 03:23:32.004870 containerd[1708]: time="2025-05-27T03:23:32.004837857Z" level=info msg="StartContainer for \"094a298e219febaabea0d417274477890005835e7039ec4611a8ef4e73b0f414\"" May 27 03:23:32.006186 containerd[1708]: time="2025-05-27T03:23:32.006109850Z" level=info msg="connecting to shim 094a298e219febaabea0d417274477890005835e7039ec4611a8ef4e73b0f414" address="unix:///run/containerd/s/afd038e47f47377b3fb6077754331c5852802c6403df2d674d1523e780ef9c8f" protocol=ttrpc version=3 May 27 
03:23:32.018791 systemd[1]: Started cri-containerd-76d179f66870c0908a9b1de8e1d0869e6803363bab010f10b14100b662109521.scope - libcontainer container 76d179f66870c0908a9b1de8e1d0869e6803363bab010f10b14100b662109521. May 27 03:23:32.022974 systemd[1]: Started cri-containerd-094a298e219febaabea0d417274477890005835e7039ec4611a8ef4e73b0f414.scope - libcontainer container 094a298e219febaabea0d417274477890005835e7039ec4611a8ef4e73b0f414. May 27 03:23:32.050788 containerd[1708]: time="2025-05-27T03:23:32.050768161Z" level=info msg="StartContainer for \"e663447332f7376320b79a2c1dba47eef8da3529625911b3882cc6de740458fb\" returns successfully" May 27 03:23:32.075994 kubelet[2688]: I0527 03:23:32.075578 2688 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:32.076072 kubelet[2688]: E0527 03:23:32.075902 2688 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.200.8.20:6443/api/v1/nodes\": dial tcp 10.200.8.20:6443: connect: connection refused" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:32.076973 containerd[1708]: time="2025-05-27T03:23:32.076956586Z" level=info msg="StartContainer for \"76d179f66870c0908a9b1de8e1d0869e6803363bab010f10b14100b662109521\" returns successfully" May 27 03:23:32.113827 containerd[1708]: time="2025-05-27T03:23:32.113803388Z" level=info msg="StartContainer for \"094a298e219febaabea0d417274477890005835e7039ec4611a8ef4e73b0f414\" returns successfully" May 27 03:23:32.340206 kubelet[2688]: E0527 03:23:32.340151 2688 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:32.342964 kubelet[2688]: E0527 03:23:32.342791 2688 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:32.345038 
kubelet[2688]: E0527 03:23:32.345021 2688 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:32.878588 kubelet[2688]: I0527 03:23:32.878575 2688 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.349785 kubelet[2688]: E0527 03:23:33.349638 2688 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.350824 kubelet[2688]: E0527 03:23:33.350430 2688 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.590675 kubelet[2688]: E0527 03:23:33.590649 2688 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4344.0.0-a-c2c0d8ddb2\" not found" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.653373 kubelet[2688]: I0527 03:23:33.653353 2688 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.711744 kubelet[2688]: I0527 03:23:33.711721 2688 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.805717 kubelet[2688]: E0527 03:23:33.805697 2688 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.805717 kubelet[2688]: I0527 03:23:33.805718 2688 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.807419 kubelet[2688]: E0527 03:23:33.807281 
2688 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.807419 kubelet[2688]: I0527 03:23:33.807299 2688 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.808647 kubelet[2688]: E0527 03:23:33.808627 2688 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:33.877285 update_engine[1696]: I20250527 03:23:33.877246 1696 update_attempter.cc:509] Updating boot flags... May 27 03:23:34.299730 kubelet[2688]: I0527 03:23:34.299709 2688 apiserver.go:52] "Watching apiserver" May 27 03:23:34.313531 kubelet[2688]: I0527 03:23:34.313514 2688 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 03:23:34.348886 kubelet[2688]: I0527 03:23:34.348871 2688 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:34.350149 kubelet[2688]: E0527 03:23:34.350129 2688 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:35.497872 systemd[1]: Reload requested from client PID 2977 ('systemctl') (unit session-9.scope)... May 27 03:23:35.498118 systemd[1]: Reloading... May 27 03:23:35.534460 kernel: hv_balloon: Max. dynamic memory size: 8192 MB May 27 03:23:35.558461 zram_generator::config[3018]: No configuration found. 
May 27 03:23:35.640409 systemd[1]: /usr/lib/systemd/system/docker.socket:6: ListenStream= references a path below legacy directory /var/run/, updating /var/run/docker.sock → /run/docker.sock; please update the unit file accordingly. May 27 03:23:35.728738 systemd[1]: Reloading finished in 230 ms. May 27 03:23:35.754041 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:23:35.774148 systemd[1]: kubelet.service: Deactivated successfully. May 27 03:23:35.774343 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:23:35.774398 systemd[1]: kubelet.service: Consumed 497ms CPU time, 129M memory peak. May 27 03:23:35.775609 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... May 27 03:23:36.156162 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. May 27 03:23:36.163693 (kubelet)[3089]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS May 27 03:23:36.202600 kubelet[3089]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. May 27 03:23:36.202786 kubelet[3089]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. May 27 03:23:36.202786 kubelet[3089]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
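The three deprecation warnings above all point at the kubelet's --config file. A tiny lookup of where each flag goes (plain Python; the first two field names are an assumption based on the KubeletConfiguration v1beta1 API, and --pod-infra-container-image has no config-file field per the notice itself):

```python
# Deprecated kubelet flags seen in the log above, mapped to their replacements.
# Field names are assumptions from the KubeletConfiguration v1beta1 API.
replacements = {
    "--container-runtime-endpoint": "KubeletConfiguration.containerRuntimeEndpoint",
    "--volume-plugin-dir":          "KubeletConfiguration.volumePluginDir",
    # No config-file field: the deprecation notice says the image garbage
    # collector will get the sandbox image from the CRI runtime instead.
    "--pod-infra-container-image":  "(CRI runtime config)",
}
for flag, replacement in replacements.items():
    print(f"{flag:<32} -> {replacement}")
```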
May 27 03:23:36.202786 kubelet[3089]: I0527 03:23:36.202759 3089 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" May 27 03:23:36.208377 kubelet[3089]: I0527 03:23:36.208313 3089 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" May 27 03:23:36.208377 kubelet[3089]: I0527 03:23:36.208368 3089 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" May 27 03:23:36.208701 kubelet[3089]: I0527 03:23:36.208682 3089 server.go:954] "Client rotation is on, will bootstrap in background" May 27 03:23:36.209413 kubelet[3089]: I0527 03:23:36.209399 3089 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". May 27 03:23:36.211732 kubelet[3089]: I0527 03:23:36.211315 3089 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" May 27 03:23:36.213865 kubelet[3089]: I0527 03:23:36.213851 3089 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" May 27 03:23:36.215579 kubelet[3089]: I0527 03:23:36.215565 3089 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" May 27 03:23:36.215704 kubelet[3089]: I0527 03:23:36.215682 3089 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] May 27 03:23:36.216062 kubelet[3089]: I0527 03:23:36.215703 3089 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4344.0.0-a-c2c0d8ddb2","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} May 27 03:23:36.216355 kubelet[3089]: I0527 03:23:36.216067 3089 topology_manager.go:138] "Creating topology manager 
with none policy" May 27 03:23:36.216355 kubelet[3089]: I0527 03:23:36.216076 3089 container_manager_linux.go:304] "Creating device plugin manager" May 27 03:23:36.216355 kubelet[3089]: I0527 03:23:36.216195 3089 state_mem.go:36] "Initialized new in-memory state store" May 27 03:23:36.216355 kubelet[3089]: I0527 03:23:36.216334 3089 kubelet.go:446] "Attempting to sync node with API server" May 27 03:23:36.216355 kubelet[3089]: I0527 03:23:36.216349 3089 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" May 27 03:23:36.216474 kubelet[3089]: I0527 03:23:36.216367 3089 kubelet.go:352] "Adding apiserver pod source" May 27 03:23:36.216474 kubelet[3089]: I0527 03:23:36.216376 3089 apiserver.go:42] "Waiting for node sync before watching apiserver pods" May 27 03:23:36.219275 kubelet[3089]: I0527 03:23:36.219261 3089 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.0.4" apiVersion="v1" May 27 03:23:36.222456 kubelet[3089]: I0527 03:23:36.221105 3089 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" May 27 03:23:36.224272 kubelet[3089]: I0527 03:23:36.224259 3089 watchdog_linux.go:99] "Systemd watchdog is not enabled" May 27 03:23:36.226540 kubelet[3089]: I0527 03:23:36.226527 3089 server.go:1287] "Started kubelet" May 27 03:23:36.227524 kubelet[3089]: I0527 03:23:36.227502 3089 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 May 27 03:23:36.229039 kubelet[3089]: I0527 03:23:36.229022 3089 server.go:479] "Adding debug handlers to kubelet server" May 27 03:23:36.229974 kubelet[3089]: I0527 03:23:36.227859 3089 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 May 27 03:23:36.230180 kubelet[3089]: I0527 03:23:36.230164 3089 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" May 27 03:23:36.230332 kubelet[3089]: I0527 
03:23:36.230323 3089 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" May 27 03:23:36.231526 kubelet[3089]: I0527 03:23:36.227824 3089 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" May 27 03:23:36.236467 kubelet[3089]: I0527 03:23:36.236448 3089 volume_manager.go:297] "Starting Kubelet Volume Manager" May 27 03:23:36.236700 kubelet[3089]: E0527 03:23:36.236690 3089 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4344.0.0-a-c2c0d8ddb2\" not found" May 27 03:23:36.239011 kubelet[3089]: I0527 03:23:36.238996 3089 desired_state_of_world_populator.go:150] "Desired state populator starts to run" May 27 03:23:36.239089 kubelet[3089]: I0527 03:23:36.239083 3089 reconciler.go:26] "Reconciler: start to sync state" May 27 03:23:36.241765 kubelet[3089]: I0527 03:23:36.241650 3089 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" May 27 03:23:36.242060 kubelet[3089]: I0527 03:23:36.242051 3089 factory.go:221] Registration of the systemd container factory successfully May 27 03:23:36.242954 kubelet[3089]: I0527 03:23:36.242171 3089 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory May 27 03:23:36.243523 kubelet[3089]: I0527 03:23:36.242669 3089 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" May 27 03:23:36.243607 kubelet[3089]: I0527 03:23:36.243600 3089 status_manager.go:227] "Starting to sync pod status with apiserver" May 27 03:23:36.243651 kubelet[3089]: I0527 03:23:36.243646 3089 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
May 27 03:23:36.243682 kubelet[3089]: I0527 03:23:36.243679 3089 kubelet.go:2382] "Starting kubelet main sync loop" May 27 03:23:36.243751 kubelet[3089]: E0527 03:23:36.243742 3089 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" May 27 03:23:36.249014 kubelet[3089]: E0527 03:23:36.249000 3089 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" May 27 03:23:36.249321 kubelet[3089]: I0527 03:23:36.249310 3089 factory.go:221] Registration of the containerd container factory successfully May 27 03:23:36.284097 kubelet[3089]: I0527 03:23:36.284087 3089 cpu_manager.go:221] "Starting CPU manager" policy="none" May 27 03:23:36.284160 kubelet[3089]: I0527 03:23:36.284114 3089 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" May 27 03:23:36.284160 kubelet[3089]: I0527 03:23:36.284127 3089 state_mem.go:36] "Initialized new in-memory state store" May 27 03:23:36.284245 kubelet[3089]: I0527 03:23:36.284232 3089 state_mem.go:88] "Updated default CPUSet" cpuSet="" May 27 03:23:36.284268 kubelet[3089]: I0527 03:23:36.284243 3089 state_mem.go:96] "Updated CPUSet assignments" assignments={} May 27 03:23:36.284268 kubelet[3089]: I0527 03:23:36.284257 3089 policy_none.go:49] "None policy: Start" May 27 03:23:36.284268 kubelet[3089]: I0527 03:23:36.284266 3089 memory_manager.go:186] "Starting memorymanager" policy="None" May 27 03:23:36.284325 kubelet[3089]: I0527 03:23:36.284275 3089 state_mem.go:35] "Initializing new in-memory state store" May 27 03:23:36.284362 kubelet[3089]: I0527 03:23:36.284355 3089 state_mem.go:75] "Updated machine memory state" May 27 03:23:36.287449 kubelet[3089]: I0527 03:23:36.287418 3089 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" May 27 03:23:36.287544 
kubelet[3089]: I0527 03:23:36.287533 3089 eviction_manager.go:189] "Eviction manager: starting control loop" May 27 03:23:36.287571 kubelet[3089]: I0527 03:23:36.287545 3089 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" May 27 03:23:36.287945 kubelet[3089]: I0527 03:23:36.287884 3089 plugin_manager.go:118] "Starting Kubelet Plugin Manager" May 27 03:23:36.288832 kubelet[3089]: E0527 03:23:36.288818 3089 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" May 27 03:23:36.344703 kubelet[3089]: I0527 03:23:36.344686 3089 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.344778 kubelet[3089]: I0527 03:23:36.344771 3089 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.344954 kubelet[3089]: I0527 03:23:36.344928 3089 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.350404 kubelet[3089]: W0527 03:23:36.350391 3089 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 03:23:36.357428 kubelet[3089]: W0527 03:23:36.357354 3089 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 03:23:36.357591 kubelet[3089]: W0527 03:23:36.357399 3089 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 03:23:36.390285 kubelet[3089]: I0527 03:23:36.390274 3089 kubelet_node_status.go:75] "Attempting to register node" node="ci-4344.0.0-a-c2c0d8ddb2" 
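The single-line NodeConfig blob logged by container_manager_linux.go earlier is hard to audit by eye. A throwaway parse of just its HardEvictionThresholds array (plain Python; the JSON fragment is copied verbatim from the log line):

```python
import json

# HardEvictionThresholds excerpt, copied verbatim from the NodeConfig entry above.
thresholds = json.loads("""
[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},
 {"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},
 {"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},
 {"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},
 {"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}]
""")

for t in thresholds:
    v = t["Value"]
    # Each threshold carries either an absolute Quantity or a Percentage.
    limit = v["Quantity"] if v["Quantity"] else f'{v["Percentage"]:.0%}'
    print(f'{t["Signal"]:<20} < {limit}')
```

So this node evicts pods when free memory drops below 100Mi or the listed filesystem signals fall under their percentage floors.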
May 27 03:23:36.399765 kubelet[3089]: I0527 03:23:36.399750 3089 kubelet_node_status.go:124] "Node was previously registered" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.399834 kubelet[3089]: I0527 03:23:36.399793 3089 kubelet_node_status.go:78] "Successfully registered node" node="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.439511 kubelet[3089]: I0527 03:23:36.439417 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c7e844df2ffbfae7b77a378fc24740c0-kubeconfig\") pod \"kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"c7e844df2ffbfae7b77a378fc24740c0\") " pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.539575 kubelet[3089]: I0527 03:23:36.539528 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/6495289340d8c8c960b4c8d25060be23-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"6495289340d8c8c960b4c8d25060be23\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.539722 kubelet[3089]: I0527 03:23:36.539658 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-k8s-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.539722 kubelet[3089]: I0527 03:23:36.539676 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " 
pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.539820 kubelet[3089]: I0527 03:23:36.539811 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/6495289340d8c8c960b4c8d25060be23-ca-certs\") pod \"kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"6495289340d8c8c960b4c8d25060be23\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.540004 kubelet[3089]: I0527 03:23:36.539924 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/6495289340d8c8c960b4c8d25060be23-k8s-certs\") pod \"kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"6495289340d8c8c960b4c8d25060be23\") " pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.540004 kubelet[3089]: I0527 03:23:36.539946 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-ca-certs\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.540004 kubelet[3089]: I0527 03:23:36.539961 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-flexvolume-dir\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:36.540004 kubelet[3089]: I0527 03:23:36.539976 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/19466318606f17977744b8a096a2c882-kubeconfig\") pod \"kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2\" (UID: \"19466318606f17977744b8a096a2c882\") " pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:37.217220 kubelet[3089]: I0527 03:23:37.217201 3089 apiserver.go:52] "Watching apiserver" May 27 03:23:37.239379 kubelet[3089]: I0527 03:23:37.239357 3089 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" May 27 03:23:37.271535 kubelet[3089]: I0527 03:23:37.271489 3089 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:37.275773 kubelet[3089]: W0527 03:23:37.275575 3089 warnings.go:70] metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots] May 27 03:23:37.275773 kubelet[3089]: E0527 03:23:37.275632 3089 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2\" already exists" pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" May 27 03:23:37.296558 kubelet[3089]: I0527 03:23:37.296429 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4344.0.0-a-c2c0d8ddb2" podStartSLOduration=1.296417176 podStartE2EDuration="1.296417176s" podCreationTimestamp="2025-05-27 03:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:23:37.288307761 +0000 UTC m=+1.121185199" watchObservedRunningTime="2025-05-27 03:23:37.296417176 +0000 UTC m=+1.129294607" May 27 03:23:37.297078 kubelet[3089]: I0527 03:23:37.296710 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4344.0.0-a-c2c0d8ddb2" podStartSLOduration=1.29670096 podStartE2EDuration="1.29670096s" 
podCreationTimestamp="2025-05-27 03:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:23:37.295780676 +0000 UTC m=+1.128658114" watchObservedRunningTime="2025-05-27 03:23:37.29670096 +0000 UTC m=+1.129578399" May 27 03:23:41.804789 kubelet[3089]: I0527 03:23:41.804742 3089 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" May 27 03:23:41.805213 kubelet[3089]: I0527 03:23:41.805180 3089 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" May 27 03:23:41.805248 containerd[1708]: time="2025-05-27T03:23:41.804998287Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." May 27 03:23:42.642714 kubelet[3089]: I0527 03:23:42.642410 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4344.0.0-a-c2c0d8ddb2" podStartSLOduration=6.642395095 podStartE2EDuration="6.642395095s" podCreationTimestamp="2025-05-27 03:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:23:37.311495736 +0000 UTC m=+1.144373175" watchObservedRunningTime="2025-05-27 03:23:42.642395095 +0000 UTC m=+6.475272533" May 27 03:23:42.651498 systemd[1]: Created slice kubepods-besteffort-pod20f56e25_2cb1_49f2_9bf0_5daa88bcad05.slice - libcontainer container kubepods-besteffort-pod20f56e25_2cb1_49f2_9bf0_5daa88bcad05.slice. 
May 27 03:23:42.680807 kubelet[3089]: I0527 03:23:42.680767 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/20f56e25-2cb1-49f2-9bf0-5daa88bcad05-kube-proxy\") pod \"kube-proxy-xpwsv\" (UID: \"20f56e25-2cb1-49f2-9bf0-5daa88bcad05\") " pod="kube-system/kube-proxy-xpwsv" May 27 03:23:42.680807 kubelet[3089]: I0527 03:23:42.680798 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/20f56e25-2cb1-49f2-9bf0-5daa88bcad05-lib-modules\") pod \"kube-proxy-xpwsv\" (UID: \"20f56e25-2cb1-49f2-9bf0-5daa88bcad05\") " pod="kube-system/kube-proxy-xpwsv" May 27 03:23:42.680934 kubelet[3089]: I0527 03:23:42.680813 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/20f56e25-2cb1-49f2-9bf0-5daa88bcad05-xtables-lock\") pod \"kube-proxy-xpwsv\" (UID: \"20f56e25-2cb1-49f2-9bf0-5daa88bcad05\") " pod="kube-system/kube-proxy-xpwsv" May 27 03:23:42.680934 kubelet[3089]: I0527 03:23:42.680829 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cspcp\" (UniqueName: \"kubernetes.io/projected/20f56e25-2cb1-49f2-9bf0-5daa88bcad05-kube-api-access-cspcp\") pod \"kube-proxy-xpwsv\" (UID: \"20f56e25-2cb1-49f2-9bf0-5daa88bcad05\") " pod="kube-system/kube-proxy-xpwsv" May 27 03:23:42.869273 systemd[1]: Created slice kubepods-besteffort-poddda99589_818c_4056_a949_c57e2b8d50f4.slice - libcontainer container kubepods-besteffort-poddda99589_818c_4056_a949_c57e2b8d50f4.slice. 
May 27 03:23:42.882783 kubelet[3089]: I0527 03:23:42.882759 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/dda99589-818c-4056-a949-c57e2b8d50f4-var-lib-calico\") pod \"tigera-operator-844669ff44-wpx57\" (UID: \"dda99589-818c-4056-a949-c57e2b8d50f4\") " pod="tigera-operator/tigera-operator-844669ff44-wpx57" May 27 03:23:42.882975 kubelet[3089]: I0527 03:23:42.882788 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ft6gk\" (UniqueName: \"kubernetes.io/projected/dda99589-818c-4056-a949-c57e2b8d50f4-kube-api-access-ft6gk\") pod \"tigera-operator-844669ff44-wpx57\" (UID: \"dda99589-818c-4056-a949-c57e2b8d50f4\") " pod="tigera-operator/tigera-operator-844669ff44-wpx57" May 27 03:23:42.960713 containerd[1708]: time="2025-05-27T03:23:42.960635501Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xpwsv,Uid:20f56e25-2cb1-49f2-9bf0-5daa88bcad05,Namespace:kube-system,Attempt:0,}" May 27 03:23:42.991611 containerd[1708]: time="2025-05-27T03:23:42.991584779Z" level=info msg="connecting to shim ab064e81aa89687588429b467c288e43c8daca4370014564adc616cd41ca0e7c" address="unix:///run/containerd/s/46b2bc3fc11ff2579bdd98de7898f93e58222bfbc31ba289b54a651df0064473" namespace=k8s.io protocol=ttrpc version=3 May 27 03:23:43.012610 systemd[1]: Started cri-containerd-ab064e81aa89687588429b467c288e43c8daca4370014564adc616cd41ca0e7c.scope - libcontainer container ab064e81aa89687588429b467c288e43c8daca4370014564adc616cd41ca0e7c. 
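The "connecting to shim" entries carry both the sandbox ID and the shim's ttrpc socket address. A quick extraction sketch (plain Python; the regex is an assumption about containerd's log format, using the kube-proxy sandbox line above as sample input):

```python
import re

# Sample input copied from the containerd entry for the kube-proxy-xpwsv sandbox.
line = ('time="2025-05-27T03:23:42.991584779Z" level=info msg="connecting to shim '
        'ab064e81aa89687588429b467c288e43c8daca4370014564adc616cd41ca0e7c" '
        'address="unix:///run/containerd/s/46b2bc3fc11ff2579bdd98de7898f93e58222bfbc31ba289b54a651df0064473" '
        'namespace=k8s.io protocol=ttrpc version=3')

# Sandbox/container IDs are 64 hex chars; the shim address is a unix:// URI.
m = re.search(r'msg="connecting to shim (?P<shim>[0-9a-f]{64})".*?address="(?P<addr>unix://[^"]+)"', line)
shim, addr = m["shim"], m["addr"]
print(shim[:12], addr)
```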
May 27 03:23:43.031753 containerd[1708]: time="2025-05-27T03:23:43.031722860Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-xpwsv,Uid:20f56e25-2cb1-49f2-9bf0-5daa88bcad05,Namespace:kube-system,Attempt:0,} returns sandbox id \"ab064e81aa89687588429b467c288e43c8daca4370014564adc616cd41ca0e7c\"" May 27 03:23:43.034170 containerd[1708]: time="2025-05-27T03:23:43.034146177Z" level=info msg="CreateContainer within sandbox \"ab064e81aa89687588429b467c288e43c8daca4370014564adc616cd41ca0e7c\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" May 27 03:23:43.058243 containerd[1708]: time="2025-05-27T03:23:43.058140318Z" level=info msg="Container b1f3b09e028644870a0be478e0532ce619a1285b6e57bce4d0dcda1528df4e8b: CDI devices from CRI Config.CDIDevices: []" May 27 03:23:43.072667 containerd[1708]: time="2025-05-27T03:23:43.072644816Z" level=info msg="CreateContainer within sandbox \"ab064e81aa89687588429b467c288e43c8daca4370014564adc616cd41ca0e7c\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"b1f3b09e028644870a0be478e0532ce619a1285b6e57bce4d0dcda1528df4e8b\"" May 27 03:23:43.073119 containerd[1708]: time="2025-05-27T03:23:43.073021288Z" level=info msg="StartContainer for \"b1f3b09e028644870a0be478e0532ce619a1285b6e57bce4d0dcda1528df4e8b\"" May 27 03:23:43.074397 containerd[1708]: time="2025-05-27T03:23:43.074361877Z" level=info msg="connecting to shim b1f3b09e028644870a0be478e0532ce619a1285b6e57bce4d0dcda1528df4e8b" address="unix:///run/containerd/s/46b2bc3fc11ff2579bdd98de7898f93e58222bfbc31ba289b54a651df0064473" protocol=ttrpc version=3 May 27 03:23:43.090557 systemd[1]: Started cri-containerd-b1f3b09e028644870a0be478e0532ce619a1285b6e57bce4d0dcda1528df4e8b.scope - libcontainer container b1f3b09e028644870a0be478e0532ce619a1285b6e57bce4d0dcda1528df4e8b. 
May 27 03:23:43.116892 containerd[1708]: time="2025-05-27T03:23:43.116870449Z" level=info msg="StartContainer for \"b1f3b09e028644870a0be478e0532ce619a1285b6e57bce4d0dcda1528df4e8b\" returns successfully" May 27 03:23:43.172397 containerd[1708]: time="2025-05-27T03:23:43.172375951Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-wpx57,Uid:dda99589-818c-4056-a949-c57e2b8d50f4,Namespace:tigera-operator,Attempt:0,}" May 27 03:23:43.208842 containerd[1708]: time="2025-05-27T03:23:43.208761641Z" level=info msg="connecting to shim 3926e9d5ec5766fced6623271051fd1dbe29d1e9606f114844030127e687e253" address="unix:///run/containerd/s/d02efc14b727d62347428fcf296f9385cf173acfed8dcae5fdecb29a394936c1" namespace=k8s.io protocol=ttrpc version=3 May 27 03:23:43.225555 systemd[1]: Started cri-containerd-3926e9d5ec5766fced6623271051fd1dbe29d1e9606f114844030127e687e253.scope - libcontainer container 3926e9d5ec5766fced6623271051fd1dbe29d1e9606f114844030127e687e253. May 27 03:23:43.257791 containerd[1708]: time="2025-05-27T03:23:43.257774628Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-844669ff44-wpx57,Uid:dda99589-818c-4056-a949-c57e2b8d50f4,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"3926e9d5ec5766fced6623271051fd1dbe29d1e9606f114844030127e687e253\"" May 27 03:23:43.258883 containerd[1708]: time="2025-05-27T03:23:43.258836765Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\"" May 27 03:23:43.560513 kubelet[3089]: I0527 03:23:43.560390 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-xpwsv" podStartSLOduration=1.560378079 podStartE2EDuration="1.560378079s" podCreationTimestamp="2025-05-27 03:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:23:43.289074348 +0000 UTC m=+7.121951788" watchObservedRunningTime="2025-05-27 
03:23:43.560378079 +0000 UTC m=+7.393255569" May 27 03:23:44.465898 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2176165341.mount: Deactivated successfully. May 27 03:23:44.833386 containerd[1708]: time="2025-05-27T03:23:44.833327326Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:23:44.835682 containerd[1708]: time="2025-05-27T03:23:44.835651765Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.0: active requests=0, bytes read=25055451" May 27 03:23:44.838028 containerd[1708]: time="2025-05-27T03:23:44.837994597Z" level=info msg="ImageCreate event name:\"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:23:44.840900 containerd[1708]: time="2025-05-27T03:23:44.840867456Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:23:44.841422 containerd[1708]: time="2025-05-27T03:23:44.841153535Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.0\" with image id \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\", repo tag \"quay.io/tigera/operator:v1.38.0\", repo digest \"quay.io/tigera/operator@sha256:e0a34b265aebce1a2db906d8dad99190706e8bf3910cae626b9c2eb6bbb21775\", size \"25051446\" in 1.582290508s" May 27 03:23:44.841422 containerd[1708]: time="2025-05-27T03:23:44.841175953Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.0\" returns image reference \"sha256:5e43c1322619406528ff596056dfeb70cb8d20c5c00439feb752a7725302e033\"" May 27 03:23:44.842827 containerd[1708]: time="2025-05-27T03:23:44.842800265Z" level=info msg="CreateContainer within sandbox \"3926e9d5ec5766fced6623271051fd1dbe29d1e9606f114844030127e687e253\" for container 
&ContainerMetadata{Name:tigera-operator,Attempt:0,}" May 27 03:23:44.856734 containerd[1708]: time="2025-05-27T03:23:44.856713075Z" level=info msg="Container d287972ef0fa8cbada0b025e81386ef3bc7f3bbcd9a913041bb9dae2378fd53a: CDI devices from CRI Config.CDIDevices: []" May 27 03:23:44.868810 containerd[1708]: time="2025-05-27T03:23:44.868789705Z" level=info msg="CreateContainer within sandbox \"3926e9d5ec5766fced6623271051fd1dbe29d1e9606f114844030127e687e253\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"d287972ef0fa8cbada0b025e81386ef3bc7f3bbcd9a913041bb9dae2378fd53a\"" May 27 03:23:44.869200 containerd[1708]: time="2025-05-27T03:23:44.869130459Z" level=info msg="StartContainer for \"d287972ef0fa8cbada0b025e81386ef3bc7f3bbcd9a913041bb9dae2378fd53a\"" May 27 03:23:44.870354 containerd[1708]: time="2025-05-27T03:23:44.870140206Z" level=info msg="connecting to shim d287972ef0fa8cbada0b025e81386ef3bc7f3bbcd9a913041bb9dae2378fd53a" address="unix:///run/containerd/s/d02efc14b727d62347428fcf296f9385cf173acfed8dcae5fdecb29a394936c1" protocol=ttrpc version=3 May 27 03:23:44.888531 systemd[1]: Started cri-containerd-d287972ef0fa8cbada0b025e81386ef3bc7f3bbcd9a913041bb9dae2378fd53a.scope - libcontainer container d287972ef0fa8cbada0b025e81386ef3bc7f3bbcd9a913041bb9dae2378fd53a. 
May 27 03:23:44.913624 containerd[1708]: time="2025-05-27T03:23:44.913564311Z" level=info msg="StartContainer for \"d287972ef0fa8cbada0b025e81386ef3bc7f3bbcd9a913041bb9dae2378fd53a\" returns successfully" May 27 03:23:47.172975 kubelet[3089]: I0527 03:23:47.172890 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-844669ff44-wpx57" podStartSLOduration=3.58967507 podStartE2EDuration="5.172872603s" podCreationTimestamp="2025-05-27 03:23:42 +0000 UTC" firstStartedPulling="2025-05-27 03:23:43.258540751 +0000 UTC m=+7.091418182" lastFinishedPulling="2025-05-27 03:23:44.84173828 +0000 UTC m=+8.674615715" observedRunningTime="2025-05-27 03:23:45.30251146 +0000 UTC m=+9.135388898" watchObservedRunningTime="2025-05-27 03:23:47.172872603 +0000 UTC m=+11.005750043" May 27 03:23:50.104480 sudo[2136]: pam_unix(sudo:session): session closed for user root May 27 03:23:50.209400 sshd[2135]: Connection closed by 10.200.16.10 port 58652 May 27 03:23:50.209325 sshd-session[2133]: pam_unix(sshd:session): session closed for user core May 27 03:23:50.215689 systemd-logind[1695]: Session 9 logged out. Waiting for processes to exit. May 27 03:23:50.216845 systemd[1]: sshd@6-10.200.8.20:22-10.200.16.10:58652.service: Deactivated successfully. May 27 03:23:50.219125 systemd[1]: session-9.scope: Deactivated successfully. May 27 03:23:50.221676 systemd[1]: session-9.scope: Consumed 3.254s CPU time, 224.2M memory peak. May 27 03:23:50.227344 systemd-logind[1695]: Removed session 9. May 27 03:23:53.355633 systemd[1]: Created slice kubepods-besteffort-podaa867ef3_2a80_49e0_a21f_fb09bd588a84.slice - libcontainer container kubepods-besteffort-podaa867ef3_2a80_49e0_a21f_fb09bd588a84.slice. 
May 27 03:23:53.444465 kubelet[3089]: I0527 03:23:53.444409 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/aa867ef3-2a80-49e0-a21f-fb09bd588a84-typha-certs\") pod \"calico-typha-6f648c4545-wvqjk\" (UID: \"aa867ef3-2a80-49e0-a21f-fb09bd588a84\") " pod="calico-system/calico-typha-6f648c4545-wvqjk" May 27 03:23:53.444465 kubelet[3089]: I0527 03:23:53.444452 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4ct5\" (UniqueName: \"kubernetes.io/projected/aa867ef3-2a80-49e0-a21f-fb09bd588a84-kube-api-access-k4ct5\") pod \"calico-typha-6f648c4545-wvqjk\" (UID: \"aa867ef3-2a80-49e0-a21f-fb09bd588a84\") " pod="calico-system/calico-typha-6f648c4545-wvqjk" May 27 03:23:53.444746 kubelet[3089]: I0527 03:23:53.444488 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aa867ef3-2a80-49e0-a21f-fb09bd588a84-tigera-ca-bundle\") pod \"calico-typha-6f648c4545-wvqjk\" (UID: \"aa867ef3-2a80-49e0-a21f-fb09bd588a84\") " pod="calico-system/calico-typha-6f648c4545-wvqjk" May 27 03:23:53.664716 containerd[1708]: time="2025-05-27T03:23:53.664681156Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f648c4545-wvqjk,Uid:aa867ef3-2a80-49e0-a21f-fb09bd588a84,Namespace:calico-system,Attempt:0,}" May 27 03:23:53.702965 containerd[1708]: time="2025-05-27T03:23:53.702804728Z" level=info msg="connecting to shim 3c5124d6ae067762146e8c8d4e0c4e01533b457316d591333bd3958751371f10" address="unix:///run/containerd/s/72940842099fe62014e0920c3cd23cd59a3a0d68b3d6970dcc8c2a20b8a5fd50" namespace=k8s.io protocol=ttrpc version=3 May 27 03:23:53.725656 systemd[1]: Started cri-containerd-3c5124d6ae067762146e8c8d4e0c4e01533b457316d591333bd3958751371f10.scope - libcontainer container 
3c5124d6ae067762146e8c8d4e0c4e01533b457316d591333bd3958751371f10. May 27 03:23:53.765929 containerd[1708]: time="2025-05-27T03:23:53.765902704Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6f648c4545-wvqjk,Uid:aa867ef3-2a80-49e0-a21f-fb09bd588a84,Namespace:calico-system,Attempt:0,} returns sandbox id \"3c5124d6ae067762146e8c8d4e0c4e01533b457316d591333bd3958751371f10\"" May 27 03:23:53.769251 containerd[1708]: time="2025-05-27T03:23:53.769101667Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\"" May 27 03:23:53.784464 systemd[1]: Created slice kubepods-besteffort-pod3328073a_94b5_4499_9691_0e228800a8da.slice - libcontainer container kubepods-besteffort-pod3328073a_94b5_4499_9691_0e228800a8da.slice. May 27 03:23:53.847746 kubelet[3089]: I0527 03:23:53.847724 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zckp7\" (UniqueName: \"kubernetes.io/projected/3328073a-94b5-4499-9691-0e228800a8da-kube-api-access-zckp7\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847827 kubelet[3089]: I0527 03:23:53.847773 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/3328073a-94b5-4499-9691-0e228800a8da-cni-net-dir\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847827 kubelet[3089]: I0527 03:23:53.847791 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3328073a-94b5-4499-9691-0e228800a8da-lib-modules\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847827 kubelet[3089]: I0527 03:23:53.847812 3089 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/3328073a-94b5-4499-9691-0e228800a8da-node-certs\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847891 kubelet[3089]: I0527 03:23:53.847827 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/3328073a-94b5-4499-9691-0e228800a8da-policysync\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847891 kubelet[3089]: I0527 03:23:53.847842 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/3328073a-94b5-4499-9691-0e228800a8da-var-lib-calico\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847891 kubelet[3089]: I0527 03:23:53.847860 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/3328073a-94b5-4499-9691-0e228800a8da-cni-bin-dir\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847891 kubelet[3089]: I0527 03:23:53.847876 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/3328073a-94b5-4499-9691-0e228800a8da-cni-log-dir\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847972 kubelet[3089]: I0527 03:23:53.847891 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/3328073a-94b5-4499-9691-0e228800a8da-flexvol-driver-host\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847972 kubelet[3089]: I0527 03:23:53.847908 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/3328073a-94b5-4499-9691-0e228800a8da-var-run-calico\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847972 kubelet[3089]: I0527 03:23:53.847924 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/3328073a-94b5-4499-9691-0e228800a8da-xtables-lock\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.847972 kubelet[3089]: I0527 03:23:53.847940 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3328073a-94b5-4499-9691-0e228800a8da-tigera-ca-bundle\") pod \"calico-node-926st\" (UID: \"3328073a-94b5-4499-9691-0e228800a8da\") " pod="calico-system/calico-node-926st" May 27 03:23:53.950490 kubelet[3089]: E0527 03:23:53.949347 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:53.950490 kubelet[3089]: W0527 03:23:53.949364 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:53.950490 kubelet[3089]: E0527 03:23:53.949399 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin 
from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.012099 kubelet[3089]: E0527 03:23:54.011998 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9cggb" podUID="27ed5e5b-c569-4ae0-93dc-3358bb46ea0c" May 27 03:23:54.042465 kubelet[3089]: E0527 03:23:54.042423 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.042465 kubelet[3089]: W0527 03:23:54.042427 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.042465 kubelet[3089]: E0527 03:23:54.042432 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" May 27 03:23:54.049670 kubelet[3089]: E0527 03:23:54.049655 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.049670 kubelet[3089]: W0527 03:23:54.049666 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.049800 kubelet[3089]: E0527 03:23:54.049677 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.049800 kubelet[3089]: I0527 03:23:54.049696 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/27ed5e5b-c569-4ae0-93dc-3358bb46ea0c-registration-dir\") pod \"csi-node-driver-9cggb\" (UID: \"27ed5e5b-c569-4ae0-93dc-3358bb46ea0c\") " pod="calico-system/csi-node-driver-9cggb" May 27 03:23:54.049800 kubelet[3089]: E0527 03:23:54.049796 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.049922 kubelet[3089]: W0527 03:23:54.049810 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.049922 kubelet[3089]: E0527 03:23:54.049821 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.049922 kubelet[3089]: I0527 03:23:54.049834 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/27ed5e5b-c569-4ae0-93dc-3358bb46ea0c-socket-dir\") pod \"csi-node-driver-9cggb\" (UID: \"27ed5e5b-c569-4ae0-93dc-3358bb46ea0c\") " pod="calico-system/csi-node-driver-9cggb" May 27 03:23:54.050006 kubelet[3089]: E0527 03:23:54.049999 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.050109 kubelet[3089]: W0527 03:23:54.050006 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.050109 kubelet[3089]: E0527 03:23:54.050017 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.050109 kubelet[3089]: I0527 03:23:54.050029 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/27ed5e5b-c569-4ae0-93dc-3358bb46ea0c-kubelet-dir\") pod \"csi-node-driver-9cggb\" (UID: \"27ed5e5b-c569-4ae0-93dc-3358bb46ea0c\") " pod="calico-system/csi-node-driver-9cggb" May 27 03:23:54.050285 kubelet[3089]: E0527 03:23:54.050187 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.050285 kubelet[3089]: W0527 03:23:54.050194 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.050285 kubelet[3089]: E0527 03:23:54.050206 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.050420 kubelet[3089]: E0527 03:23:54.050413 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.050503 kubelet[3089]: W0527 03:23:54.050469 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.050601 kubelet[3089]: E0527 03:23:54.050538 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.050742 kubelet[3089]: E0527 03:23:54.050731 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.050742 kubelet[3089]: W0527 03:23:54.050740 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.050813 kubelet[3089]: E0527 03:23:54.050754 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.050860 kubelet[3089]: E0527 03:23:54.050841 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.050860 kubelet[3089]: W0527 03:23:54.050848 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.050860 kubelet[3089]: E0527 03:23:54.050857 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.050970 kubelet[3089]: E0527 03:23:54.050958 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.050970 kubelet[3089]: W0527 03:23:54.050967 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.051020 kubelet[3089]: E0527 03:23:54.050976 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.051020 kubelet[3089]: I0527 03:23:54.050993 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/27ed5e5b-c569-4ae0-93dc-3358bb46ea0c-varrun\") pod \"csi-node-driver-9cggb\" (UID: \"27ed5e5b-c569-4ae0-93dc-3358bb46ea0c\") " pod="calico-system/csi-node-driver-9cggb" May 27 03:23:54.051096 kubelet[3089]: E0527 03:23:54.051087 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.051096 kubelet[3089]: W0527 03:23:54.051094 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.051155 kubelet[3089]: E0527 03:23:54.051144 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.051176 kubelet[3089]: I0527 03:23:54.051162 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flb6h\" (UniqueName: \"kubernetes.io/projected/27ed5e5b-c569-4ae0-93dc-3358bb46ea0c-kube-api-access-flb6h\") pod \"csi-node-driver-9cggb\" (UID: \"27ed5e5b-c569-4ae0-93dc-3358bb46ea0c\") " pod="calico-system/csi-node-driver-9cggb" May 27 03:23:54.051236 kubelet[3089]: E0527 03:23:54.051226 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.051236 kubelet[3089]: W0527 03:23:54.051234 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.051300 kubelet[3089]: E0527 03:23:54.051243 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.051339 kubelet[3089]: E0527 03:23:54.051330 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.051339 kubelet[3089]: W0527 03:23:54.051335 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.051394 kubelet[3089]: E0527 03:23:54.051344 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.051704 kubelet[3089]: E0527 03:23:54.051615 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.051704 kubelet[3089]: W0527 03:23:54.051625 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.051704 kubelet[3089]: E0527 03:23:54.051638 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.051796 kubelet[3089]: E0527 03:23:54.051766 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.051796 kubelet[3089]: W0527 03:23:54.051773 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.051796 kubelet[3089]: E0527 03:23:54.051781 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.051873 kubelet[3089]: E0527 03:23:54.051868 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.051895 kubelet[3089]: W0527 03:23:54.051873 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.051895 kubelet[3089]: E0527 03:23:54.051879 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.051983 kubelet[3089]: E0527 03:23:54.051971 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.051983 kubelet[3089]: W0527 03:23:54.051978 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.052022 kubelet[3089]: E0527 03:23:54.051983 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.089923 containerd[1708]: time="2025-05-27T03:23:54.089750297Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-926st,Uid:3328073a-94b5-4499-9691-0e228800a8da,Namespace:calico-system,Attempt:0,}" May 27 03:23:54.133198 containerd[1708]: time="2025-05-27T03:23:54.133149162Z" level=info msg="connecting to shim 49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140" address="unix:///run/containerd/s/96807204bc5f2e1951558b4c017e68cbb16cacda6c43daaf579516713637607a" namespace=k8s.io protocol=ttrpc version=3 May 27 03:23:54.150571 systemd[1]: Started cri-containerd-49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140.scope - libcontainer container 49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140. May 27 03:23:54.152295 kubelet[3089]: E0527 03:23:54.152255 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.152569 kubelet[3089]: W0527 03:23:54.152382 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.152569 kubelet[3089]: E0527 03:23:54.152402 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.152943 kubelet[3089]: E0527 03:23:54.152860 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.152943 kubelet[3089]: W0527 03:23:54.152871 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.152943 kubelet[3089]: E0527 03:23:54.152889 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.153183 kubelet[3089]: E0527 03:23:54.153164 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.153183 kubelet[3089]: W0527 03:23:54.153174 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.153292 kubelet[3089]: E0527 03:23:54.153247 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.153434 kubelet[3089]: E0527 03:23:54.153419 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.153434 kubelet[3089]: W0527 03:23:54.153426 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.153434 kubelet[3089]: E0527 03:23:54.153459 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.153806 kubelet[3089]: E0527 03:23:54.153786 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.153806 kubelet[3089]: W0527 03:23:54.153795 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.153927 kubelet[3089]: E0527 03:23:54.153871 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.154095 kubelet[3089]: E0527 03:23:54.154066 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.154095 kubelet[3089]: W0527 03:23:54.154085 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.154242 kubelet[3089]: E0527 03:23:54.154226 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.154348 kubelet[3089]: E0527 03:23:54.154333 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.154348 kubelet[3089]: W0527 03:23:54.154340 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.154492 kubelet[3089]: E0527 03:23:54.154480 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.154565 kubelet[3089]: E0527 03:23:54.154551 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.154565 kubelet[3089]: W0527 03:23:54.154557 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.154696 kubelet[3089]: E0527 03:23:54.154688 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.154785 kubelet[3089]: E0527 03:23:54.154770 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.154785 kubelet[3089]: W0527 03:23:54.154777 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.154933 kubelet[3089]: E0527 03:23:54.154920 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.155009 kubelet[3089]: E0527 03:23:54.154994 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.155009 kubelet[3089]: W0527 03:23:54.155001 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.155160 kubelet[3089]: E0527 03:23:54.155141 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.155273 kubelet[3089]: E0527 03:23:54.155236 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.155273 kubelet[3089]: W0527 03:23:54.155243 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.155273 kubelet[3089]: E0527 03:23:54.155252 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.155479 kubelet[3089]: E0527 03:23:54.155465 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.155479 kubelet[3089]: W0527 03:23:54.155472 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.155599 kubelet[3089]: E0527 03:23:54.155541 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.155733 kubelet[3089]: E0527 03:23:54.155720 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.155733 kubelet[3089]: W0527 03:23:54.155726 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.155871 kubelet[3089]: E0527 03:23:54.155816 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.155940 kubelet[3089]: E0527 03:23:54.155927 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.155940 kubelet[3089]: W0527 03:23:54.155933 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.156033 kubelet[3089]: E0527 03:23:54.156013 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.156181 kubelet[3089]: E0527 03:23:54.156152 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.156181 kubelet[3089]: W0527 03:23:54.156159 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.156279 kubelet[3089]: E0527 03:23:54.156237 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.156336 kubelet[3089]: E0527 03:23:54.156331 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.156395 kubelet[3089]: W0527 03:23:54.156366 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.156432 kubelet[3089]: E0527 03:23:54.156425 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.156704 kubelet[3089]: E0527 03:23:54.156693 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.156770 kubelet[3089]: W0527 03:23:54.156748 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.156917 kubelet[3089]: E0527 03:23:54.156878 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.156957 kubelet[3089]: E0527 03:23:54.156922 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.156957 kubelet[3089]: W0527 03:23:54.156930 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.156957 kubelet[3089]: E0527 03:23:54.156943 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.157132 kubelet[3089]: E0527 03:23:54.157125 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.157233 kubelet[3089]: W0527 03:23:54.157164 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.157233 kubelet[3089]: E0527 03:23:54.157177 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.157383 kubelet[3089]: E0527 03:23:54.157377 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.157418 kubelet[3089]: W0527 03:23:54.157413 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.157499 kubelet[3089]: E0527 03:23:54.157489 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.157631 kubelet[3089]: E0527 03:23:54.157625 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.157728 kubelet[3089]: W0527 03:23:54.157666 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.157728 kubelet[3089]: E0527 03:23:54.157679 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.157894 kubelet[3089]: E0527 03:23:54.157878 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.157894 kubelet[3089]: W0527 03:23:54.157886 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.158008 kubelet[3089]: E0527 03:23:54.157953 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.158075 kubelet[3089]: E0527 03:23:54.158059 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.158075 kubelet[3089]: W0527 03:23:54.158068 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.158172 kubelet[3089]: E0527 03:23:54.158129 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.158358 kubelet[3089]: E0527 03:23:54.158345 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.158422 kubelet[3089]: W0527 03:23:54.158357 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.158561 kubelet[3089]: E0527 03:23:54.158552 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.158683 kubelet[3089]: E0527 03:23:54.158644 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.158683 kubelet[3089]: W0527 03:23:54.158651 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.158683 kubelet[3089]: E0527 03:23:54.158660 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:54.165341 kubelet[3089]: E0527 03:23:54.165329 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:54.165341 kubelet[3089]: W0527 03:23:54.165341 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:54.165550 kubelet[3089]: E0527 03:23:54.165363 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:54.173407 containerd[1708]: time="2025-05-27T03:23:54.173346201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-926st,Uid:3328073a-94b5-4499-9691-0e228800a8da,Namespace:calico-system,Attempt:0,} returns sandbox id \"49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140\"" May 27 03:23:55.060268 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4211491937.mount: Deactivated successfully. 
May 27 03:23:55.244320 kubelet[3089]: E0527 03:23:55.244237 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9cggb" podUID="27ed5e5b-c569-4ae0-93dc-3358bb46ea0c" May 27 03:23:55.440811 containerd[1708]: time="2025-05-27T03:23:55.440783200Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:23:55.443493 containerd[1708]: time="2025-05-27T03:23:55.443460982Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.0: active requests=0, bytes read=35158669" May 27 03:23:55.446136 containerd[1708]: time="2025-05-27T03:23:55.446101035Z" level=info msg="ImageCreate event name:\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:23:55.448817 containerd[1708]: time="2025-05-27T03:23:55.448781186Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:23:55.449112 containerd[1708]: time="2025-05-27T03:23:55.449026938Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.0\" with image id \"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:d282f6c773c4631b9dc8379eb093c54ca34c7728d55d6509cb45da5e1f5baf8f\", size \"35158523\" in 1.679805368s" May 27 03:23:55.449112 containerd[1708]: time="2025-05-27T03:23:55.449050125Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.0\" returns image reference 
\"sha256:71be0570e8645ac646675719e0da6ac33a05810991b31aecc303e7add70933be\"" May 27 03:23:55.450087 containerd[1708]: time="2025-05-27T03:23:55.449933021Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\"" May 27 03:23:55.460253 containerd[1708]: time="2025-05-27T03:23:55.460232346Z" level=info msg="CreateContainer within sandbox \"3c5124d6ae067762146e8c8d4e0c4e01533b457316d591333bd3958751371f10\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" May 27 03:23:55.473101 containerd[1708]: time="2025-05-27T03:23:55.473081745Z" level=info msg="Container aa05a1938c86a1c4d987990bf2083202ffcbc2cbc128698a7aa5bf297d8b2d3d: CDI devices from CRI Config.CDIDevices: []" May 27 03:23:55.491206 containerd[1708]: time="2025-05-27T03:23:55.491140749Z" level=info msg="CreateContainer within sandbox \"3c5124d6ae067762146e8c8d4e0c4e01533b457316d591333bd3958751371f10\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"aa05a1938c86a1c4d987990bf2083202ffcbc2cbc128698a7aa5bf297d8b2d3d\"" May 27 03:23:55.492367 containerd[1708]: time="2025-05-27T03:23:55.492286991Z" level=info msg="StartContainer for \"aa05a1938c86a1c4d987990bf2083202ffcbc2cbc128698a7aa5bf297d8b2d3d\"" May 27 03:23:55.493535 containerd[1708]: time="2025-05-27T03:23:55.493421709Z" level=info msg="connecting to shim aa05a1938c86a1c4d987990bf2083202ffcbc2cbc128698a7aa5bf297d8b2d3d" address="unix:///run/containerd/s/72940842099fe62014e0920c3cd23cd59a3a0d68b3d6970dcc8c2a20b8a5fd50" protocol=ttrpc version=3 May 27 03:23:55.508564 systemd[1]: Started cri-containerd-aa05a1938c86a1c4d987990bf2083202ffcbc2cbc128698a7aa5bf297d8b2d3d.scope - libcontainer container aa05a1938c86a1c4d987990bf2083202ffcbc2cbc128698a7aa5bf297d8b2d3d. 
May 27 03:23:55.549298 containerd[1708]: time="2025-05-27T03:23:55.549240046Z" level=info msg="StartContainer for \"aa05a1938c86a1c4d987990bf2083202ffcbc2cbc128698a7aa5bf297d8b2d3d\" returns successfully" May 27 03:23:56.326890 kubelet[3089]: I0527 03:23:56.326704 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6f648c4545-wvqjk" podStartSLOduration=1.645629695 podStartE2EDuration="3.326689987s" podCreationTimestamp="2025-05-27 03:23:53 +0000 UTC" firstStartedPulling="2025-05-27 03:23:53.768560901 +0000 UTC m=+17.601438328" lastFinishedPulling="2025-05-27 03:23:55.449621187 +0000 UTC m=+19.282498620" observedRunningTime="2025-05-27 03:23:56.325893939 +0000 UTC m=+20.158771376" watchObservedRunningTime="2025-05-27 03:23:56.326689987 +0000 UTC m=+20.159567555" May 27 03:23:56.358347 kubelet[3089]: E0527 03:23:56.358331 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.358347 kubelet[3089]: W0527 03:23:56.358345 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.358479 kubelet[3089]: E0527 03:23:56.358359 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.358479 kubelet[3089]: E0527 03:23:56.358474 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.358562 kubelet[3089]: W0527 03:23:56.358480 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.358562 kubelet[3089]: E0527 03:23:56.358487 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.358610 kubelet[3089]: E0527 03:23:56.358581 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.358610 kubelet[3089]: W0527 03:23:56.358588 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.358610 kubelet[3089]: E0527 03:23:56.358596 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.358737 kubelet[3089]: E0527 03:23:56.358713 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.358737 kubelet[3089]: W0527 03:23:56.358734 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.358797 kubelet[3089]: E0527 03:23:56.358742 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.358858 kubelet[3089]: E0527 03:23:56.358834 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.358858 kubelet[3089]: W0527 03:23:56.358854 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.358931 kubelet[3089]: E0527 03:23:56.358861 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.358962 kubelet[3089]: E0527 03:23:56.358958 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.358997 kubelet[3089]: W0527 03:23:56.358964 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.358997 kubelet[3089]: E0527 03:23:56.358971 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.359077 kubelet[3089]: E0527 03:23:56.359071 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.359106 kubelet[3089]: W0527 03:23:56.359077 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.359106 kubelet[3089]: E0527 03:23:56.359083 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.359165 kubelet[3089]: E0527 03:23:56.359157 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.359165 kubelet[3089]: W0527 03:23:56.359161 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.359217 kubelet[3089]: E0527 03:23:56.359166 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.359285 kubelet[3089]: E0527 03:23:56.359249 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.359285 kubelet[3089]: W0527 03:23:56.359254 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.359285 kubelet[3089]: E0527 03:23:56.359259 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.359375 kubelet[3089]: E0527 03:23:56.359352 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.359375 kubelet[3089]: W0527 03:23:56.359356 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.359375 kubelet[3089]: E0527 03:23:56.359362 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.359459 kubelet[3089]: E0527 03:23:56.359431 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.359459 kubelet[3089]: W0527 03:23:56.359449 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.359459 kubelet[3089]: E0527 03:23:56.359455 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.359534 kubelet[3089]: E0527 03:23:56.359529 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.359534 kubelet[3089]: W0527 03:23:56.359534 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.359578 kubelet[3089]: E0527 03:23:56.359539 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.359638 kubelet[3089]: E0527 03:23:56.359615 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.359638 kubelet[3089]: W0527 03:23:56.359635 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.359686 kubelet[3089]: E0527 03:23:56.359641 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.359727 kubelet[3089]: E0527 03:23:56.359719 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.359727 kubelet[3089]: W0527 03:23:56.359725 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.359772 kubelet[3089]: E0527 03:23:56.359730 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.359806 kubelet[3089]: E0527 03:23:56.359801 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.359806 kubelet[3089]: W0527 03:23:56.359805 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.359857 kubelet[3089]: E0527 03:23:56.359810 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.369214 kubelet[3089]: E0527 03:23:56.369200 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.369214 kubelet[3089]: W0527 03:23:56.369211 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.369325 kubelet[3089]: E0527 03:23:56.369222 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.369351 kubelet[3089]: E0527 03:23:56.369327 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.369351 kubelet[3089]: W0527 03:23:56.369332 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.369351 kubelet[3089]: E0527 03:23:56.369339 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.369520 kubelet[3089]: E0527 03:23:56.369427 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.369520 kubelet[3089]: W0527 03:23:56.369431 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.369520 kubelet[3089]: E0527 03:23:56.369448 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.369600 kubelet[3089]: E0527 03:23:56.369543 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.369600 kubelet[3089]: W0527 03:23:56.369551 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.369600 kubelet[3089]: E0527 03:23:56.369563 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.369693 kubelet[3089]: E0527 03:23:56.369642 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.369693 kubelet[3089]: W0527 03:23:56.369646 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.369693 kubelet[3089]: E0527 03:23:56.369654 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.369771 kubelet[3089]: E0527 03:23:56.369732 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.369771 kubelet[3089]: W0527 03:23:56.369737 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.369771 kubelet[3089]: E0527 03:23:56.369745 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.369851 kubelet[3089]: E0527 03:23:56.369848 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.369878 kubelet[3089]: W0527 03:23:56.369852 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.369878 kubelet[3089]: E0527 03:23:56.369861 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.370024 kubelet[3089]: E0527 03:23:56.369997 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.370024 kubelet[3089]: W0527 03:23:56.370004 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.370024 kubelet[3089]: E0527 03:23:56.370011 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.370112 kubelet[3089]: E0527 03:23:56.370100 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.370112 kubelet[3089]: W0527 03:23:56.370107 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.370160 kubelet[3089]: E0527 03:23:56.370115 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.370215 kubelet[3089]: E0527 03:23:56.370208 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.370237 kubelet[3089]: W0527 03:23:56.370223 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.370237 kubelet[3089]: E0527 03:23:56.370230 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.370352 kubelet[3089]: E0527 03:23:56.370295 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.370352 kubelet[3089]: W0527 03:23:56.370315 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.370352 kubelet[3089]: E0527 03:23:56.370321 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.370452 kubelet[3089]: E0527 03:23:56.370390 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.370452 kubelet[3089]: W0527 03:23:56.370395 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.370452 kubelet[3089]: E0527 03:23:56.370407 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.370527 kubelet[3089]: E0527 03:23:56.370499 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.370527 kubelet[3089]: W0527 03:23:56.370504 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.370527 kubelet[3089]: E0527 03:23:56.370512 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.370603 kubelet[3089]: E0527 03:23:56.370582 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.370603 kubelet[3089]: W0527 03:23:56.370587 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.370603 kubelet[3089]: E0527 03:23:56.370592 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.370730 kubelet[3089]: E0527 03:23:56.370706 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.370730 kubelet[3089]: W0527 03:23:56.370728 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.370770 kubelet[3089]: E0527 03:23:56.370736 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.370957 kubelet[3089]: E0527 03:23:56.370934 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.370957 kubelet[3089]: W0527 03:23:56.370955 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.371013 kubelet[3089]: E0527 03:23:56.370963 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.371102 kubelet[3089]: E0527 03:23:56.371079 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.371102 kubelet[3089]: W0527 03:23:56.371100 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.371169 kubelet[3089]: E0527 03:23:56.371107 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" May 27 03:23:56.371539 kubelet[3089]: E0527 03:23:56.371463 3089 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input May 27 03:23:56.371539 kubelet[3089]: W0527 03:23:56.371472 3089 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" May 27 03:23:56.371539 kubelet[3089]: E0527 03:23:56.371482 3089 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" May 27 03:23:56.523822 containerd[1708]: time="2025-05-27T03:23:56.523796924Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:23:56.525872 containerd[1708]: time="2025-05-27T03:23:56.525841944Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0: active requests=0, bytes read=4441619" May 27 03:23:56.528061 containerd[1708]: time="2025-05-27T03:23:56.528027134Z" level=info msg="ImageCreate event name:\"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:23:56.530751 containerd[1708]: time="2025-05-27T03:23:56.530725957Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:23:56.531549 containerd[1708]: time="2025-05-27T03:23:56.531464416Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" with image id \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:ce76dd87f11d3fd0054c35ad2e0e9f833748d007f77a9bfe859d0ddcb66fcb2c\", size \"5934282\" in 1.081505654s" May 27 03:23:56.531549 containerd[1708]: time="2025-05-27T03:23:56.531492194Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.0\" returns image reference \"sha256:c53606cea03e59dcbfa981dc43a55dff05952895f72576b8389fa00be09ab676\"" May 27 03:23:56.533631 containerd[1708]: time="2025-05-27T03:23:56.533607320Z" level=info msg="CreateContainer within sandbox \"49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" May 27 03:23:56.547460 containerd[1708]: time="2025-05-27T03:23:56.547004237Z" level=info msg="Container 83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0: CDI devices from CRI Config.CDIDevices: []" May 27 03:23:56.562860 containerd[1708]: time="2025-05-27T03:23:56.562840382Z" level=info msg="CreateContainer within sandbox \"49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0\"" May 27 03:23:56.563460 containerd[1708]: time="2025-05-27T03:23:56.563196398Z" level=info msg="StartContainer for \"83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0\"" May 27 03:23:56.564588 containerd[1708]: time="2025-05-27T03:23:56.564551748Z" level=info msg="connecting to shim 83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0" address="unix:///run/containerd/s/96807204bc5f2e1951558b4c017e68cbb16cacda6c43daaf579516713637607a" protocol=ttrpc version=3 May 27 03:23:56.582570 systemd[1]: Started cri-containerd-83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0.scope - libcontainer container 83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0. May 27 03:23:56.610455 containerd[1708]: time="2025-05-27T03:23:56.609764582Z" level=info msg="StartContainer for \"83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0\" returns successfully" May 27 03:23:56.611603 systemd[1]: cri-containerd-83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0.scope: Deactivated successfully. 
May 27 03:23:56.614311 containerd[1708]: time="2025-05-27T03:23:56.614283981Z" level=info msg="received exit event container_id:\"83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0\" id:\"83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0\" pid:3776 exited_at:{seconds:1748316236 nanos:614010248}" May 27 03:23:56.614669 containerd[1708]: time="2025-05-27T03:23:56.614387432Z" level=info msg="TaskExit event in podsandbox handler container_id:\"83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0\" id:\"83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0\" pid:3776 exited_at:{seconds:1748316236 nanos:614010248}" May 27 03:23:56.627856 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-83e803feb6b4e1a7a5d2f8e2c443ac17ff0d5222505f6d4b9984d51c0e52f7f0-rootfs.mount: Deactivated successfully. May 27 03:23:57.244832 kubelet[3089]: E0527 03:23:57.244808 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9cggb" podUID="27ed5e5b-c569-4ae0-93dc-3358bb46ea0c" May 27 03:23:57.318099 kubelet[3089]: I0527 03:23:57.318054 3089 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:23:59.244186 kubelet[3089]: E0527 03:23:59.244146 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9cggb" podUID="27ed5e5b-c569-4ae0-93dc-3358bb46ea0c" May 27 03:23:59.324049 containerd[1708]: time="2025-05-27T03:23:59.323993302Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\"" May 27 03:24:01.244967 kubelet[3089]: E0527 03:24:01.244938 3089 pod_workers.go:1301] 
"Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9cggb" podUID="27ed5e5b-c569-4ae0-93dc-3358bb46ea0c" May 27 03:24:02.659428 containerd[1708]: time="2025-05-27T03:24:02.659399918Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:24:02.661448 containerd[1708]: time="2025-05-27T03:24:02.661409757Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.0: active requests=0, bytes read=70300568" May 27 03:24:02.663601 containerd[1708]: time="2025-05-27T03:24:02.663563304Z" level=info msg="ImageCreate event name:\"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:24:02.666405 containerd[1708]: time="2025-05-27T03:24:02.666368855Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:24:02.666815 containerd[1708]: time="2025-05-27T03:24:02.666724705Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.0\" with image id \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:3dd06656abdc03fbd51782d5f6fe4d70e6825a1c0c5bce2a165bbd2ff9e0f7df\", size \"71793271\" in 3.342698037s" May 27 03:24:02.666815 containerd[1708]: time="2025-05-27T03:24:02.666748474Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.0\" returns image reference \"sha256:15f996c472622f23047ea38b2d72940e8c34d0996b8a2e12a1f255c1d7083185\"" May 27 03:24:02.668433 containerd[1708]: time="2025-05-27T03:24:02.668406856Z" 
level=info msg="CreateContainer within sandbox \"49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" May 27 03:24:02.684455 containerd[1708]: time="2025-05-27T03:24:02.682884854Z" level=info msg="Container 105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a: CDI devices from CRI Config.CDIDevices: []" May 27 03:24:02.697104 containerd[1708]: time="2025-05-27T03:24:02.697081871Z" level=info msg="CreateContainer within sandbox \"49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a\"" May 27 03:24:02.697453 containerd[1708]: time="2025-05-27T03:24:02.697419289Z" level=info msg="StartContainer for \"105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a\"" May 27 03:24:02.698718 containerd[1708]: time="2025-05-27T03:24:02.698675361Z" level=info msg="connecting to shim 105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a" address="unix:///run/containerd/s/96807204bc5f2e1951558b4c017e68cbb16cacda6c43daaf579516713637607a" protocol=ttrpc version=3 May 27 03:24:02.717741 systemd[1]: Started cri-containerd-105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a.scope - libcontainer container 105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a. 
May 27 03:24:02.746420 containerd[1708]: time="2025-05-27T03:24:02.746387755Z" level=info msg="StartContainer for \"105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a\" returns successfully" May 27 03:24:03.244463 kubelet[3089]: E0527 03:24:03.244274 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-9cggb" podUID="27ed5e5b-c569-4ae0-93dc-3358bb46ea0c" May 27 03:24:03.805506 containerd[1708]: time="2025-05-27T03:24:03.805332292Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" May 27 03:24:03.807143 systemd[1]: cri-containerd-105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a.scope: Deactivated successfully. May 27 03:24:03.807367 systemd[1]: cri-containerd-105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a.scope: Consumed 350ms CPU time, 191.4M memory peak, 170.9M written to disk. 
May 27 03:24:03.809267 containerd[1708]: time="2025-05-27T03:24:03.809185016Z" level=info msg="received exit event container_id:\"105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a\" id:\"105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a\" pid:3834 exited_at:{seconds:1748316243 nanos:809031593}" May 27 03:24:03.809267 containerd[1708]: time="2025-05-27T03:24:03.809238902Z" level=info msg="TaskExit event in podsandbox handler container_id:\"105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a\" id:\"105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a\" pid:3834 exited_at:{seconds:1748316243 nanos:809031593}" May 27 03:24:03.825838 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-105570cbf152707e5295ee8acafca45b42ccdc50e8e12b4156bdf4e3cb11480a-rootfs.mount: Deactivated successfully. May 27 03:24:03.902171 kubelet[3089]: I0527 03:24:03.901550 3089 kubelet_node_status.go:501] "Fast updating node status as it just became ready" May 27 03:24:03.935173 systemd[1]: Created slice kubepods-besteffort-pod275414b6_21b5_4cc2_8a95_b078295361dc.slice - libcontainer container kubepods-besteffort-pod275414b6_21b5_4cc2_8a95_b078295361dc.slice. May 27 03:24:03.956724 systemd[1]: Created slice kubepods-besteffort-pod2669f22e_e758_461f_86ef_0334f39a45eb.slice - libcontainer container kubepods-besteffort-pod2669f22e_e758_461f_86ef_0334f39a45eb.slice. May 27 03:24:03.962677 systemd[1]: Created slice kubepods-besteffort-pod8180663d_0a14_4da1_a83a_e7dbd72a3606.slice - libcontainer container kubepods-besteffort-pod8180663d_0a14_4da1_a83a_e7dbd72a3606.slice. May 27 03:24:03.968559 systemd[1]: Created slice kubepods-besteffort-pod40f9ea1a_358f_4451_a1dd_b953445e89db.slice - libcontainer container kubepods-besteffort-pod40f9ea1a_358f_4451_a1dd_b953445e89db.slice. 
May 27 03:24:03.977168 systemd[1]: Created slice kubepods-burstable-pod39aabb06_801e_4e8a_8388_57aa96ec69b4.slice - libcontainer container kubepods-burstable-pod39aabb06_801e_4e8a_8388_57aa96ec69b4.slice. May 27 03:24:03.981950 systemd[1]: Created slice kubepods-burstable-pod6d40ad28_6278_4b42_9b05_243af0281e54.slice - libcontainer container kubepods-burstable-pod6d40ad28_6278_4b42_9b05_243af0281e54.slice. May 27 03:24:03.985761 systemd[1]: Created slice kubepods-besteffort-pod8dd14faa_5472_4e5c_8d25_d39e38963942.slice - libcontainer container kubepods-besteffort-pod8dd14faa_5472_4e5c_8d25_d39e38963942.slice. May 27 03:24:04.019636 kubelet[3089]: I0527 03:24:04.019609 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8180663d-0a14-4da1-a83a-e7dbd72a3606-config\") pod \"goldmane-78d55f7ddc-f7587\" (UID: \"8180663d-0a14-4da1-a83a-e7dbd72a3606\") " pod="calico-system/goldmane-78d55f7ddc-f7587" May 27 03:24:04.019636 kubelet[3089]: I0527 03:24:04.019638 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f9ea1a-358f-4451-a1dd-b953445e89db-whisker-ca-bundle\") pod \"whisker-549c5bcfd5-lccgk\" (UID: \"40f9ea1a-358f-4451-a1dd-b953445e89db\") " pod="calico-system/whisker-549c5bcfd5-lccgk" May 27 03:24:04.019740 kubelet[3089]: I0527 03:24:04.019654 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrbj6\" (UniqueName: \"kubernetes.io/projected/6d40ad28-6278-4b42-9b05-243af0281e54-kube-api-access-xrbj6\") pod \"coredns-668d6bf9bc-gpm87\" (UID: \"6d40ad28-6278-4b42-9b05-243af0281e54\") " pod="kube-system/coredns-668d6bf9bc-gpm87" May 27 03:24:04.019740 kubelet[3089]: I0527 03:24:04.019669 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-volume\" (UniqueName: \"kubernetes.io/configmap/39aabb06-801e-4e8a-8388-57aa96ec69b4-config-volume\") pod \"coredns-668d6bf9bc-klgls\" (UID: \"39aabb06-801e-4e8a-8388-57aa96ec69b4\") " pod="kube-system/coredns-668d6bf9bc-klgls" May 27 03:24:04.019740 kubelet[3089]: I0527 03:24:04.019684 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbtbm\" (UniqueName: \"kubernetes.io/projected/39aabb06-801e-4e8a-8388-57aa96ec69b4-kube-api-access-tbtbm\") pod \"coredns-668d6bf9bc-klgls\" (UID: \"39aabb06-801e-4e8a-8388-57aa96ec69b4\") " pod="kube-system/coredns-668d6bf9bc-klgls" May 27 03:24:04.019740 kubelet[3089]: I0527 03:24:04.019700 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bm8nf\" (UniqueName: \"kubernetes.io/projected/8dd14faa-5472-4e5c-8d25-d39e38963942-kube-api-access-bm8nf\") pod \"calico-apiserver-75bd958cdb-wx7mp\" (UID: \"8dd14faa-5472-4e5c-8d25-d39e38963942\") " pod="calico-apiserver/calico-apiserver-75bd958cdb-wx7mp" May 27 03:24:04.019740 kubelet[3089]: I0527 03:24:04.019714 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8180663d-0a14-4da1-a83a-e7dbd72a3606-goldmane-ca-bundle\") pod \"goldmane-78d55f7ddc-f7587\" (UID: \"8180663d-0a14-4da1-a83a-e7dbd72a3606\") " pod="calico-system/goldmane-78d55f7ddc-f7587" May 27 03:24:04.019845 kubelet[3089]: I0527 03:24:04.019729 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lm5n\" (UniqueName: \"kubernetes.io/projected/8180663d-0a14-4da1-a83a-e7dbd72a3606-kube-api-access-9lm5n\") pod \"goldmane-78d55f7ddc-f7587\" (UID: \"8180663d-0a14-4da1-a83a-e7dbd72a3606\") " pod="calico-system/goldmane-78d55f7ddc-f7587" May 27 03:24:04.019845 kubelet[3089]: I0527 03:24:04.019746 3089 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhlfd\" (UniqueName: \"kubernetes.io/projected/40f9ea1a-358f-4451-a1dd-b953445e89db-kube-api-access-hhlfd\") pod \"whisker-549c5bcfd5-lccgk\" (UID: \"40f9ea1a-358f-4451-a1dd-b953445e89db\") " pod="calico-system/whisker-549c5bcfd5-lccgk" May 27 03:24:04.019845 kubelet[3089]: I0527 03:24:04.019762 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2669f22e-e758-461f-86ef-0334f39a45eb-calico-apiserver-certs\") pod \"calico-apiserver-75bd958cdb-ldwfw\" (UID: \"2669f22e-e758-461f-86ef-0334f39a45eb\") " pod="calico-apiserver/calico-apiserver-75bd958cdb-ldwfw" May 27 03:24:04.019845 kubelet[3089]: I0527 03:24:04.019777 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/275414b6-21b5-4cc2-8a95-b078295361dc-tigera-ca-bundle\") pod \"calico-kube-controllers-7c69cc6f75-wn88n\" (UID: \"275414b6-21b5-4cc2-8a95-b078295361dc\") " pod="calico-system/calico-kube-controllers-7c69cc6f75-wn88n" May 27 03:24:04.019845 kubelet[3089]: I0527 03:24:04.019793 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99ck7\" (UniqueName: \"kubernetes.io/projected/2669f22e-e758-461f-86ef-0334f39a45eb-kube-api-access-99ck7\") pod \"calico-apiserver-75bd958cdb-ldwfw\" (UID: \"2669f22e-e758-461f-86ef-0334f39a45eb\") " pod="calico-apiserver/calico-apiserver-75bd958cdb-ldwfw" May 27 03:24:04.019945 kubelet[3089]: I0527 03:24:04.019812 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/40f9ea1a-358f-4451-a1dd-b953445e89db-whisker-backend-key-pair\") pod \"whisker-549c5bcfd5-lccgk\" (UID: 
\"40f9ea1a-358f-4451-a1dd-b953445e89db\") " pod="calico-system/whisker-549c5bcfd5-lccgk" May 27 03:24:04.019945 kubelet[3089]: I0527 03:24:04.019827 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/8180663d-0a14-4da1-a83a-e7dbd72a3606-goldmane-key-pair\") pod \"goldmane-78d55f7ddc-f7587\" (UID: \"8180663d-0a14-4da1-a83a-e7dbd72a3606\") " pod="calico-system/goldmane-78d55f7ddc-f7587" May 27 03:24:04.019945 kubelet[3089]: I0527 03:24:04.019845 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/8dd14faa-5472-4e5c-8d25-d39e38963942-calico-apiserver-certs\") pod \"calico-apiserver-75bd958cdb-wx7mp\" (UID: \"8dd14faa-5472-4e5c-8d25-d39e38963942\") " pod="calico-apiserver/calico-apiserver-75bd958cdb-wx7mp" May 27 03:24:04.019945 kubelet[3089]: I0527 03:24:04.019863 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttqnm\" (UniqueName: \"kubernetes.io/projected/275414b6-21b5-4cc2-8a95-b078295361dc-kube-api-access-ttqnm\") pod \"calico-kube-controllers-7c69cc6f75-wn88n\" (UID: \"275414b6-21b5-4cc2-8a95-b078295361dc\") " pod="calico-system/calico-kube-controllers-7c69cc6f75-wn88n" May 27 03:24:04.019945 kubelet[3089]: I0527 03:24:04.019878 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6d40ad28-6278-4b42-9b05-243af0281e54-config-volume\") pod \"coredns-668d6bf9bc-gpm87\" (UID: \"6d40ad28-6278-4b42-9b05-243af0281e54\") " pod="kube-system/coredns-668d6bf9bc-gpm87" May 27 03:24:04.285905 containerd[1708]: time="2025-05-27T03:24:04.284329927Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-668d6bf9bc-gpm87,Uid:6d40ad28-6278-4b42-9b05-243af0281e54,Namespace:kube-system,Attempt:0,}" May 27 03:24:04.289778 containerd[1708]: time="2025-05-27T03:24:04.289592782Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bd958cdb-wx7mp,Uid:8dd14faa-5472-4e5c-8d25-d39e38963942,Namespace:calico-apiserver,Attempt:0,}" May 27 03:24:04.545211 containerd[1708]: time="2025-05-27T03:24:04.545078473Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c69cc6f75-wn88n,Uid:275414b6-21b5-4cc2-8a95-b078295361dc,Namespace:calico-system,Attempt:0,}" May 27 03:24:04.560589 containerd[1708]: time="2025-05-27T03:24:04.560555618Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bd958cdb-ldwfw,Uid:2669f22e-e758-461f-86ef-0334f39a45eb,Namespace:calico-apiserver,Attempt:0,}" May 27 03:24:04.567082 containerd[1708]: time="2025-05-27T03:24:04.567046029Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-f7587,Uid:8180663d-0a14-4da1-a83a-e7dbd72a3606,Namespace:calico-system,Attempt:0,}" May 27 03:24:04.573495 containerd[1708]: time="2025-05-27T03:24:04.573470073Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-549c5bcfd5-lccgk,Uid:40f9ea1a-358f-4451-a1dd-b953445e89db,Namespace:calico-system,Attempt:0,}" May 27 03:24:04.580928 containerd[1708]: time="2025-05-27T03:24:04.580883682Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-klgls,Uid:39aabb06-801e-4e8a-8388-57aa96ec69b4,Namespace:kube-system,Attempt:0,}" May 27 03:24:04.856045 containerd[1708]: time="2025-05-27T03:24:04.855520405Z" level=error msg="Failed to destroy network for sandbox \"7ff02e8c59929f833dec75aca2cf9e5f75d857b4d1ab9488467eb39472dbeaec\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" May 27 03:24:04.857153 systemd[1]: run-netns-cni\x2d9231aa66\x2d845e\x2df804\x2d703f\x2d0199559fd31b.mount: Deactivated successfully. May 27 03:24:04.863683 containerd[1708]: time="2025-05-27T03:24:04.863647601Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bd958cdb-wx7mp,Uid:8dd14faa-5472-4e5c-8d25-d39e38963942,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ff02e8c59929f833dec75aca2cf9e5f75d857b4d1ab9488467eb39472dbeaec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.863852 kubelet[3089]: E0527 03:24:04.863821 3089 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ff02e8c59929f833dec75aca2cf9e5f75d857b4d1ab9488467eb39472dbeaec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.864061 kubelet[3089]: E0527 03:24:04.863875 3089 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ff02e8c59929f833dec75aca2cf9e5f75d857b4d1ab9488467eb39472dbeaec\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75bd958cdb-wx7mp" May 27 03:24:04.864061 kubelet[3089]: E0527 03:24:04.863892 3089 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ff02e8c59929f833dec75aca2cf9e5f75d857b4d1ab9488467eb39472dbeaec\": plugin type=\"calico\" failed (add): 
stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75bd958cdb-wx7mp" May 27 03:24:04.864061 kubelet[3089]: E0527 03:24:04.863923 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75bd958cdb-wx7mp_calico-apiserver(8dd14faa-5472-4e5c-8d25-d39e38963942)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75bd958cdb-wx7mp_calico-apiserver(8dd14faa-5472-4e5c-8d25-d39e38963942)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7ff02e8c59929f833dec75aca2cf9e5f75d857b4d1ab9488467eb39472dbeaec\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-75bd958cdb-wx7mp" podUID="8dd14faa-5472-4e5c-8d25-d39e38963942" May 27 03:24:04.870024 containerd[1708]: time="2025-05-27T03:24:04.869988576Z" level=error msg="Failed to destroy network for sandbox \"8013d4042de8492f35b15d478ae63aefb52ef41ebf811f0287f32b55be33678f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.872958 systemd[1]: run-netns-cni\x2de818f3c1\x2d9597\x2df031\x2dbeda\x2dd31b7fe717e3.mount: Deactivated successfully. 
May 27 03:24:04.875375 containerd[1708]: time="2025-05-27T03:24:04.875342436Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gpm87,Uid:6d40ad28-6278-4b42-9b05-243af0281e54,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"8013d4042de8492f35b15d478ae63aefb52ef41ebf811f0287f32b55be33678f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.876576 kubelet[3089]: E0527 03:24:04.875533 3089 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8013d4042de8492f35b15d478ae63aefb52ef41ebf811f0287f32b55be33678f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.876576 kubelet[3089]: E0527 03:24:04.875589 3089 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8013d4042de8492f35b15d478ae63aefb52ef41ebf811f0287f32b55be33678f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-gpm87" May 27 03:24:04.876576 kubelet[3089]: E0527 03:24:04.875608 3089 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"8013d4042de8492f35b15d478ae63aefb52ef41ebf811f0287f32b55be33678f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-668d6bf9bc-gpm87" May 27 03:24:04.876702 kubelet[3089]: E0527 03:24:04.875652 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-gpm87_kube-system(6d40ad28-6278-4b42-9b05-243af0281e54)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-gpm87_kube-system(6d40ad28-6278-4b42-9b05-243af0281e54)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"8013d4042de8492f35b15d478ae63aefb52ef41ebf811f0287f32b55be33678f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-gpm87" podUID="6d40ad28-6278-4b42-9b05-243af0281e54" May 27 03:24:04.923191 containerd[1708]: time="2025-05-27T03:24:04.923168153Z" level=error msg="Failed to destroy network for sandbox \"46c17360924333dd11795bdd94c14ca107f11d9f672bc2002c1434dd91599d4c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.925783 systemd[1]: run-netns-cni\x2d73a20e50\x2d4589\x2d8154\x2d7f49\x2d1edc19ce45ca.mount: Deactivated successfully. 
May 27 03:24:04.927036 containerd[1708]: time="2025-05-27T03:24:04.926983846Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c69cc6f75-wn88n,Uid:275414b6-21b5-4cc2-8a95-b078295361dc,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"46c17360924333dd11795bdd94c14ca107f11d9f672bc2002c1434dd91599d4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.927951 kubelet[3089]: E0527 03:24:04.927529 3089 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46c17360924333dd11795bdd94c14ca107f11d9f672bc2002c1434dd91599d4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.927951 kubelet[3089]: E0527 03:24:04.927568 3089 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46c17360924333dd11795bdd94c14ca107f11d9f672bc2002c1434dd91599d4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7c69cc6f75-wn88n" May 27 03:24:04.927951 kubelet[3089]: E0527 03:24:04.927586 3089 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"46c17360924333dd11795bdd94c14ca107f11d9f672bc2002c1434dd91599d4c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/calico-kube-controllers-7c69cc6f75-wn88n" May 27 03:24:04.928072 kubelet[3089]: E0527 03:24:04.927616 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7c69cc6f75-wn88n_calico-system(275414b6-21b5-4cc2-8a95-b078295361dc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7c69cc6f75-wn88n_calico-system(275414b6-21b5-4cc2-8a95-b078295361dc)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"46c17360924333dd11795bdd94c14ca107f11d9f672bc2002c1434dd91599d4c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7c69cc6f75-wn88n" podUID="275414b6-21b5-4cc2-8a95-b078295361dc" May 27 03:24:04.933141 containerd[1708]: time="2025-05-27T03:24:04.931556310Z" level=error msg="Failed to destroy network for sandbox \"432edc9e6b638367d82fcf091be2787fe98d491a0ce213b86c9700ac9fe88f65\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.934240 systemd[1]: run-netns-cni\x2d7a9c423e\x2d9c48\x2da4ae\x2d0f9e\x2d1dacc3de554c.mount: Deactivated successfully. 
May 27 03:24:04.936355 containerd[1708]: time="2025-05-27T03:24:04.936324014Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-f7587,Uid:8180663d-0a14-4da1-a83a-e7dbd72a3606,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"432edc9e6b638367d82fcf091be2787fe98d491a0ce213b86c9700ac9fe88f65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.936497 kubelet[3089]: E0527 03:24:04.936476 3089 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"432edc9e6b638367d82fcf091be2787fe98d491a0ce213b86c9700ac9fe88f65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.936548 kubelet[3089]: E0527 03:24:04.936512 3089 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"432edc9e6b638367d82fcf091be2787fe98d491a0ce213b86c9700ac9fe88f65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-78d55f7ddc-f7587" May 27 03:24:04.936576 kubelet[3089]: E0527 03:24:04.936568 3089 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"432edc9e6b638367d82fcf091be2787fe98d491a0ce213b86c9700ac9fe88f65\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="calico-system/goldmane-78d55f7ddc-f7587" May 27 03:24:04.936682 kubelet[3089]: E0527 03:24:04.936665 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-78d55f7ddc-f7587_calico-system(8180663d-0a14-4da1-a83a-e7dbd72a3606)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-78d55f7ddc-f7587_calico-system(8180663d-0a14-4da1-a83a-e7dbd72a3606)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"432edc9e6b638367d82fcf091be2787fe98d491a0ce213b86c9700ac9fe88f65\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:24:04.939104 containerd[1708]: time="2025-05-27T03:24:04.939080338Z" level=error msg="Failed to destroy network for sandbox \"9164e7c096d625b2156f7303e95298e99db6d28a19c9bd541d9778b0a9cba8ca\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.939302 containerd[1708]: time="2025-05-27T03:24:04.939100341Z" level=error msg="Failed to destroy network for sandbox \"6efa937077f7db920fbfbc582dfb6eddac2e52292e4b490f616a01c973bee9ea\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.941547 containerd[1708]: time="2025-05-27T03:24:04.941514987Z" level=error msg="Failed to destroy network for sandbox \"72b3a78803b3e726a8afe18d1aaa51145d3c0c43a6b0131159a3190a109a7fba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" May 27 03:24:04.942646 containerd[1708]: time="2025-05-27T03:24:04.942614970Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-klgls,Uid:39aabb06-801e-4e8a-8388-57aa96ec69b4,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6efa937077f7db920fbfbc582dfb6eddac2e52292e4b490f616a01c973bee9ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.942863 kubelet[3089]: E0527 03:24:04.942842 3089 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6efa937077f7db920fbfbc582dfb6eddac2e52292e4b490f616a01c973bee9ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.942914 kubelet[3089]: E0527 03:24:04.942872 3089 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6efa937077f7db920fbfbc582dfb6eddac2e52292e4b490f616a01c973bee9ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-klgls" May 27 03:24:04.942914 kubelet[3089]: E0527 03:24:04.942897 3089 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6efa937077f7db920fbfbc582dfb6eddac2e52292e4b490f616a01c973bee9ea\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" 
pod="kube-system/coredns-668d6bf9bc-klgls" May 27 03:24:04.942966 kubelet[3089]: E0527 03:24:04.942928 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-klgls_kube-system(39aabb06-801e-4e8a-8388-57aa96ec69b4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-klgls_kube-system(39aabb06-801e-4e8a-8388-57aa96ec69b4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6efa937077f7db920fbfbc582dfb6eddac2e52292e4b490f616a01c973bee9ea\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-klgls" podUID="39aabb06-801e-4e8a-8388-57aa96ec69b4" May 27 03:24:04.944854 containerd[1708]: time="2025-05-27T03:24:04.944800744Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bd958cdb-ldwfw,Uid:2669f22e-e758-461f-86ef-0334f39a45eb,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9164e7c096d625b2156f7303e95298e99db6d28a19c9bd541d9778b0a9cba8ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.944956 kubelet[3089]: E0527 03:24:04.944929 3089 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9164e7c096d625b2156f7303e95298e99db6d28a19c9bd541d9778b0a9cba8ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.945012 kubelet[3089]: E0527 03:24:04.944963 3089 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" 
err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9164e7c096d625b2156f7303e95298e99db6d28a19c9bd541d9778b0a9cba8ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75bd958cdb-ldwfw" May 27 03:24:04.945056 kubelet[3089]: E0527 03:24:04.945017 3089 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9164e7c096d625b2156f7303e95298e99db6d28a19c9bd541d9778b0a9cba8ca\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-75bd958cdb-ldwfw" May 27 03:24:04.945081 kubelet[3089]: E0527 03:24:04.945049 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-75bd958cdb-ldwfw_calico-apiserver(2669f22e-e758-461f-86ef-0334f39a45eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-75bd958cdb-ldwfw_calico-apiserver(2669f22e-e758-461f-86ef-0334f39a45eb)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9164e7c096d625b2156f7303e95298e99db6d28a19c9bd541d9778b0a9cba8ca\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-75bd958cdb-ldwfw" podUID="2669f22e-e758-461f-86ef-0334f39a45eb" May 27 03:24:04.946889 containerd[1708]: time="2025-05-27T03:24:04.946862596Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-549c5bcfd5-lccgk,Uid:40f9ea1a-358f-4451-a1dd-b953445e89db,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: 
code = Unknown desc = failed to setup network for sandbox \"72b3a78803b3e726a8afe18d1aaa51145d3c0c43a6b0131159a3190a109a7fba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.947008 kubelet[3089]: E0527 03:24:04.946973 3089 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72b3a78803b3e726a8afe18d1aaa51145d3c0c43a6b0131159a3190a109a7fba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:04.947008 kubelet[3089]: E0527 03:24:04.947003 3089 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72b3a78803b3e726a8afe18d1aaa51145d3c0c43a6b0131159a3190a109a7fba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-549c5bcfd5-lccgk" May 27 03:24:04.947085 kubelet[3089]: E0527 03:24:04.947020 3089 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"72b3a78803b3e726a8afe18d1aaa51145d3c0c43a6b0131159a3190a109a7fba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-549c5bcfd5-lccgk" May 27 03:24:04.947085 kubelet[3089]: E0527 03:24:04.947049 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-549c5bcfd5-lccgk_calico-system(40f9ea1a-358f-4451-a1dd-b953445e89db)\" with CreatePodSandboxError: 
\"Failed to create sandbox for pod \\\"whisker-549c5bcfd5-lccgk_calico-system(40f9ea1a-358f-4451-a1dd-b953445e89db)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"72b3a78803b3e726a8afe18d1aaa51145d3c0c43a6b0131159a3190a109a7fba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-549c5bcfd5-lccgk" podUID="40f9ea1a-358f-4451-a1dd-b953445e89db" May 27 03:24:05.248043 systemd[1]: Created slice kubepods-besteffort-pod27ed5e5b_c569_4ae0_93dc_3358bb46ea0c.slice - libcontainer container kubepods-besteffort-pod27ed5e5b_c569_4ae0_93dc_3358bb46ea0c.slice. May 27 03:24:05.249944 containerd[1708]: time="2025-05-27T03:24:05.249926145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9cggb,Uid:27ed5e5b-c569-4ae0-93dc-3358bb46ea0c,Namespace:calico-system,Attempt:0,}" May 27 03:24:05.285915 containerd[1708]: time="2025-05-27T03:24:05.285883959Z" level=error msg="Failed to destroy network for sandbox \"05d687f8ae1083d5182cca8c6a15192ce90b0b982fc135f31ed53aeacfd0a50d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:05.288401 containerd[1708]: time="2025-05-27T03:24:05.288357820Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9cggb,Uid:27ed5e5b-c569-4ae0-93dc-3358bb46ea0c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d687f8ae1083d5182cca8c6a15192ce90b0b982fc135f31ed53aeacfd0a50d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:05.288582 kubelet[3089]: E0527 
03:24:05.288557 3089 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d687f8ae1083d5182cca8c6a15192ce90b0b982fc135f31ed53aeacfd0a50d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" May 27 03:24:05.288634 kubelet[3089]: E0527 03:24:05.288588 3089 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d687f8ae1083d5182cca8c6a15192ce90b0b982fc135f31ed53aeacfd0a50d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9cggb" May 27 03:24:05.288634 kubelet[3089]: E0527 03:24:05.288608 3089 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"05d687f8ae1083d5182cca8c6a15192ce90b0b982fc135f31ed53aeacfd0a50d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-9cggb" May 27 03:24:05.290031 kubelet[3089]: E0527 03:24:05.289875 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-9cggb_calico-system(27ed5e5b-c569-4ae0-93dc-3358bb46ea0c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-9cggb_calico-system(27ed5e5b-c569-4ae0-93dc-3358bb46ea0c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"05d687f8ae1083d5182cca8c6a15192ce90b0b982fc135f31ed53aeacfd0a50d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: 
check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-9cggb" podUID="27ed5e5b-c569-4ae0-93dc-3358bb46ea0c" May 27 03:24:05.336911 containerd[1708]: time="2025-05-27T03:24:05.336744625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\"" May 27 03:24:05.825780 systemd[1]: run-netns-cni\x2d218471b4\x2d04de\x2d089d\x2d41e4\x2da5a8efc239bf.mount: Deactivated successfully. May 27 03:24:05.825848 systemd[1]: run-netns-cni\x2d9b6b9f3f\x2d1456\x2db1a1\x2dc1f6\x2d2e6db52af495.mount: Deactivated successfully. May 27 03:24:05.825889 systemd[1]: run-netns-cni\x2d189ed67d\x2d76a6\x2d7726\x2d987e\x2d924846fd884c.mount: Deactivated successfully. May 27 03:24:09.199125 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3894087458.mount: Deactivated successfully. May 27 03:24:09.220031 containerd[1708]: time="2025-05-27T03:24:09.220001235Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:24:09.222271 containerd[1708]: time="2025-05-27T03:24:09.222245597Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.0: active requests=0, bytes read=156396372" May 27 03:24:09.224414 containerd[1708]: time="2025-05-27T03:24:09.224356307Z" level=info msg="ImageCreate event name:\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:24:09.227204 containerd[1708]: time="2025-05-27T03:24:09.227166142Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:24:09.227434 containerd[1708]: time="2025-05-27T03:24:09.227396417Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.0\" with image id 
\"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node@sha256:7cb61ea47ca0a8e6d0526a42da4f1e399b37ccd13339d0776d272465cb7ee012\", size \"156396234\" in 3.890626242s" May 27 03:24:09.227434 containerd[1708]: time="2025-05-27T03:24:09.227421297Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.0\" returns image reference \"sha256:d12dae9bc0999225efe30fd5618bcf2195709d54ed2840234f5006aab5f7d721\"" May 27 03:24:09.236826 containerd[1708]: time="2025-05-27T03:24:09.236805293Z" level=info msg="CreateContainer within sandbox \"49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" May 27 03:24:09.255018 containerd[1708]: time="2025-05-27T03:24:09.253194974Z" level=info msg="Container dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac: CDI devices from CRI Config.CDIDevices: []" May 27 03:24:09.266771 containerd[1708]: time="2025-05-27T03:24:09.266748378Z" level=info msg="CreateContainer within sandbox \"49a969b2e92c38ea9b7d63041a1f001596bece357c7919eb16ed5f1687d1b140\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\"" May 27 03:24:09.267141 containerd[1708]: time="2025-05-27T03:24:09.267096363Z" level=info msg="StartContainer for \"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\"" May 27 03:24:09.268493 containerd[1708]: time="2025-05-27T03:24:09.268467939Z" level=info msg="connecting to shim dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac" address="unix:///run/containerd/s/96807204bc5f2e1951558b4c017e68cbb16cacda6c43daaf579516713637607a" protocol=ttrpc version=3 May 27 03:24:09.285542 systemd[1]: Started cri-containerd-dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac.scope - libcontainer container 
dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac. May 27 03:24:09.313665 containerd[1708]: time="2025-05-27T03:24:09.313633203Z" level=info msg="StartContainer for \"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\" returns successfully" May 27 03:24:09.363587 kubelet[3089]: I0527 03:24:09.363308 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-926st" podStartSLOduration=1.309663667 podStartE2EDuration="16.363292353s" podCreationTimestamp="2025-05-27 03:23:53 +0000 UTC" firstStartedPulling="2025-05-27 03:23:54.174282479 +0000 UTC m=+18.007159916" lastFinishedPulling="2025-05-27 03:24:09.227911171 +0000 UTC m=+33.060788602" observedRunningTime="2025-05-27 03:24:09.360790496 +0000 UTC m=+33.193667931" watchObservedRunningTime="2025-05-27 03:24:09.363292353 +0000 UTC m=+33.196169833" May 27 03:24:09.401657 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. May 27 03:24:09.401717 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. 
May 27 03:24:09.552912 kubelet[3089]: I0527 03:24:09.552854 3089 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/40f9ea1a-358f-4451-a1dd-b953445e89db-whisker-backend-key-pair\") pod \"40f9ea1a-358f-4451-a1dd-b953445e89db\" (UID: \"40f9ea1a-358f-4451-a1dd-b953445e89db\") " May 27 03:24:09.552912 kubelet[3089]: I0527 03:24:09.552891 3089 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hhlfd\" (UniqueName: \"kubernetes.io/projected/40f9ea1a-358f-4451-a1dd-b953445e89db-kube-api-access-hhlfd\") pod \"40f9ea1a-358f-4451-a1dd-b953445e89db\" (UID: \"40f9ea1a-358f-4451-a1dd-b953445e89db\") " May 27 03:24:09.553002 kubelet[3089]: I0527 03:24:09.552914 3089 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f9ea1a-358f-4451-a1dd-b953445e89db-whisker-ca-bundle\") pod \"40f9ea1a-358f-4451-a1dd-b953445e89db\" (UID: \"40f9ea1a-358f-4451-a1dd-b953445e89db\") " May 27 03:24:09.553199 kubelet[3089]: I0527 03:24:09.553181 3089 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f9ea1a-358f-4451-a1dd-b953445e89db-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "40f9ea1a-358f-4451-a1dd-b953445e89db" (UID: "40f9ea1a-358f-4451-a1dd-b953445e89db"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" May 27 03:24:09.556669 kubelet[3089]: I0527 03:24:09.556520 3089 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f9ea1a-358f-4451-a1dd-b953445e89db-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "40f9ea1a-358f-4451-a1dd-b953445e89db" (UID: "40f9ea1a-358f-4451-a1dd-b953445e89db"). InnerVolumeSpecName "whisker-backend-key-pair". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" May 27 03:24:09.556669 kubelet[3089]: I0527 03:24:09.556634 3089 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f9ea1a-358f-4451-a1dd-b953445e89db-kube-api-access-hhlfd" (OuterVolumeSpecName: "kube-api-access-hhlfd") pod "40f9ea1a-358f-4451-a1dd-b953445e89db" (UID: "40f9ea1a-358f-4451-a1dd-b953445e89db"). InnerVolumeSpecName "kube-api-access-hhlfd". PluginName "kubernetes.io/projected", VolumeGIDValue "" May 27 03:24:09.653568 kubelet[3089]: I0527 03:24:09.653548 3089 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hhlfd\" (UniqueName: \"kubernetes.io/projected/40f9ea1a-358f-4451-a1dd-b953445e89db-kube-api-access-hhlfd\") on node \"ci-4344.0.0-a-c2c0d8ddb2\" DevicePath \"\"" May 27 03:24:09.653568 kubelet[3089]: I0527 03:24:09.653568 3089 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/40f9ea1a-358f-4451-a1dd-b953445e89db-whisker-backend-key-pair\") on node \"ci-4344.0.0-a-c2c0d8ddb2\" DevicePath \"\"" May 27 03:24:09.653660 kubelet[3089]: I0527 03:24:09.653577 3089 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f9ea1a-358f-4451-a1dd-b953445e89db-whisker-ca-bundle\") on node \"ci-4344.0.0-a-c2c0d8ddb2\" DevicePath \"\"" May 27 03:24:10.199596 systemd[1]: var-lib-kubelet-pods-40f9ea1a\x2d358f\x2d4451\x2da1dd\x2db953445e89db-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dhhlfd.mount: Deactivated successfully. May 27 03:24:10.199678 systemd[1]: var-lib-kubelet-pods-40f9ea1a\x2d358f\x2d4451\x2da1dd\x2db953445e89db-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
May 27 03:24:10.248236 systemd[1]: Removed slice kubepods-besteffort-pod40f9ea1a_358f_4451_a1dd_b953445e89db.slice - libcontainer container kubepods-besteffort-pod40f9ea1a_358f_4451_a1dd_b953445e89db.slice. May 27 03:24:10.425840 systemd[1]: Created slice kubepods-besteffort-podf608e1b5_7f91_494e_a0c3_fc01e99dfa04.slice - libcontainer container kubepods-besteffort-podf608e1b5_7f91_494e_a0c3_fc01e99dfa04.slice. May 27 03:24:10.457900 kubelet[3089]: I0527 03:24:10.457839 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrmjh\" (UniqueName: \"kubernetes.io/projected/f608e1b5-7f91-494e-a0c3-fc01e99dfa04-kube-api-access-wrmjh\") pod \"whisker-c8f547bcc-pgf67\" (UID: \"f608e1b5-7f91-494e-a0c3-fc01e99dfa04\") " pod="calico-system/whisker-c8f547bcc-pgf67" May 27 03:24:10.457900 kubelet[3089]: I0527 03:24:10.457880 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f608e1b5-7f91-494e-a0c3-fc01e99dfa04-whisker-ca-bundle\") pod \"whisker-c8f547bcc-pgf67\" (UID: \"f608e1b5-7f91-494e-a0c3-fc01e99dfa04\") " pod="calico-system/whisker-c8f547bcc-pgf67" May 27 03:24:10.457900 kubelet[3089]: I0527 03:24:10.457898 3089 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/f608e1b5-7f91-494e-a0c3-fc01e99dfa04-whisker-backend-key-pair\") pod \"whisker-c8f547bcc-pgf67\" (UID: \"f608e1b5-7f91-494e-a0c3-fc01e99dfa04\") " pod="calico-system/whisker-c8f547bcc-pgf67" May 27 03:24:10.729229 containerd[1708]: time="2025-05-27T03:24:10.728983355Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c8f547bcc-pgf67,Uid:f608e1b5-7f91-494e-a0c3-fc01e99dfa04,Namespace:calico-system,Attempt:0,}" May 27 03:24:10.880518 systemd-networkd[1584]: califf5061f711e: Link UP May 27 03:24:10.881943 
systemd-networkd[1584]: califf5061f711e: Gained carrier May 27 03:24:10.905770 containerd[1708]: 2025-05-27 03:24:10.772 [INFO][4255] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist May 27 03:24:10.905770 containerd[1708]: 2025-05-27 03:24:10.782 [INFO][4255] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0 whisker-c8f547bcc- calico-system f608e1b5-7f91-494e-a0c3-fc01e99dfa04 853 0 2025-05-27 03:24:10 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:c8f547bcc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4344.0.0-a-c2c0d8ddb2 whisker-c8f547bcc-pgf67 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] califf5061f711e [] [] }} ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Namespace="calico-system" Pod="whisker-c8f547bcc-pgf67" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-" May 27 03:24:10.905770 containerd[1708]: 2025-05-27 03:24:10.782 [INFO][4255] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Namespace="calico-system" Pod="whisker-c8f547bcc-pgf67" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" May 27 03:24:10.905770 containerd[1708]: 2025-05-27 03:24:10.832 [INFO][4267] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" HandleID="k8s-pod-network.a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" May 27 03:24:10.906042 containerd[1708]: 2025-05-27 03:24:10.832 [INFO][4267] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" HandleID="k8s-pod-network.a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc00030c0c0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-c2c0d8ddb2", "pod":"whisker-c8f547bcc-pgf67", "timestamp":"2025-05-27 03:24:10.832428726 +0000 UTC"}, Hostname:"ci-4344.0.0-a-c2c0d8ddb2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:24:10.906042 containerd[1708]: 2025-05-27 03:24:10.832 [INFO][4267] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:24:10.906042 containerd[1708]: 2025-05-27 03:24:10.833 [INFO][4267] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:24:10.906042 containerd[1708]: 2025-05-27 03:24:10.833 [INFO][4267] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-c2c0d8ddb2' May 27 03:24:10.906042 containerd[1708]: 2025-05-27 03:24:10.838 [INFO][4267] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:10.906042 containerd[1708]: 2025-05-27 03:24:10.842 [INFO][4267] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:10.906042 containerd[1708]: 2025-05-27 03:24:10.845 [INFO][4267] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:10.906042 containerd[1708]: 2025-05-27 03:24:10.850 [INFO][4267] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:10.906042 containerd[1708]: 2025-05-27 03:24:10.852 [INFO][4267] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:10.906222 containerd[1708]: 2025-05-27 03:24:10.852 [INFO][4267] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:10.906222 containerd[1708]: 2025-05-27 03:24:10.853 [INFO][4267] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3 May 27 03:24:10.906222 containerd[1708]: 2025-05-27 03:24:10.857 [INFO][4267] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:10.906222 containerd[1708]: 2025-05-27 03:24:10.864 [INFO][4267] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.55.1/26] block=192.168.55.0/26 handle="k8s-pod-network.a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:10.906222 containerd[1708]: 2025-05-27 03:24:10.864 [INFO][4267] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.1/26] handle="k8s-pod-network.a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:10.906222 containerd[1708]: 2025-05-27 03:24:10.864 [INFO][4267] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:24:10.906222 containerd[1708]: 2025-05-27 03:24:10.864 [INFO][4267] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.1/26] IPv6=[] ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" HandleID="k8s-pod-network.a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" May 27 03:24:10.907131 containerd[1708]: 2025-05-27 03:24:10.868 [INFO][4255] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Namespace="calico-system" Pod="whisker-c8f547bcc-pgf67" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0", GenerateName:"whisker-c8f547bcc-", Namespace:"calico-system", SelfLink:"", UID:"f608e1b5-7f91-494e-a0c3-fc01e99dfa04", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 24, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"c8f547bcc", "projectcalico.org/namespace":"calico-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"", Pod:"whisker-c8f547bcc-pgf67", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.55.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"califf5061f711e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:10.907131 containerd[1708]: 2025-05-27 03:24:10.868 [INFO][4255] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.1/32] ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Namespace="calico-system" Pod="whisker-c8f547bcc-pgf67" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" May 27 03:24:10.907202 containerd[1708]: 2025-05-27 03:24:10.868 [INFO][4255] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to califf5061f711e ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Namespace="calico-system" Pod="whisker-c8f547bcc-pgf67" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" May 27 03:24:10.907202 containerd[1708]: 2025-05-27 03:24:10.882 [INFO][4255] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Namespace="calico-system" Pod="whisker-c8f547bcc-pgf67" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" May 27 03:24:10.907234 containerd[1708]: 2025-05-27 03:24:10.883 [INFO][4255] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to 
endpoint ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Namespace="calico-system" Pod="whisker-c8f547bcc-pgf67" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0", GenerateName:"whisker-c8f547bcc-", Namespace:"calico-system", SelfLink:"", UID:"f608e1b5-7f91-494e-a0c3-fc01e99dfa04", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 24, 10, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"c8f547bcc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3", Pod:"whisker-c8f547bcc-pgf67", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.55.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"califf5061f711e", MAC:"96:fc:e1:63:ea:e9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:10.907272 containerd[1708]: 2025-05-27 03:24:10.903 [INFO][4255] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" Namespace="calico-system" Pod="whisker-c8f547bcc-pgf67" 
WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-whisker--c8f547bcc--pgf67-eth0" May 27 03:24:10.935726 containerd[1708]: time="2025-05-27T03:24:10.935699468Z" level=info msg="connecting to shim a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3" address="unix:///run/containerd/s/1940fef5e68b700751e41f38f36977713c6f2577e98c1bfccd8594a403e391ae" namespace=k8s.io protocol=ttrpc version=3 May 27 03:24:10.952545 systemd[1]: Started cri-containerd-a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3.scope - libcontainer container a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3. May 27 03:24:10.983159 containerd[1708]: time="2025-05-27T03:24:10.983112911Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-c8f547bcc-pgf67,Uid:f608e1b5-7f91-494e-a0c3-fc01e99dfa04,Namespace:calico-system,Attempt:0,} returns sandbox id \"a39512c096bd7b68de984d851452b64c4ec2e1badab462e7ad76fc7f156af7b3\"" May 27 03:24:10.984925 containerd[1708]: time="2025-05-27T03:24:10.984853015Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:24:11.179774 containerd[1708]: time="2025-05-27T03:24:11.179737338Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:24:11.182025 containerd[1708]: time="2025-05-27T03:24:11.182001256Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:24:11.182095 
containerd[1708]: time="2025-05-27T03:24:11.182006934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:24:11.182187 kubelet[3089]: E0527 03:24:11.182149 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:24:11.182264 kubelet[3089]: E0527 03:24:11.182198 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:24:11.182374 kubelet[3089]: E0527 03:24:11.182308 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:903c8ceec2c64dbb8528b709b606411a,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrmjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c8f547bcc-pgf67_calico-system(f608e1b5-7f91-494e-a0c3-fc01e99dfa04): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:24:11.184160 containerd[1708]: 
time="2025-05-27T03:24:11.184130074Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:24:11.348431 containerd[1708]: time="2025-05-27T03:24:11.348301803Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:24:11.350994 containerd[1708]: time="2025-05-27T03:24:11.350965284Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:24:11.351101 containerd[1708]: time="2025-05-27T03:24:11.350979175Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:24:11.351131 kubelet[3089]: E0527 03:24:11.351082 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:24:11.351131 kubelet[3089]: E0527 03:24:11.351111 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:24:11.351237 kubelet[3089]: E0527 03:24:11.351198 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrmjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMou
nt:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c8f547bcc-pgf67_calico-system(f608e1b5-7f91-494e-a0c3-fc01e99dfa04): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:24:11.352472 kubelet[3089]: E0527 03:24:11.352404 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:24:11.960542 systemd-networkd[1584]: califf5061f711e: Gained IPv6LL 
May 27 03:24:12.136315 kubelet[3089]: I0527 03:24:12.136157 3089 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:24:12.246430 kubelet[3089]: I0527 03:24:12.246381 3089 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40f9ea1a-358f-4451-a1dd-b953445e89db" path="/var/lib/kubelet/pods/40f9ea1a-358f-4451-a1dd-b953445e89db/volumes" May 27 03:24:12.352995 kubelet[3089]: E0527 03:24:12.352888 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:24:13.265372 systemd-networkd[1584]: vxlan.calico: Link UP May 27 03:24:13.265381 systemd-networkd[1584]: vxlan.calico: Gained carrier May 27 03:24:14.317942 kubelet[3089]: I0527 03:24:14.317925 3089 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:24:14.328556 systemd-networkd[1584]: 
vxlan.calico: Gained IPv6LL May 27 03:24:14.371840 containerd[1708]: time="2025-05-27T03:24:14.371814235Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\" id:\"4a024efc261560a4e8c4d8c198daa04a1024a875ce1a8e08e6740e54a5cc5db9\" pid:4497 exit_status:1 exited_at:{seconds:1748316254 nanos:371612387}" May 27 03:24:14.420966 containerd[1708]: time="2025-05-27T03:24:14.420932808Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\" id:\"b1c28e13537fdc6962a82af7d00f3a6ed4157b35c8e3ce6ed339f42d64eeec30\" pid:4520 exit_status:1 exited_at:{seconds:1748316254 nanos:420755272}" May 27 03:24:15.245124 containerd[1708]: time="2025-05-27T03:24:15.244821490Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bd958cdb-wx7mp,Uid:8dd14faa-5472-4e5c-8d25-d39e38963942,Namespace:calico-apiserver,Attempt:0,}" May 27 03:24:15.322274 systemd-networkd[1584]: calie3125674a9e: Link UP May 27 03:24:15.322410 systemd-networkd[1584]: calie3125674a9e: Gained carrier May 27 03:24:15.334971 containerd[1708]: 2025-05-27 03:24:15.274 [INFO][4534] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0 calico-apiserver-75bd958cdb- calico-apiserver 8dd14faa-5472-4e5c-8d25-d39e38963942 786 0 2025-05-27 03:23:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75bd958cdb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-a-c2c0d8ddb2 calico-apiserver-75bd958cdb-wx7mp eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calie3125674a9e [] [] }} 
ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-wx7mp" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-" May 27 03:24:15.334971 containerd[1708]: 2025-05-27 03:24:15.274 [INFO][4534] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-wx7mp" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" May 27 03:24:15.334971 containerd[1708]: 2025-05-27 03:24:15.296 [INFO][4546] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" HandleID="k8s-pod-network.49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" May 27 03:24:15.335115 containerd[1708]: 2025-05-27 03:24:15.296 [INFO][4546] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" HandleID="k8s-pod-network.49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002332e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-a-c2c0d8ddb2", "pod":"calico-apiserver-75bd958cdb-wx7mp", "timestamp":"2025-05-27 03:24:15.296582813 +0000 UTC"}, Hostname:"ci-4344.0.0-a-c2c0d8ddb2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:24:15.335115 containerd[1708]: 2025-05-27 03:24:15.296 [INFO][4546] ipam/ipam_plugin.go 
353: About to acquire host-wide IPAM lock. May 27 03:24:15.335115 containerd[1708]: 2025-05-27 03:24:15.296 [INFO][4546] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. May 27 03:24:15.335115 containerd[1708]: 2025-05-27 03:24:15.296 [INFO][4546] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-c2c0d8ddb2' May 27 03:24:15.335115 containerd[1708]: 2025-05-27 03:24:15.300 [INFO][4546] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:15.335115 containerd[1708]: 2025-05-27 03:24:15.304 [INFO][4546] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:15.335115 containerd[1708]: 2025-05-27 03:24:15.306 [INFO][4546] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:15.335115 containerd[1708]: 2025-05-27 03:24:15.307 [INFO][4546] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:15.335115 containerd[1708]: 2025-05-27 03:24:15.309 [INFO][4546] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:15.335299 containerd[1708]: 2025-05-27 03:24:15.309 [INFO][4546] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:15.335299 containerd[1708]: 2025-05-27 03:24:15.310 [INFO][4546] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569 May 27 03:24:15.335299 containerd[1708]: 2025-05-27 03:24:15.315 [INFO][4546] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 
handle="k8s-pod-network.49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:15.335299 containerd[1708]: 2025-05-27 03:24:15.319 [INFO][4546] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.2/26] block=192.168.55.0/26 handle="k8s-pod-network.49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:15.335299 containerd[1708]: 2025-05-27 03:24:15.319 [INFO][4546] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.2/26] handle="k8s-pod-network.49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:15.335299 containerd[1708]: 2025-05-27 03:24:15.319 [INFO][4546] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:24:15.335299 containerd[1708]: 2025-05-27 03:24:15.319 [INFO][4546] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.2/26] IPv6=[] ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" HandleID="k8s-pod-network.49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" May 27 03:24:15.335426 containerd[1708]: 2025-05-27 03:24:15.320 [INFO][4534] cni-plugin/k8s.go 418: Populated endpoint ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-wx7mp" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0", GenerateName:"calico-apiserver-75bd958cdb-", Namespace:"calico-apiserver", SelfLink:"", UID:"8dd14faa-5472-4e5c-8d25-d39e38963942", ResourceVersion:"786", Generation:0, 
CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75bd958cdb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"", Pod:"calico-apiserver-75bd958cdb-wx7mp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie3125674a9e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:15.335500 containerd[1708]: 2025-05-27 03:24:15.320 [INFO][4534] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.2/32] ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-wx7mp" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" May 27 03:24:15.335500 containerd[1708]: 2025-05-27 03:24:15.320 [INFO][4534] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie3125674a9e ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-wx7mp" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" May 27 03:24:15.335500 containerd[1708]: 2025-05-27 03:24:15.322 [INFO][4534] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-wx7mp" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" May 27 03:24:15.335560 containerd[1708]: 2025-05-27 03:24:15.323 [INFO][4534] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-wx7mp" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0", GenerateName:"calico-apiserver-75bd958cdb-", Namespace:"calico-apiserver", SelfLink:"", UID:"8dd14faa-5472-4e5c-8d25-d39e38963942", ResourceVersion:"786", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75bd958cdb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569", Pod:"calico-apiserver-75bd958cdb-wx7mp", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.2/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calie3125674a9e", MAC:"fa:be:a4:26:d8:8b", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:15.335608 containerd[1708]: 2025-05-27 03:24:15.331 [INFO][4534] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-wx7mp" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--wx7mp-eth0" May 27 03:24:15.371733 containerd[1708]: time="2025-05-27T03:24:15.371682170Z" level=info msg="connecting to shim 49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569" address="unix:///run/containerd/s/2096db5f9fc6f4f05a9d218aafe6a1520ce3d35c1b167fd1795d7dac5ea1a0ba" namespace=k8s.io protocol=ttrpc version=3 May 27 03:24:15.392553 systemd[1]: Started cri-containerd-49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569.scope - libcontainer container 49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569. 
May 27 03:24:15.423408 containerd[1708]: time="2025-05-27T03:24:15.423381153Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bd958cdb-wx7mp,Uid:8dd14faa-5472-4e5c-8d25-d39e38963942,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569\"" May 27 03:24:15.424550 containerd[1708]: time="2025-05-27T03:24:15.424529872Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\"" May 27 03:24:16.245450 containerd[1708]: time="2025-05-27T03:24:16.245297027Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c69cc6f75-wn88n,Uid:275414b6-21b5-4cc2-8a95-b078295361dc,Namespace:calico-system,Attempt:0,}" May 27 03:24:16.327642 systemd-networkd[1584]: cali8a30fba1b0f: Link UP May 27 03:24:16.328623 systemd-networkd[1584]: cali8a30fba1b0f: Gained carrier May 27 03:24:16.339627 containerd[1708]: 2025-05-27 03:24:16.273 [INFO][4606] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0 calico-kube-controllers-7c69cc6f75- calico-system 275414b6-21b5-4cc2-8a95-b078295361dc 779 0 2025-05-27 03:23:54 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7c69cc6f75 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4344.0.0-a-c2c0d8ddb2 calico-kube-controllers-7c69cc6f75-wn88n eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8a30fba1b0f [] [] }} ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Namespace="calico-system" Pod="calico-kube-controllers-7c69cc6f75-wn88n" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-" May 27 
03:24:16.339627 containerd[1708]: 2025-05-27 03:24:16.273 [INFO][4606] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Namespace="calico-system" Pod="calico-kube-controllers-7c69cc6f75-wn88n" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" May 27 03:24:16.339627 containerd[1708]: 2025-05-27 03:24:16.295 [INFO][4617] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" HandleID="k8s-pod-network.e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" May 27 03:24:16.339783 containerd[1708]: 2025-05-27 03:24:16.295 [INFO][4617] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" HandleID="k8s-pod-network.e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000233750), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-c2c0d8ddb2", "pod":"calico-kube-controllers-7c69cc6f75-wn88n", "timestamp":"2025-05-27 03:24:16.295630309 +0000 UTC"}, Hostname:"ci-4344.0.0-a-c2c0d8ddb2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:24:16.339783 containerd[1708]: 2025-05-27 03:24:16.295 [INFO][4617] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:24:16.339783 containerd[1708]: 2025-05-27 03:24:16.295 [INFO][4617] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:24:16.339783 containerd[1708]: 2025-05-27 03:24:16.295 [INFO][4617] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-c2c0d8ddb2' May 27 03:24:16.339783 containerd[1708]: 2025-05-27 03:24:16.301 [INFO][4617] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:16.339783 containerd[1708]: 2025-05-27 03:24:16.304 [INFO][4617] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:16.339783 containerd[1708]: 2025-05-27 03:24:16.309 [INFO][4617] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:16.339783 containerd[1708]: 2025-05-27 03:24:16.310 [INFO][4617] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:16.339783 containerd[1708]: 2025-05-27 03:24:16.312 [INFO][4617] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:16.339985 containerd[1708]: 2025-05-27 03:24:16.312 [INFO][4617] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:16.339985 containerd[1708]: 2025-05-27 03:24:16.313 [INFO][4617] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941 May 27 03:24:16.339985 containerd[1708]: 2025-05-27 03:24:16.319 [INFO][4617] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:16.339985 containerd[1708]: 2025-05-27 03:24:16.324 [INFO][4617] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.55.3/26] block=192.168.55.0/26 handle="k8s-pod-network.e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:16.339985 containerd[1708]: 2025-05-27 03:24:16.324 [INFO][4617] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.3/26] handle="k8s-pod-network.e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:16.339985 containerd[1708]: 2025-05-27 03:24:16.324 [INFO][4617] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:24:16.339985 containerd[1708]: 2025-05-27 03:24:16.324 [INFO][4617] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.3/26] IPv6=[] ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" HandleID="k8s-pod-network.e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" May 27 03:24:16.340115 containerd[1708]: 2025-05-27 03:24:16.325 [INFO][4606] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Namespace="calico-system" Pod="calico-kube-controllers-7c69cc6f75-wn88n" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0", GenerateName:"calico-kube-controllers-7c69cc6f75-", Namespace:"calico-system", SelfLink:"", UID:"275414b6-21b5-4cc2-8a95-b078295361dc", ResourceVersion:"779", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", 
"k8s-app":"calico-kube-controllers", "pod-template-hash":"7c69cc6f75", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"", Pod:"calico-kube-controllers-7c69cc6f75-wn88n", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.55.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8a30fba1b0f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:16.340170 containerd[1708]: 2025-05-27 03:24:16.326 [INFO][4606] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.3/32] ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Namespace="calico-system" Pod="calico-kube-controllers-7c69cc6f75-wn88n" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" May 27 03:24:16.340170 containerd[1708]: 2025-05-27 03:24:16.326 [INFO][4606] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8a30fba1b0f ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Namespace="calico-system" Pod="calico-kube-controllers-7c69cc6f75-wn88n" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" May 27 03:24:16.340170 containerd[1708]: 2025-05-27 03:24:16.327 [INFO][4606] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Namespace="calico-system" 
Pod="calico-kube-controllers-7c69cc6f75-wn88n" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" May 27 03:24:16.340233 containerd[1708]: 2025-05-27 03:24:16.327 [INFO][4606] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Namespace="calico-system" Pod="calico-kube-controllers-7c69cc6f75-wn88n" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0", GenerateName:"calico-kube-controllers-7c69cc6f75-", Namespace:"calico-system", SelfLink:"", UID:"275414b6-21b5-4cc2-8a95-b078295361dc", ResourceVersion:"779", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7c69cc6f75", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941", Pod:"calico-kube-controllers-7c69cc6f75-wn88n", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.55.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8a30fba1b0f", MAC:"b2:58:45:ac:85:df", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:16.340283 containerd[1708]: 2025-05-27 03:24:16.338 [INFO][4606] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" Namespace="calico-system" Pod="calico-kube-controllers-7c69cc6f75-wn88n" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--kube--controllers--7c69cc6f75--wn88n-eth0" May 27 03:24:16.384225 containerd[1708]: time="2025-05-27T03:24:16.383709639Z" level=info msg="connecting to shim e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941" address="unix:///run/containerd/s/7ce10f9114f0fa8185194bb83e813bc21487f70e72b0bc7c7c78bec2eacbf833" namespace=k8s.io protocol=ttrpc version=3 May 27 03:24:16.403593 systemd[1]: Started cri-containerd-e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941.scope - libcontainer container e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941. 
May 27 03:24:16.439869 containerd[1708]: time="2025-05-27T03:24:16.439850169Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7c69cc6f75-wn88n,Uid:275414b6-21b5-4cc2-8a95-b078295361dc,Namespace:calico-system,Attempt:0,} returns sandbox id \"e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941\"" May 27 03:24:17.016529 systemd-networkd[1584]: calie3125674a9e: Gained IPv6LL May 27 03:24:17.245275 containerd[1708]: time="2025-05-27T03:24:17.245254237Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9cggb,Uid:27ed5e5b-c569-4ae0-93dc-3358bb46ea0c,Namespace:calico-system,Attempt:0,}" May 27 03:24:17.245381 containerd[1708]: time="2025-05-27T03:24:17.245370140Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gpm87,Uid:6d40ad28-6278-4b42-9b05-243af0281e54,Namespace:kube-system,Attempt:0,}" May 27 03:24:17.384043 systemd-networkd[1584]: calid53f8db1784: Link UP May 27 03:24:17.384693 systemd-networkd[1584]: calid53f8db1784: Gained carrier May 27 03:24:17.401997 containerd[1708]: 2025-05-27 03:24:17.304 [INFO][4680] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0 csi-node-driver- calico-system 27ed5e5b-c569-4ae0-93dc-3358bb46ea0c 680 0 2025-05-27 03:23:53 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:78f6f74485 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4344.0.0-a-c2c0d8ddb2 csi-node-driver-9cggb eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calid53f8db1784 [] [] }} ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Namespace="calico-system" Pod="csi-node-driver-9cggb" 
WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-" May 27 03:24:17.401997 containerd[1708]: 2025-05-27 03:24:17.304 [INFO][4680] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Namespace="calico-system" Pod="csi-node-driver-9cggb" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" May 27 03:24:17.401997 containerd[1708]: 2025-05-27 03:24:17.336 [INFO][4708] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" HandleID="k8s-pod-network.587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" May 27 03:24:17.402272 containerd[1708]: 2025-05-27 03:24:17.336 [INFO][4708] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" HandleID="k8s-pod-network.587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000332440), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-c2c0d8ddb2", "pod":"csi-node-driver-9cggb", "timestamp":"2025-05-27 03:24:17.336783725 +0000 UTC"}, Hostname:"ci-4344.0.0-a-c2c0d8ddb2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:24:17.402272 containerd[1708]: 2025-05-27 03:24:17.336 [INFO][4708] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:24:17.402272 containerd[1708]: 2025-05-27 03:24:17.336 [INFO][4708] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:24:17.402272 containerd[1708]: 2025-05-27 03:24:17.336 [INFO][4708] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-c2c0d8ddb2' May 27 03:24:17.402272 containerd[1708]: 2025-05-27 03:24:17.344 [INFO][4708] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.402272 containerd[1708]: 2025-05-27 03:24:17.349 [INFO][4708] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.402272 containerd[1708]: 2025-05-27 03:24:17.354 [INFO][4708] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.402272 containerd[1708]: 2025-05-27 03:24:17.356 [INFO][4708] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.402272 containerd[1708]: 2025-05-27 03:24:17.358 [INFO][4708] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.402673 containerd[1708]: 2025-05-27 03:24:17.358 [INFO][4708] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.402673 containerd[1708]: 2025-05-27 03:24:17.359 [INFO][4708] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f May 27 03:24:17.402673 containerd[1708]: 2025-05-27 03:24:17.365 [INFO][4708] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.402673 containerd[1708]: 2025-05-27 03:24:17.373 [INFO][4708] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.55.4/26] block=192.168.55.0/26 handle="k8s-pod-network.587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.402673 containerd[1708]: 2025-05-27 03:24:17.373 [INFO][4708] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.4/26] handle="k8s-pod-network.587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.402673 containerd[1708]: 2025-05-27 03:24:17.373 [INFO][4708] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:24:17.402673 containerd[1708]: 2025-05-27 03:24:17.373 [INFO][4708] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.4/26] IPv6=[] ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" HandleID="k8s-pod-network.587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" May 27 03:24:17.402985 containerd[1708]: 2025-05-27 03:24:17.376 [INFO][4680] cni-plugin/k8s.go 418: Populated endpoint ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Namespace="calico-system" Pod="csi-node-driver-9cggb" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"27ed5e5b-c569-4ae0-93dc-3358bb46ea0c", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", 
"pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"", Pod:"csi-node-driver-9cggb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.55.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid53f8db1784", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:17.403104 containerd[1708]: 2025-05-27 03:24:17.377 [INFO][4680] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.4/32] ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Namespace="calico-system" Pod="csi-node-driver-9cggb" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" May 27 03:24:17.403104 containerd[1708]: 2025-05-27 03:24:17.377 [INFO][4680] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calid53f8db1784 ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Namespace="calico-system" Pod="csi-node-driver-9cggb" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" May 27 03:24:17.403104 containerd[1708]: 2025-05-27 03:24:17.385 [INFO][4680] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Namespace="calico-system" Pod="csi-node-driver-9cggb" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" May 27 03:24:17.403243 containerd[1708]: 2025-05-27 03:24:17.385 
[INFO][4680] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Namespace="calico-system" Pod="csi-node-driver-9cggb" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"27ed5e5b-c569-4ae0-93dc-3358bb46ea0c", ResourceVersion:"680", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 53, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"78f6f74485", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f", Pod:"csi-node-driver-9cggb", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.55.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calid53f8db1784", MAC:"36:88:0f:12:29:b6", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:17.403397 containerd[1708]: 2025-05-27 03:24:17.399 [INFO][4680] 
cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" Namespace="calico-system" Pod="csi-node-driver-9cggb" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-csi--node--driver--9cggb-eth0" May 27 03:24:17.441025 containerd[1708]: time="2025-05-27T03:24:17.440897997Z" level=info msg="connecting to shim 587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f" address="unix:///run/containerd/s/8a81c3fb0684d0156c82bf49f82ea589cc7beee281a9a63c6a42d5bcee07cc03" namespace=k8s.io protocol=ttrpc version=3 May 27 03:24:17.464564 systemd-networkd[1584]: cali8a30fba1b0f: Gained IPv6LL May 27 03:24:17.477640 systemd[1]: Started cri-containerd-587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f.scope - libcontainer container 587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f. May 27 03:24:17.488584 systemd-networkd[1584]: cali77ecb854624: Link UP May 27 03:24:17.489543 systemd-networkd[1584]: cali77ecb854624: Gained carrier May 27 03:24:17.502642 containerd[1708]: 2025-05-27 03:24:17.306 [INFO][4685] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0 coredns-668d6bf9bc- kube-system 6d40ad28-6278-4b42-9b05-243af0281e54 785 0 2025-05-27 03:23:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344.0.0-a-c2c0d8ddb2 coredns-668d6bf9bc-gpm87 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali77ecb854624 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpm87" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-" May 27 
03:24:17.502642 containerd[1708]: 2025-05-27 03:24:17.306 [INFO][4685] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpm87" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" May 27 03:24:17.502642 containerd[1708]: 2025-05-27 03:24:17.350 [INFO][4710] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" HandleID="k8s-pod-network.a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" May 27 03:24:17.502780 containerd[1708]: 2025-05-27 03:24:17.351 [INFO][4710] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" HandleID="k8s-pod-network.a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9700), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-a-c2c0d8ddb2", "pod":"coredns-668d6bf9bc-gpm87", "timestamp":"2025-05-27 03:24:17.34983859 +0000 UTC"}, Hostname:"ci-4344.0.0-a-c2c0d8ddb2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:24:17.502780 containerd[1708]: 2025-05-27 03:24:17.351 [INFO][4710] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:24:17.502780 containerd[1708]: 2025-05-27 03:24:17.373 [INFO][4710] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:24:17.502780 containerd[1708]: 2025-05-27 03:24:17.373 [INFO][4710] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-c2c0d8ddb2' May 27 03:24:17.502780 containerd[1708]: 2025-05-27 03:24:17.445 [INFO][4710] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.502780 containerd[1708]: 2025-05-27 03:24:17.454 [INFO][4710] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.502780 containerd[1708]: 2025-05-27 03:24:17.459 [INFO][4710] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.502780 containerd[1708]: 2025-05-27 03:24:17.461 [INFO][4710] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.502780 containerd[1708]: 2025-05-27 03:24:17.464 [INFO][4710] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.502971 containerd[1708]: 2025-05-27 03:24:17.464 [INFO][4710] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.502971 containerd[1708]: 2025-05-27 03:24:17.465 [INFO][4710] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac May 27 03:24:17.502971 containerd[1708]: 2025-05-27 03:24:17.470 [INFO][4710] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.502971 containerd[1708]: 2025-05-27 03:24:17.480 [INFO][4710] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.55.5/26] block=192.168.55.0/26 handle="k8s-pod-network.a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.502971 containerd[1708]: 2025-05-27 03:24:17.480 [INFO][4710] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.5/26] handle="k8s-pod-network.a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:17.502971 containerd[1708]: 2025-05-27 03:24:17.480 [INFO][4710] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:24:17.502971 containerd[1708]: 2025-05-27 03:24:17.480 [INFO][4710] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.5/26] IPv6=[] ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" HandleID="k8s-pod-network.a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" May 27 03:24:17.503117 containerd[1708]: 2025-05-27 03:24:17.485 [INFO][4685] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpm87" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"6d40ad28-6278-4b42-9b05-243af0281e54", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", 
"projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"", Pod:"coredns-668d6bf9bc-gpm87", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali77ecb854624", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:17.503117 containerd[1708]: 2025-05-27 03:24:17.486 [INFO][4685] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.5/32] ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpm87" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" May 27 03:24:17.503117 containerd[1708]: 2025-05-27 03:24:17.486 [INFO][4685] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali77ecb854624 ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpm87" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" May 27 03:24:17.503117 containerd[1708]: 2025-05-27 03:24:17.487 [INFO][4685] cni-plugin/dataplane_linux.go 508: Disabling IPv4 
forwarding ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpm87" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" May 27 03:24:17.503117 containerd[1708]: 2025-05-27 03:24:17.488 [INFO][4685] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpm87" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"6d40ad28-6278-4b42-9b05-243af0281e54", ResourceVersion:"785", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac", Pod:"coredns-668d6bf9bc-gpm87", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali77ecb854624", MAC:"46:36:20:59:26:e1", 
Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:17.503117 containerd[1708]: 2025-05-27 03:24:17.499 [INFO][4685] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" Namespace="kube-system" Pod="coredns-668d6bf9bc-gpm87" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--gpm87-eth0" May 27 03:24:17.530514 containerd[1708]: time="2025-05-27T03:24:17.530494632Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-9cggb,Uid:27ed5e5b-c569-4ae0-93dc-3358bb46ea0c,Namespace:calico-system,Attempt:0,} returns sandbox id \"587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f\"" May 27 03:24:17.545147 containerd[1708]: time="2025-05-27T03:24:17.545120600Z" level=info msg="connecting to shim a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac" address="unix:///run/containerd/s/54ce79428bcfbfc1cd485b2bf4a8760ee164cd8e283d07e8a66858a43abbcef6" namespace=k8s.io protocol=ttrpc version=3 May 27 03:24:17.573621 systemd[1]: Started cri-containerd-a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac.scope - libcontainer container a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac. 
May 27 03:24:17.621910 containerd[1708]: time="2025-05-27T03:24:17.621877158Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-gpm87,Uid:6d40ad28-6278-4b42-9b05-243af0281e54,Namespace:kube-system,Attempt:0,} returns sandbox id \"a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac\"" May 27 03:24:17.625505 containerd[1708]: time="2025-05-27T03:24:17.625488036Z" level=info msg="CreateContainer within sandbox \"a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" May 27 03:24:17.641289 containerd[1708]: time="2025-05-27T03:24:17.641237510Z" level=info msg="Container 7bc500bb11240391f0bb187089cf8d56af87b4fac34a691da567657a5922e611: CDI devices from CRI Config.CDIDevices: []" May 27 03:24:17.652726 containerd[1708]: time="2025-05-27T03:24:17.652705284Z" level=info msg="CreateContainer within sandbox \"a190f74ad1790cd63a42e85317558bcb068250642e3f4d2cf32c778fd5e959ac\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"7bc500bb11240391f0bb187089cf8d56af87b4fac34a691da567657a5922e611\"" May 27 03:24:17.653258 containerd[1708]: time="2025-05-27T03:24:17.653225875Z" level=info msg="StartContainer for \"7bc500bb11240391f0bb187089cf8d56af87b4fac34a691da567657a5922e611\"" May 27 03:24:17.654296 containerd[1708]: time="2025-05-27T03:24:17.654213598Z" level=info msg="connecting to shim 7bc500bb11240391f0bb187089cf8d56af87b4fac34a691da567657a5922e611" address="unix:///run/containerd/s/54ce79428bcfbfc1cd485b2bf4a8760ee164cd8e283d07e8a66858a43abbcef6" protocol=ttrpc version=3 May 27 03:24:17.668623 systemd[1]: Started cri-containerd-7bc500bb11240391f0bb187089cf8d56af87b4fac34a691da567657a5922e611.scope - libcontainer container 7bc500bb11240391f0bb187089cf8d56af87b4fac34a691da567657a5922e611. 
May 27 03:24:17.690021 containerd[1708]: time="2025-05-27T03:24:17.689996679Z" level=info msg="StartContainer for \"7bc500bb11240391f0bb187089cf8d56af87b4fac34a691da567657a5922e611\" returns successfully" May 27 03:24:17.699814 containerd[1708]: time="2025-05-27T03:24:17.699743920Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:24:17.703602 containerd[1708]: time="2025-05-27T03:24:17.703471326Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.0: active requests=0, bytes read=47252431" May 27 03:24:17.707602 containerd[1708]: time="2025-05-27T03:24:17.707578888Z" level=info msg="ImageCreate event name:\"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:24:17.711053 containerd[1708]: time="2025-05-27T03:24:17.711030858Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" May 27 03:24:17.712533 containerd[1708]: time="2025-05-27T03:24:17.712504759Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" with image id \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\", repo tag \"ghcr.io/flatcar/calico/apiserver:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/apiserver@sha256:ad7d2e76f15777636c5d91c108d7655659b38fe8970255050ffa51223eb96ff4\", size \"48745150\" in 2.287857509s" May 27 03:24:17.712598 containerd[1708]: time="2025-05-27T03:24:17.712535915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.0\" returns image reference \"sha256:5fa544b30bbe7e24458b21b80890f8834eebe8bcb99071f6caded1a39fc59082\"" May 27 03:24:17.713865 containerd[1708]: time="2025-05-27T03:24:17.713671931Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\"" May 27 03:24:17.715809 containerd[1708]: time="2025-05-27T03:24:17.715790013Z" level=info msg="CreateContainer within sandbox \"49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 03:24:17.728600 containerd[1708]: time="2025-05-27T03:24:17.728575456Z" level=info msg="Container aecf4c099698c6aeab7906edf3b859552318440f21fd2b0c878393cc2538c22d: CDI devices from CRI Config.CDIDevices: []" May 27 03:24:17.740786 containerd[1708]: time="2025-05-27T03:24:17.740733608Z" level=info msg="CreateContainer within sandbox \"49161dba84dee57e062b7ecf0708202c404db1dd2a475aa6cfd0f0a5584e8569\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"aecf4c099698c6aeab7906edf3b859552318440f21fd2b0c878393cc2538c22d\"" May 27 03:24:17.741098 containerd[1708]: time="2025-05-27T03:24:17.741084031Z" level=info msg="StartContainer for \"aecf4c099698c6aeab7906edf3b859552318440f21fd2b0c878393cc2538c22d\"" May 27 03:24:17.742042 containerd[1708]: time="2025-05-27T03:24:17.741959302Z" level=info msg="connecting to shim aecf4c099698c6aeab7906edf3b859552318440f21fd2b0c878393cc2538c22d" address="unix:///run/containerd/s/2096db5f9fc6f4f05a9d218aafe6a1520ce3d35c1b167fd1795d7dac5ea1a0ba" protocol=ttrpc version=3 May 27 03:24:17.755540 systemd[1]: Started cri-containerd-aecf4c099698c6aeab7906edf3b859552318440f21fd2b0c878393cc2538c22d.scope - libcontainer container aecf4c099698c6aeab7906edf3b859552318440f21fd2b0c878393cc2538c22d. 
May 27 03:24:17.790107 containerd[1708]: time="2025-05-27T03:24:17.790087844Z" level=info msg="StartContainer for \"aecf4c099698c6aeab7906edf3b859552318440f21fd2b0c878393cc2538c22d\" returns successfully" May 27 03:24:18.244951 containerd[1708]: time="2025-05-27T03:24:18.244740012Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bd958cdb-ldwfw,Uid:2669f22e-e758-461f-86ef-0334f39a45eb,Namespace:calico-apiserver,Attempt:0,}" May 27 03:24:18.245151 containerd[1708]: time="2025-05-27T03:24:18.244821814Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-f7587,Uid:8180663d-0a14-4da1-a83a-e7dbd72a3606,Namespace:calico-system,Attempt:0,}" May 27 03:24:18.245401 containerd[1708]: time="2025-05-27T03:24:18.245317969Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-klgls,Uid:39aabb06-801e-4e8a-8388-57aa96ec69b4,Namespace:kube-system,Attempt:0,}" May 27 03:24:18.386727 kubelet[3089]: I0527 03:24:18.386684 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-gpm87" podStartSLOduration=36.386658987 podStartE2EDuration="36.386658987s" podCreationTimestamp="2025-05-27 03:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:24:18.385922928 +0000 UTC m=+42.218800364" watchObservedRunningTime="2025-05-27 03:24:18.386658987 +0000 UTC m=+42.219536470" May 27 03:24:18.470983 systemd-networkd[1584]: cali5231dc50698: Link UP May 27 03:24:18.472976 systemd-networkd[1584]: cali5231dc50698: Gained carrier May 27 03:24:18.490063 kubelet[3089]: I0527 03:24:18.490014 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-75bd958cdb-wx7mp" podStartSLOduration=26.200823911 podStartE2EDuration="28.490000013s" podCreationTimestamp="2025-05-27 03:23:50 +0000 UTC" firstStartedPulling="2025-05-27 
03:24:15.424250301 +0000 UTC m=+39.257127737" lastFinishedPulling="2025-05-27 03:24:17.713426403 +0000 UTC m=+41.546303839" observedRunningTime="2025-05-27 03:24:18.429259925 +0000 UTC m=+42.262137362" watchObservedRunningTime="2025-05-27 03:24:18.490000013 +0000 UTC m=+42.322877448" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.334 [INFO][4902] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0 calico-apiserver-75bd958cdb- calico-apiserver 2669f22e-e758-461f-86ef-0334f39a45eb 787 0 2025-05-27 03:23:50 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:75bd958cdb projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4344.0.0-a-c2c0d8ddb2 calico-apiserver-75bd958cdb-ldwfw eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali5231dc50698 [] [] }} ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-ldwfw" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.334 [INFO][4902] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-ldwfw" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.374 [INFO][4949] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" 
HandleID="k8s-pod-network.ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.375 [INFO][4949] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" HandleID="k8s-pod-network.ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000233020), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4344.0.0-a-c2c0d8ddb2", "pod":"calico-apiserver-75bd958cdb-ldwfw", "timestamp":"2025-05-27 03:24:18.374953971 +0000 UTC"}, Hostname:"ci-4344.0.0-a-c2c0d8ddb2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.375 [INFO][4949] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.375 [INFO][4949] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.375 [INFO][4949] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-c2c0d8ddb2' May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.391 [INFO][4949] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.412 [INFO][4949] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.431 [INFO][4949] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.436 [INFO][4949] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.442 [INFO][4949] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.442 [INFO][4949] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.445 [INFO][4949] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9 May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.453 [INFO][4949] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.466 [INFO][4949] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.55.6/26] block=192.168.55.0/26 handle="k8s-pod-network.ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.466 [INFO][4949] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.6/26] handle="k8s-pod-network.ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.466 [INFO][4949] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:24:18.494503 containerd[1708]: 2025-05-27 03:24:18.466 [INFO][4949] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.6/26] IPv6=[] ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" HandleID="k8s-pod-network.ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" May 27 03:24:18.495512 containerd[1708]: 2025-05-27 03:24:18.467 [INFO][4902] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-ldwfw" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0", GenerateName:"calico-apiserver-75bd958cdb-", Namespace:"calico-apiserver", SelfLink:"", UID:"2669f22e-e758-461f-86ef-0334f39a45eb", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"75bd958cdb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"", Pod:"calico-apiserver-75bd958cdb-ldwfw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5231dc50698", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:18.495512 containerd[1708]: 2025-05-27 03:24:18.467 [INFO][4902] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.6/32] ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-ldwfw" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" May 27 03:24:18.495512 containerd[1708]: 2025-05-27 03:24:18.468 [INFO][4902] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5231dc50698 ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-ldwfw" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" May 27 03:24:18.495512 containerd[1708]: 2025-05-27 03:24:18.475 [INFO][4902] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-ldwfw" 
WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" May 27 03:24:18.495512 containerd[1708]: 2025-05-27 03:24:18.475 [INFO][4902] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-ldwfw" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0", GenerateName:"calico-apiserver-75bd958cdb-", Namespace:"calico-apiserver", SelfLink:"", UID:"2669f22e-e758-461f-86ef-0334f39a45eb", ResourceVersion:"787", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 50, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"75bd958cdb", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9", Pod:"calico-apiserver-75bd958cdb-ldwfw", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.55.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali5231dc50698", MAC:"8e:91:e5:a0:96:88", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:18.495512 containerd[1708]: 2025-05-27 03:24:18.489 [INFO][4902] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" Namespace="calico-apiserver" Pod="calico-apiserver-75bd958cdb-ldwfw" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-calico--apiserver--75bd958cdb--ldwfw-eth0" May 27 03:24:18.539572 containerd[1708]: time="2025-05-27T03:24:18.539530704Z" level=info msg="connecting to shim ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9" address="unix:///run/containerd/s/a14692f160b67609ffa4364fbf67ab7872e7b9824de68f79967f878d6974b5e0" namespace=k8s.io protocol=ttrpc version=3 May 27 03:24:18.560754 systemd[1]: Started cri-containerd-ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9.scope - libcontainer container ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9. 
May 27 03:24:18.569008 systemd-networkd[1584]: calia479aca4ce6: Link UP May 27 03:24:18.569158 systemd-networkd[1584]: calia479aca4ce6: Gained carrier May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.328 [INFO][4921] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0 goldmane-78d55f7ddc- calico-system 8180663d-0a14-4da1-a83a-e7dbd72a3606 788 0 2025-05-27 03:23:52 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:78d55f7ddc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4344.0.0-a-c2c0d8ddb2 goldmane-78d55f7ddc-f7587 eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia479aca4ce6 [] [] }} ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-f7587" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.329 [INFO][4921] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-f7587" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.401 [INFO][4939] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" HandleID="k8s-pod-network.debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.401 [INFO][4939] ipam/ipam_plugin.go 265: Auto assigning IP 
ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" HandleID="k8s-pod-network.debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc000233e40), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4344.0.0-a-c2c0d8ddb2", "pod":"goldmane-78d55f7ddc-f7587", "timestamp":"2025-05-27 03:24:18.401597003 +0000 UTC"}, Hostname:"ci-4344.0.0-a-c2c0d8ddb2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.403 [INFO][4939] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock. May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.466 [INFO][4939] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock. 
May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.466 [INFO][4939] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-c2c0d8ddb2' May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.488 [INFO][4939] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.512 [INFO][4939] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.535 [INFO][4939] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.539 [INFO][4939] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.542 [INFO][4939] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.542 [INFO][4939] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.547 [INFO][4939] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4 May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.555 [INFO][4939] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.563 [INFO][4939] ipam/ipam.go 1256: Successfully 
claimed IPs: [192.168.55.7/26] block=192.168.55.0/26 handle="k8s-pod-network.debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.563 [INFO][4939] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.7/26] handle="k8s-pod-network.debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" host="ci-4344.0.0-a-c2c0d8ddb2" May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.563 [INFO][4939] ipam/ipam_plugin.go 374: Released host-wide IPAM lock. May 27 03:24:18.590614 containerd[1708]: 2025-05-27 03:24:18.563 [INFO][4939] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.7/26] IPv6=[] ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" HandleID="k8s-pod-network.debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" May 27 03:24:18.591055 containerd[1708]: 2025-05-27 03:24:18.566 [INFO][4921] cni-plugin/k8s.go 418: Populated endpoint ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-f7587" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"8180663d-0a14-4da1-a83a-e7dbd72a3606", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"", Pod:"goldmane-78d55f7ddc-f7587", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.55.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia479aca4ce6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:18.591055 containerd[1708]: 2025-05-27 03:24:18.566 [INFO][4921] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.7/32] ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-f7587" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" May 27 03:24:18.591055 containerd[1708]: 2025-05-27 03:24:18.566 [INFO][4921] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia479aca4ce6 ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-f7587" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" May 27 03:24:18.591055 containerd[1708]: 2025-05-27 03:24:18.568 [INFO][4921] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-f7587" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" May 27 03:24:18.591055 containerd[1708]: 2025-05-27 03:24:18.569 [INFO][4921] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-f7587" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0", GenerateName:"goldmane-78d55f7ddc-", Namespace:"calico-system", SelfLink:"", UID:"8180663d-0a14-4da1-a83a-e7dbd72a3606", ResourceVersion:"788", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"78d55f7ddc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4", Pod:"goldmane-78d55f7ddc-f7587", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.55.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia479aca4ce6", MAC:"fa:ac:24:f9:53:d1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} May 27 03:24:18.591055 containerd[1708]: 2025-05-27 03:24:18.588 [INFO][4921] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" Namespace="calico-system" Pod="goldmane-78d55f7ddc-f7587" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-goldmane--78d55f7ddc--f7587-eth0" May 27 03:24:18.635590 containerd[1708]: time="2025-05-27T03:24:18.635523722Z" level=info msg="connecting to shim debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4" address="unix:///run/containerd/s/a18577a9f76112ff611cfb3825d98df115d9a0f4d38f699f454294324c425fc7" namespace=k8s.io protocol=ttrpc version=3 May 27 03:24:18.659140 containerd[1708]: time="2025-05-27T03:24:18.658926420Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-75bd958cdb-ldwfw,Uid:2669f22e-e758-461f-86ef-0334f39a45eb,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9\"" May 27 03:24:18.662087 containerd[1708]: time="2025-05-27T03:24:18.662069889Z" level=info msg="CreateContainer within sandbox \"ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9\" for container &ContainerMetadata{Name:calico-apiserver,Attempt:0,}" May 27 03:24:18.673724 systemd[1]: Started cri-containerd-debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4.scope - libcontainer container debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4. 
May 27 03:24:18.692650 systemd-networkd[1584]: cali457d76f095d: Link UP
May 27 03:24:18.694355 systemd-networkd[1584]: cali457d76f095d: Gained carrier
May 27 03:24:18.704099 containerd[1708]: time="2025-05-27T03:24:18.704061964Z" level=info msg="Container 7a7fdef3a57a1b1501241d326bf9ab0c7bd9117d944a0afce98773eb49cba193: CDI devices from CRI Config.CDIDevices: []"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.329 [INFO][4907] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0 coredns-668d6bf9bc- kube-system 39aabb06-801e-4e8a-8388-57aa96ec69b4 783 0 2025-05-27 03:23:42 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4344.0.0-a-c2c0d8ddb2 coredns-668d6bf9bc-klgls eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali457d76f095d [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Namespace="kube-system" Pod="coredns-668d6bf9bc-klgls" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.329 [INFO][4907] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Namespace="kube-system" Pod="coredns-668d6bf9bc-klgls" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.403 [INFO][4941] ipam/ipam_plugin.go 225: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" HandleID="k8s-pod-network.363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.406 [INFO][4941] ipam/ipam_plugin.go 265: Auto assigning IP ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" HandleID="k8s-pod-network.363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0xc0002d9800), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4344.0.0-a-c2c0d8ddb2", "pod":"coredns-668d6bf9bc-klgls", "timestamp":"2025-05-27 03:24:18.40301573 +0000 UTC"}, Hostname:"ci-4344.0.0-a-c2c0d8ddb2", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"}
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.406 [INFO][4941] ipam/ipam_plugin.go 353: About to acquire host-wide IPAM lock.
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.564 [INFO][4941] ipam/ipam_plugin.go 368: Acquired host-wide IPAM lock.
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.565 [INFO][4941] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4344.0.0-a-c2c0d8ddb2'
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.589 [INFO][4941] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" host="ci-4344.0.0-a-c2c0d8ddb2"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.612 [INFO][4941] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4344.0.0-a-c2c0d8ddb2"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.633 [INFO][4941] ipam/ipam.go 511: Trying affinity for 192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.636 [INFO][4941] ipam/ipam.go 158: Attempting to load block cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.639 [INFO][4941] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.55.0/26 host="ci-4344.0.0-a-c2c0d8ddb2"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.639 [INFO][4941] ipam/ipam.go 1220: Attempting to assign 1 addresses from block block=192.168.55.0/26 handle="k8s-pod-network.363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" host="ci-4344.0.0-a-c2c0d8ddb2"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.642 [INFO][4941] ipam/ipam.go 1764: Creating new handle: k8s-pod-network.363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.652 [INFO][4941] ipam/ipam.go 1243: Writing block in order to claim IPs block=192.168.55.0/26 handle="k8s-pod-network.363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" host="ci-4344.0.0-a-c2c0d8ddb2"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.670 [INFO][4941] ipam/ipam.go 1256: Successfully claimed IPs: [192.168.55.8/26] block=192.168.55.0/26 handle="k8s-pod-network.363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" host="ci-4344.0.0-a-c2c0d8ddb2"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.671 [INFO][4941] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.55.8/26] handle="k8s-pod-network.363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" host="ci-4344.0.0-a-c2c0d8ddb2"
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.672 [INFO][4941] ipam/ipam_plugin.go 374: Released host-wide IPAM lock.
May 27 03:24:18.715740 containerd[1708]: 2025-05-27 03:24:18.672 [INFO][4941] ipam/ipam_plugin.go 283: Calico CNI IPAM assigned addresses IPv4=[192.168.55.8/26] IPv6=[] ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" HandleID="k8s-pod-network.363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Workload="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0"
May 27 03:24:18.716713 containerd[1708]: 2025-05-27 03:24:18.681 [INFO][4907] cni-plugin/k8s.go 418: Populated endpoint ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Namespace="kube-system" Pod="coredns-668d6bf9bc-klgls" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"39aabb06-801e-4e8a-8388-57aa96ec69b4", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"", Pod:"coredns-668d6bf9bc-klgls", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali457d76f095d", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
May 27 03:24:18.716713 containerd[1708]: 2025-05-27 03:24:18.681 [INFO][4907] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.55.8/32] ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Namespace="kube-system" Pod="coredns-668d6bf9bc-klgls" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0"
May 27 03:24:18.716713 containerd[1708]: 2025-05-27 03:24:18.681 [INFO][4907] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali457d76f095d ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Namespace="kube-system" Pod="coredns-668d6bf9bc-klgls" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0"
May 27 03:24:18.716713 containerd[1708]: 2025-05-27 03:24:18.697 [INFO][4907] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Namespace="kube-system" Pod="coredns-668d6bf9bc-klgls" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0"
May 27 03:24:18.716713 containerd[1708]: 2025-05-27 03:24:18.698 [INFO][4907] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Namespace="kube-system" Pod="coredns-668d6bf9bc-klgls" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"39aabb06-801e-4e8a-8388-57aa96ec69b4", ResourceVersion:"783", Generation:0, CreationTimestamp:time.Date(2025, time.May, 27, 3, 23, 42, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4344.0.0-a-c2c0d8ddb2", ContainerID:"363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c", Pod:"coredns-668d6bf9bc-klgls", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.55.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali457d76f095d", MAC:"5a:6f:7a:af:ad:ee", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}}
May 27 03:24:18.716713 containerd[1708]: 2025-05-27 03:24:18.712 [INFO][4907] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" Namespace="kube-system" Pod="coredns-668d6bf9bc-klgls" WorkloadEndpoint="ci--4344.0.0--a--c2c0d8ddb2-k8s-coredns--668d6bf9bc--klgls-eth0"
May 27 03:24:18.724491 containerd[1708]: time="2025-05-27T03:24:18.724467567Z" level=info msg="CreateContainer within sandbox \"ff78fd0dc9d24eda53edbb07de1b414717a284a055af8a8e8b6641b79ee740e9\" for &ContainerMetadata{Name:calico-apiserver,Attempt:0,} returns container id \"7a7fdef3a57a1b1501241d326bf9ab0c7bd9117d944a0afce98773eb49cba193\""
May 27 03:24:18.726386 containerd[1708]: time="2025-05-27T03:24:18.726300944Z" level=info msg="StartContainer for \"7a7fdef3a57a1b1501241d326bf9ab0c7bd9117d944a0afce98773eb49cba193\""
May 27 03:24:18.731837 containerd[1708]: time="2025-05-27T03:24:18.731812533Z" level=info msg="connecting to shim 7a7fdef3a57a1b1501241d326bf9ab0c7bd9117d944a0afce98773eb49cba193" address="unix:///run/containerd/s/a14692f160b67609ffa4364fbf67ab7872e7b9824de68f79967f878d6974b5e0" protocol=ttrpc version=3
May 27 03:24:18.745566 systemd[1]: Started cri-containerd-7a7fdef3a57a1b1501241d326bf9ab0c7bd9117d944a0afce98773eb49cba193.scope - libcontainer container 7a7fdef3a57a1b1501241d326bf9ab0c7bd9117d944a0afce98773eb49cba193.
May 27 03:24:18.776982 containerd[1708]: time="2025-05-27T03:24:18.776931032Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-78d55f7ddc-f7587,Uid:8180663d-0a14-4da1-a83a-e7dbd72a3606,Namespace:calico-system,Attempt:0,} returns sandbox id \"debe36bec61929a832ad962363709ba5890f51f94d3e11867c61673b98a50fa4\""
May 27 03:24:18.782962 containerd[1708]: time="2025-05-27T03:24:18.782936447Z" level=info msg="connecting to shim 363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c" address="unix:///run/containerd/s/2d0a2a69aa5701f9a930898730370257e44ab50a529507e81e9993824d0e32a0" namespace=k8s.io protocol=ttrpc version=3
May 27 03:24:18.803609 containerd[1708]: time="2025-05-27T03:24:18.803584718Z" level=info msg="StartContainer for \"7a7fdef3a57a1b1501241d326bf9ab0c7bd9117d944a0afce98773eb49cba193\" returns successfully"
May 27 03:24:18.803724 systemd[1]: Started cri-containerd-363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c.scope - libcontainer container 363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c.
May 27 03:24:18.843524 containerd[1708]: time="2025-05-27T03:24:18.843497820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-klgls,Uid:39aabb06-801e-4e8a-8388-57aa96ec69b4,Namespace:kube-system,Attempt:0,} returns sandbox id \"363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c\""
May 27 03:24:18.845622 containerd[1708]: time="2025-05-27T03:24:18.845597153Z" level=info msg="CreateContainer within sandbox \"363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c\" for container &ContainerMetadata{Name:coredns,Attempt:0,}"
May 27 03:24:18.859921 containerd[1708]: time="2025-05-27T03:24:18.859470379Z" level=info msg="Container 4f076d22a802dacfcfba6323ce577838f2746b6b63a9901882c0b62d2583d4ab: CDI devices from CRI Config.CDIDevices: []"
May 27 03:24:18.871550 containerd[1708]: time="2025-05-27T03:24:18.871530684Z" level=info msg="CreateContainer within sandbox \"363cde9395f00a116fee3cfe855f7b5ccb6fba8d1b26a5f9f0213efb5d6ce76c\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"4f076d22a802dacfcfba6323ce577838f2746b6b63a9901882c0b62d2583d4ab\""
May 27 03:24:18.873132 containerd[1708]: time="2025-05-27T03:24:18.872532816Z" level=info msg="StartContainer for \"4f076d22a802dacfcfba6323ce577838f2746b6b63a9901882c0b62d2583d4ab\""
May 27 03:24:18.873132 containerd[1708]: time="2025-05-27T03:24:18.873089072Z" level=info msg="connecting to shim 4f076d22a802dacfcfba6323ce577838f2746b6b63a9901882c0b62d2583d4ab" address="unix:///run/containerd/s/2d0a2a69aa5701f9a930898730370257e44ab50a529507e81e9993824d0e32a0" protocol=ttrpc version=3
May 27 03:24:18.894632 systemd[1]: Started cri-containerd-4f076d22a802dacfcfba6323ce577838f2746b6b63a9901882c0b62d2583d4ab.scope - libcontainer container 4f076d22a802dacfcfba6323ce577838f2746b6b63a9901882c0b62d2583d4ab.
May 27 03:24:18.922898 containerd[1708]: time="2025-05-27T03:24:18.922875408Z" level=info msg="StartContainer for \"4f076d22a802dacfcfba6323ce577838f2746b6b63a9901882c0b62d2583d4ab\" returns successfully"
May 27 03:24:19.001008 systemd-networkd[1584]: calid53f8db1784: Gained IPv6LL
May 27 03:24:19.387987 kubelet[3089]: I0527 03:24:19.387938 3089 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 27 03:24:19.403191 kubelet[3089]: I0527 03:24:19.402643 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-klgls" podStartSLOduration=37.402628741 podStartE2EDuration="37.402628741s" podCreationTimestamp="2025-05-27 03:23:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:24:19.400716178 +0000 UTC m=+43.233593619" watchObservedRunningTime="2025-05-27 03:24:19.402628741 +0000 UTC m=+43.235506168"
May 27 03:24:19.425011 kubelet[3089]: I0527 03:24:19.424583 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-apiserver/calico-apiserver-75bd958cdb-ldwfw" podStartSLOduration=29.424569853 podStartE2EDuration="29.424569853s" podCreationTimestamp="2025-05-27 03:23:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-05-27 03:24:19.424002802 +0000 UTC m=+43.256880240" watchObservedRunningTime="2025-05-27 03:24:19.424569853 +0000 UTC m=+43.257447281"
May 27 03:24:19.448527 systemd-networkd[1584]: cali77ecb854624: Gained IPv6LL
May 27 03:24:19.640537 systemd-networkd[1584]: cali5231dc50698: Gained IPv6LL
May 27 03:24:19.832556 systemd-networkd[1584]: cali457d76f095d: Gained IPv6LL
May 27 03:24:20.161906 containerd[1708]: time="2025-05-27T03:24:20.161881121Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:24:20.164125 containerd[1708]: time="2025-05-27T03:24:20.164101225Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.0: active requests=0, bytes read=51178512"
May 27 03:24:20.166723 containerd[1708]: time="2025-05-27T03:24:20.166669272Z" level=info msg="ImageCreate event name:\"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:24:20.173958 containerd[1708]: time="2025-05-27T03:24:20.173912029Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:24:20.174334 containerd[1708]: time="2025-05-27T03:24:20.174232073Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" with image id \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\", repo tag \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/kube-controllers@sha256:eb5bc5c9e7a71f1d8ea69bbcc8e54b84fb7ec1e32d919c8b148f80b770f20182\", size \"52671183\" in 2.460532689s"
May 27 03:24:20.174334 containerd[1708]: time="2025-05-27T03:24:20.174257544Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.0\" returns image reference \"sha256:094053209304a3d20e6561c18d37ac2dc4c7fbb68c1579d9864c303edebffa50\""
May 27 03:24:20.175668 containerd[1708]: time="2025-05-27T03:24:20.175642805Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\""
May 27 03:24:20.188632 containerd[1708]: time="2025-05-27T03:24:20.188521295Z" level=info msg="CreateContainer within sandbox \"e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941\" for container &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,}"
May 27 03:24:20.202976 containerd[1708]: time="2025-05-27T03:24:20.202952835Z" level=info msg="Container 7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c: CDI devices from CRI Config.CDIDevices: []"
May 27 03:24:20.225187 containerd[1708]: time="2025-05-27T03:24:20.225141280Z" level=info msg="CreateContainer within sandbox \"e409fc787d3be01082442ed75e481cdf2c8b5b096f292a75330186c19ddba941\" for &ContainerMetadata{Name:calico-kube-controllers,Attempt:0,} returns container id \"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\""
May 27 03:24:20.225915 containerd[1708]: time="2025-05-27T03:24:20.225896492Z" level=info msg="StartContainer for \"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\""
May 27 03:24:20.227241 containerd[1708]: time="2025-05-27T03:24:20.227209711Z" level=info msg="connecting to shim 7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c" address="unix:///run/containerd/s/7ce10f9114f0fa8185194bb83e813bc21487f70e72b0bc7c7c78bec2eacbf833" protocol=ttrpc version=3
May 27 03:24:20.245574 systemd[1]: Started cri-containerd-7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c.scope - libcontainer container 7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c.
May 27 03:24:20.344509 systemd-networkd[1584]: calia479aca4ce6: Gained IPv6LL
May 27 03:24:20.478485 containerd[1708]: time="2025-05-27T03:24:20.478407929Z" level=info msg="StartContainer for \"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\" returns successfully"
May 27 03:24:20.479468 kubelet[3089]: I0527 03:24:20.479357 3089 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 27 03:24:21.499741 kubelet[3089]: I0527 03:24:21.499533 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-kube-controllers-7c69cc6f75-wn88n" podStartSLOduration=23.764916946 podStartE2EDuration="27.499519318s" podCreationTimestamp="2025-05-27 03:23:54 +0000 UTC" firstStartedPulling="2025-05-27 03:24:16.440562187 +0000 UTC m=+40.273439616" lastFinishedPulling="2025-05-27 03:24:20.175164551 +0000 UTC m=+44.008041988" observedRunningTime="2025-05-27 03:24:21.499331298 +0000 UTC m=+45.332208734" watchObservedRunningTime="2025-05-27 03:24:21.499519318 +0000 UTC m=+45.332396755"
May 27 03:24:21.616228 containerd[1708]: time="2025-05-27T03:24:21.616202945Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:24:21.618238 containerd[1708]: time="2025-05-27T03:24:21.618209149Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.0: active requests=0, bytes read=8758390"
May 27 03:24:21.620591 containerd[1708]: time="2025-05-27T03:24:21.620551068Z" level=info msg="ImageCreate event name:\"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:24:21.623313 containerd[1708]: time="2025-05-27T03:24:21.623271079Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:24:21.623745 containerd[1708]: time="2025-05-27T03:24:21.623583081Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/csi:v3.30.0\" with image id \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\", repo tag \"ghcr.io/flatcar/calico/csi:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/csi@sha256:27883a4104876fe239311dd93ce6efd0c4a87de7163d57a4c8d96bd65a287ffd\", size \"10251093\" in 1.447908927s"
May 27 03:24:21.623745 containerd[1708]: time="2025-05-27T03:24:21.623605192Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.0\" returns image reference \"sha256:d5b08093b7928c0ac1122e59edf69b2e58c6d10ecc8b9e5cffeb809a956dc48e\""
May 27 03:24:21.624609 containerd[1708]: time="2025-05-27T03:24:21.624593675Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\""
May 27 03:24:21.625413 containerd[1708]: time="2025-05-27T03:24:21.625387002Z" level=info msg="CreateContainer within sandbox \"587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f\" for container &ContainerMetadata{Name:calico-csi,Attempt:0,}"
May 27 03:24:21.642451 containerd[1708]: time="2025-05-27T03:24:21.641455291Z" level=info msg="Container 9315e105db667573b4acbabecb219a00fc882f3a639a86a5cdc77dbc8aab5277: CDI devices from CRI Config.CDIDevices: []"
May 27 03:24:21.655507 containerd[1708]: time="2025-05-27T03:24:21.655481085Z" level=info msg="CreateContainer within sandbox \"587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f\" for &ContainerMetadata{Name:calico-csi,Attempt:0,} returns container id \"9315e105db667573b4acbabecb219a00fc882f3a639a86a5cdc77dbc8aab5277\""
May 27 03:24:21.656458 containerd[1708]: time="2025-05-27T03:24:21.655783885Z" level=info msg="StartContainer for \"9315e105db667573b4acbabecb219a00fc882f3a639a86a5cdc77dbc8aab5277\""
May 27 03:24:21.657117 containerd[1708]: time="2025-05-27T03:24:21.657092852Z" level=info msg="connecting to shim 9315e105db667573b4acbabecb219a00fc882f3a639a86a5cdc77dbc8aab5277" address="unix:///run/containerd/s/8a81c3fb0684d0156c82bf49f82ea589cc7beee281a9a63c6a42d5bcee07cc03" protocol=ttrpc version=3
May 27 03:24:21.672544 systemd[1]: Started cri-containerd-9315e105db667573b4acbabecb219a00fc882f3a639a86a5cdc77dbc8aab5277.scope - libcontainer container 9315e105db667573b4acbabecb219a00fc882f3a639a86a5cdc77dbc8aab5277.
May 27 03:24:21.697626 containerd[1708]: time="2025-05-27T03:24:21.697608034Z" level=info msg="StartContainer for \"9315e105db667573b4acbabecb219a00fc882f3a639a86a5cdc77dbc8aab5277\" returns successfully"
May 27 03:24:21.824792 containerd[1708]: time="2025-05-27T03:24:21.824738199Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io
May 27 03:24:21.827250 containerd[1708]: time="2025-05-27T03:24:21.827203792Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden"
May 27 03:24:21.827367 containerd[1708]: time="2025-05-27T03:24:21.827213540Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86"
May 27 03:24:21.827398 kubelet[3089]: E0527 03:24:21.827370 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 03:24:21.827461 kubelet[3089]: E0527 03:24:21.827407 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0"
May 27 03:24:21.827764 kubelet[3089]: E0527 03:24:21.827600 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lm5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-f7587_calico-system(8180663d-0a14-4da1-a83a-e7dbd72a3606): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError"
May 27 03:24:21.827878 containerd[1708]: time="2025-05-27T03:24:21.827629686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\""
May 27 03:24:21.829156 kubelet[3089]: E0527 03:24:21.829119 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606"
May 27 03:24:22.485150 kubelet[3089]: E0527 03:24:22.485024 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606"
May 27 03:24:23.349099 containerd[1708]: time="2025-05-27T03:24:23.349075573Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:24:23.351017 containerd[1708]: time="2025-05-27T03:24:23.350986931Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0: active requests=0, bytes read=14705639"
May 27 03:24:23.353226 containerd[1708]: time="2025-05-27T03:24:23.353192224Z" level=info msg="ImageCreate event name:\"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:24:23.355924 containerd[1708]: time="2025-05-27T03:24:23.355888477Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
May 27 03:24:23.356245 containerd[1708]: time="2025-05-27T03:24:23.356171444Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" with image id \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\", repo tag \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\", repo digest \"ghcr.io/flatcar/calico/node-driver-registrar@sha256:dca5c16181edde2e860463615523ce457cd9dcfca85b7cfdcd6f3ea7de6f2ac8\", size \"16198294\" in 1.52852091s"
May 27 03:24:23.356245 containerd[1708]: time="2025-05-27T03:24:23.356194847Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.0\" returns image reference \"sha256:45c8692ffc029387ee93ba83da8ad26da9749cf2ba6ed03981f8f9933ed5a5b0\""
May 27 03:24:23.357837 containerd[1708]: time="2025-05-27T03:24:23.357811772Z" level=info msg="CreateContainer within sandbox \"587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f\" for container &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,}"
May 27 03:24:23.374462 containerd[1708]: time="2025-05-27T03:24:23.373681379Z" level=info msg="Container fefa3169538f049cde933a255259de439aea093b4412179c89b9b1d4af41ef43: CDI devices from CRI Config.CDIDevices: []"
May 27 03:24:23.389189 containerd[1708]: time="2025-05-27T03:24:23.389170562Z" level=info msg="CreateContainer within sandbox \"587bd13a44798507cf03a597505a83a592bbe9c8e4cf723b173cb64cea41235f\" for &ContainerMetadata{Name:csi-node-driver-registrar,Attempt:0,} returns container id \"fefa3169538f049cde933a255259de439aea093b4412179c89b9b1d4af41ef43\""
May 27 03:24:23.389612 containerd[1708]: time="2025-05-27T03:24:23.389516758Z" level=info msg="StartContainer for \"fefa3169538f049cde933a255259de439aea093b4412179c89b9b1d4af41ef43\""
May 27 03:24:23.391026 containerd[1708]: time="2025-05-27T03:24:23.390751459Z" level=info msg="connecting to shim fefa3169538f049cde933a255259de439aea093b4412179c89b9b1d4af41ef43" address="unix:///run/containerd/s/8a81c3fb0684d0156c82bf49f82ea589cc7beee281a9a63c6a42d5bcee07cc03" protocol=ttrpc version=3
May 27 03:24:23.411544 systemd[1]: Started cri-containerd-fefa3169538f049cde933a255259de439aea093b4412179c89b9b1d4af41ef43.scope - libcontainer container fefa3169538f049cde933a255259de439aea093b4412179c89b9b1d4af41ef43.
May 27 03:24:23.440854 containerd[1708]: time="2025-05-27T03:24:23.440822308Z" level=info msg="StartContainer for \"fefa3169538f049cde933a255259de439aea093b4412179c89b9b1d4af41ef43\" returns successfully"
May 27 03:24:24.315750 kubelet[3089]: I0527 03:24:24.315736 3089 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: csi.tigera.io endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock versions: 1.0.0
May 27 03:24:24.315965 kubelet[3089]: I0527 03:24:24.315776 3089 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: csi.tigera.io at endpoint: /var/lib/kubelet/plugins/csi.tigera.io/csi.sock
May 27 03:24:24.954707 kubelet[3089]: I0527 03:24:24.954662 3089 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
May 27 03:24:24.988366 containerd[1708]: time="2025-05-27T03:24:24.988347752Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\" id:\"a8a9b47da5ccf2bdc64e122f631e328daa98d9a48656b78f8aa42b4df45fa78a\" pid:5354 exited_at:{seconds:1748316264 nanos:987855169}"
May
27 03:24:25.003705 kubelet[3089]: I0527 03:24:25.003105 3089 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/csi-node-driver-9cggb" podStartSLOduration=26.177950019 podStartE2EDuration="32.003091393s" podCreationTimestamp="2025-05-27 03:23:53 +0000 UTC" firstStartedPulling="2025-05-27 03:24:17.531513777 +0000 UTC m=+41.364391207" lastFinishedPulling="2025-05-27 03:24:23.356655142 +0000 UTC m=+47.189532581" observedRunningTime="2025-05-27 03:24:23.497105285 +0000 UTC m=+47.329982724" watchObservedRunningTime="2025-05-27 03:24:25.003091393 +0000 UTC m=+48.835968837" May 27 03:24:25.031418 containerd[1708]: time="2025-05-27T03:24:25.031398976Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\" id:\"8f06a39fb37044fd757907dccdd63c924fff7965213c7d2795491860775850f4\" pid:5374 exited_at:{seconds:1748316265 nanos:31130937}" May 27 03:24:25.247382 containerd[1708]: time="2025-05-27T03:24:25.246012243Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:24:25.411355 containerd[1708]: time="2025-05-27T03:24:25.411331081Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:24:25.413960 containerd[1708]: time="2025-05-27T03:24:25.413922646Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:24:25.413960 
containerd[1708]: time="2025-05-27T03:24:25.413944681Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:24:25.414074 kubelet[3089]: E0527 03:24:25.414041 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:24:25.414327 kubelet[3089]: E0527 03:24:25.414072 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:24:25.414327 kubelet[3089]: E0527 03:24:25.414178 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:903c8ceec2c64dbb8528b709b606411a,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrmjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c8f547bcc-pgf67_calico-system(f608e1b5-7f91-494e-a0c3-fc01e99dfa04): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:24:25.416098 containerd[1708]: 
time="2025-05-27T03:24:25.416075809Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:24:25.579726 containerd[1708]: time="2025-05-27T03:24:25.579617786Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:24:25.582338 containerd[1708]: time="2025-05-27T03:24:25.582291645Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:24:25.582459 containerd[1708]: time="2025-05-27T03:24:25.582302833Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:24:25.582494 kubelet[3089]: E0527 03:24:25.582432 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:24:25.582494 kubelet[3089]: E0527 03:24:25.582487 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:24:25.582826 kubelet[3089]: E0527 03:24:25.582578 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrmjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMou
nt:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c8f547bcc-pgf67_calico-system(f608e1b5-7f91-494e-a0c3-fc01e99dfa04): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:24:25.583750 kubelet[3089]: E0527 03:24:25.583720 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:24:33.567898 kubelet[3089]: I0527 03:24:33.567584 3089 
prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:24:36.248185 containerd[1708]: time="2025-05-27T03:24:36.247891138Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:24:36.408806 containerd[1708]: time="2025-05-27T03:24:36.408777281Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:24:36.411517 containerd[1708]: time="2025-05-27T03:24:36.411486146Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:24:36.412169 containerd[1708]: time="2025-05-27T03:24:36.411571840Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:24:36.412214 kubelet[3089]: E0527 03:24:36.411650 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:24:36.412214 kubelet[3089]: E0527 03:24:36.411691 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:24:36.412214 kubelet[3089]: E0527 03:24:36.411828 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lm5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Comma
nd:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-f7587_calico-system(8180663d-0a14-4da1-a83a-e7dbd72a3606): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:24:36.413029 kubelet[3089]: E0527 03:24:36.412951 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status 
from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:24:37.245302 kubelet[3089]: E0527 03:24:37.245216 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:24:44.430693 containerd[1708]: time="2025-05-27T03:24:44.430597151Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\" id:\"1e24cfb960de5431232144a1fe98776f37dd742042c137c8e81e5407ca789968\" pid:5419 exited_at:{seconds:1748316284 nanos:430269185}" May 27 03:24:51.245632 kubelet[3089]: E0527 03:24:51.245521 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:24:52.248469 containerd[1708]: time="2025-05-27T03:24:52.247089956Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:24:52.411220 containerd[1708]: time="2025-05-27T03:24:52.411187065Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:24:52.413592 containerd[1708]: time="2025-05-27T03:24:52.413515682Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:24:52.413592 containerd[1708]: time="2025-05-27T03:24:52.413557247Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:24:52.413716 kubelet[3089]: E0527 03:24:52.413673 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:24:52.413966 kubelet[3089]: E0527 03:24:52.413721 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:24:52.413966 kubelet[3089]: E0527 03:24:52.413812 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:903c8ceec2c64dbb8528b709b606411a,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrmjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinux
Options:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c8f547bcc-pgf67_calico-system(f608e1b5-7f91-494e-a0c3-fc01e99dfa04): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:24:52.415943 containerd[1708]: time="2025-05-27T03:24:52.415922026Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:24:52.584810 containerd[1708]: time="2025-05-27T03:24:52.584667041Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:24:52.587046 containerd[1708]: time="2025-05-27T03:24:52.586946778Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 
Forbidden" May 27 03:24:52.587365 containerd[1708]: time="2025-05-27T03:24:52.587127169Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:24:52.587505 kubelet[3089]: E0527 03:24:52.587329 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:24:52.587505 kubelet[3089]: E0527 03:24:52.587487 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:24:52.587855 kubelet[3089]: E0527 03:24:52.587717 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrmjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c8f547bcc-pgf67_calico-system(f608e1b5-7f91-494e-a0c3-fc01e99dfa04): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": 
failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:24:52.589741 kubelet[3089]: E0527 03:24:52.589138 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:24:55.021842 containerd[1708]: time="2025-05-27T03:24:55.021803845Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\" id:\"24f974eeda56bd311a928571ae03477800fc9b8703cf1d31712334e7ecf3ca3d\" pid:5451 exited_at:{seconds:1748316295 nanos:21597584}" May 27 03:25:01.027600 kubelet[3089]: I0527 03:25:01.027268 3089 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" May 27 03:25:04.246994 containerd[1708]: time="2025-05-27T03:25:04.246897700Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:25:04.410623 containerd[1708]: time="2025-05-27T03:25:04.410520051Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:25:04.413305 containerd[1708]: time="2025-05-27T03:25:04.413248345Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:25:04.413876 containerd[1708]: time="2025-05-27T03:25:04.413336740Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:25:04.413940 kubelet[3089]: E0527 03:25:04.413496 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:25:04.413940 kubelet[3089]: E0527 03:25:04.413552 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected 
status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:25:04.413940 kubelet[3089]: E0527 03:25:04.413739 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lm5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-f7587_calico-system(8180663d-0a14-4da1-a83a-e7dbd72a3606): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:25:04.414928 kubelet[3089]: E0527 03:25:04.414894 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET 
request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:25:05.246987 kubelet[3089]: E0527 03:25:05.246922 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:25:14.120862 containerd[1708]: time="2025-05-27T03:25:14.120781584Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\" id:\"fe687cb4349c39649d10b419875ecd489f72786ccaa5eba1acb693192b417bbb\" pid:5478 exited_at:{seconds:1748316314 nanos:120371987}" May 27 03:25:14.431234 containerd[1708]: time="2025-05-27T03:25:14.431149762Z" level=info msg="TaskExit event in podsandbox handler 
container_id:\"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\" id:\"1c9022513f8a6a5a089e711fd2bbe86d9a63251bf371c88eb71469f4264bf9bd\" pid:5500 exited_at:{seconds:1748316314 nanos:430963561}" May 27 03:25:16.249330 kubelet[3089]: E0527 03:25:16.249191 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:25:19.246229 kubelet[3089]: E0527 03:25:19.246058 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET 
request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:25:25.027863 containerd[1708]: time="2025-05-27T03:25:25.027698290Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\" id:\"a9f942d323768a7a84259b1f8e0f1d93e6537095626a5d712a77447e525a4b6e\" pid:5525 exited_at:{seconds:1748316325 nanos:26624595}" May 27 03:25:28.250644 kubelet[3089]: E0527 03:25:28.250601 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:25:28.982760 systemd[1]: Started sshd@7-10.200.8.20:22-10.200.16.10:34838.service - OpenSSH per-connection server daemon (10.200.16.10:34838). May 27 03:25:29.636533 sshd[5539]: Accepted publickey for core from 10.200.16.10 port 34838 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:25:29.637602 sshd-session[5539]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:29.641506 systemd-logind[1695]: New session 10 of user core. May 27 03:25:29.644610 systemd[1]: Started session-10.scope - Session 10 of User core. 
May 27 03:25:30.136519 sshd[5541]: Connection closed by 10.200.16.10 port 34838 May 27 03:25:30.136892 sshd-session[5539]: pam_unix(sshd:session): session closed for user core May 27 03:25:30.138875 systemd[1]: sshd@7-10.200.8.20:22-10.200.16.10:34838.service: Deactivated successfully. May 27 03:25:30.140572 systemd[1]: session-10.scope: Deactivated successfully. May 27 03:25:30.142211 systemd-logind[1695]: Session 10 logged out. Waiting for processes to exit. May 27 03:25:30.143048 systemd-logind[1695]: Removed session 10. May 27 03:25:30.246160 kubelet[3089]: E0527 03:25:30.246121 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:25:35.253664 systemd[1]: Started sshd@8-10.200.8.20:22-10.200.16.10:34854.service - OpenSSH per-connection server daemon (10.200.16.10:34854). 
May 27 03:25:35.897156 sshd[5560]: Accepted publickey for core from 10.200.16.10 port 34854 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:25:35.898075 sshd-session[5560]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:35.902045 systemd-logind[1695]: New session 11 of user core. May 27 03:25:35.904604 systemd[1]: Started session-11.scope - Session 11 of User core. May 27 03:25:36.430817 sshd[5562]: Connection closed by 10.200.16.10 port 34854 May 27 03:25:36.429772 sshd-session[5560]: pam_unix(sshd:session): session closed for user core May 27 03:25:36.434171 systemd[1]: sshd@8-10.200.8.20:22-10.200.16.10:34854.service: Deactivated successfully. May 27 03:25:36.437995 systemd[1]: session-11.scope: Deactivated successfully. May 27 03:25:36.439544 systemd-logind[1695]: Session 11 logged out. Waiting for processes to exit. May 27 03:25:36.442040 systemd-logind[1695]: Removed session 11. May 27 03:25:41.541425 systemd[1]: Started sshd@9-10.200.8.20:22-10.200.16.10:33262.service - OpenSSH per-connection server daemon (10.200.16.10:33262). May 27 03:25:42.175033 sshd[5579]: Accepted publickey for core from 10.200.16.10 port 33262 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:25:42.176002 sshd-session[5579]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:42.179737 systemd-logind[1695]: New session 12 of user core. May 27 03:25:42.183598 systemd[1]: Started session-12.scope - Session 12 of User core. May 27 03:25:42.670780 sshd[5581]: Connection closed by 10.200.16.10 port 33262 May 27 03:25:42.671153 sshd-session[5579]: pam_unix(sshd:session): session closed for user core May 27 03:25:42.673313 systemd[1]: sshd@9-10.200.8.20:22-10.200.16.10:33262.service: Deactivated successfully. May 27 03:25:42.674961 systemd[1]: session-12.scope: Deactivated successfully. May 27 03:25:42.676602 systemd-logind[1695]: Session 12 logged out. 
Waiting for processes to exit. May 27 03:25:42.677425 systemd-logind[1695]: Removed session 12. May 27 03:25:42.780013 systemd[1]: Started sshd@10-10.200.8.20:22-10.200.16.10:33272.service - OpenSSH per-connection server daemon (10.200.16.10:33272). May 27 03:25:43.244793 kubelet[3089]: E0527 03:25:43.244751 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:25:43.411171 sshd[5594]: Accepted publickey for core from 10.200.16.10 port 33272 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:25:43.412074 sshd-session[5594]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:43.415782 systemd-logind[1695]: New session 13 of user core. May 27 03:25:43.420566 systemd[1]: Started session-13.scope - Session 13 of User core. May 27 03:25:43.921200 sshd[5598]: Connection closed by 10.200.16.10 port 33272 May 27 03:25:43.921544 sshd-session[5594]: pam_unix(sshd:session): session closed for user core May 27 03:25:43.923433 systemd[1]: sshd@10-10.200.8.20:22-10.200.16.10:33272.service: Deactivated successfully. May 27 03:25:43.925217 systemd[1]: session-13.scope: Deactivated successfully. May 27 03:25:43.926211 systemd-logind[1695]: Session 13 logged out. Waiting for processes to exit. May 27 03:25:43.927251 systemd-logind[1695]: Removed session 13. 
May 27 03:25:44.032187 systemd[1]: Started sshd@11-10.200.8.20:22-10.200.16.10:33274.service - OpenSSH per-connection server daemon (10.200.16.10:33274). May 27 03:25:44.427620 containerd[1708]: time="2025-05-27T03:25:44.427575737Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\" id:\"96fd234726ea0c3a6c4eb974f65e7cfd037e480249ed1f4215453e1ff6293da7\" pid:5629 exited_at:{seconds:1748316344 nanos:427302371}" May 27 03:25:44.662849 sshd[5608]: Accepted publickey for core from 10.200.16.10 port 33274 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:25:44.663647 sshd-session[5608]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:44.666859 systemd-logind[1695]: New session 14 of user core. May 27 03:25:44.673537 systemd[1]: Started session-14.scope - Session 14 of User core. May 27 03:25:45.153048 sshd[5653]: Connection closed by 10.200.16.10 port 33274 May 27 03:25:45.153365 sshd-session[5608]: pam_unix(sshd:session): session closed for user core May 27 03:25:45.155138 systemd[1]: sshd@11-10.200.8.20:22-10.200.16.10:33274.service: Deactivated successfully. May 27 03:25:45.156861 systemd-logind[1695]: Session 14 logged out. Waiting for processes to exit. May 27 03:25:45.157192 systemd[1]: session-14.scope: Deactivated successfully. May 27 03:25:45.158478 systemd-logind[1695]: Removed session 14. 
May 27 03:25:45.245611 containerd[1708]: time="2025-05-27T03:25:45.245515186Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\"" May 27 03:25:45.436007 containerd[1708]: time="2025-05-27T03:25:45.435936790Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:25:45.438215 containerd[1708]: time="2025-05-27T03:25:45.438180781Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:25:45.438292 containerd[1708]: time="2025-05-27T03:25:45.438241959Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.0: active requests=0, bytes read=86" May 27 03:25:45.438358 kubelet[3089]: E0527 03:25:45.438326 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:25:45.438672 kubelet[3089]: E0527 03:25:45.438367 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference 
\"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker:v3.30.0" May 27 03:25:45.438672 kubelet[3089]: E0527 03:25:45.438471 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.0,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:903c8ceec2c64dbb8528b709b606411a,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrmjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c8f547bcc-pgf67_calico-system(f608e1b5-7f91-494e-a0c3-fc01e99dfa04): ErrImagePull: failed to pull 
and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:25:45.440640 containerd[1708]: time="2025-05-27T03:25:45.440579570Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\"" May 27 03:25:45.610947 containerd[1708]: time="2025-05-27T03:25:45.610913885Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:25:45.613300 containerd[1708]: time="2025-05-27T03:25:45.613259120Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:25:45.613300 containerd[1708]: time="2025-05-27T03:25:45.613281650Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.0: active requests=0, bytes read=86" May 27 03:25:45.613426 kubelet[3089]: E0527 03:25:45.613397 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:25:45.613507 kubelet[3089]: E0527 03:25:45.613428 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.0" May 27 03:25:45.613570 kubelet[3089]: E0527 03:25:45.613538 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wrmjh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},Liv
enessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-c8f547bcc-pgf67_calico-system(f608e1b5-7f91-494e-a0c3-fc01e99dfa04): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:25:45.615541 kubelet[3089]: E0527 03:25:45.615426 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: 
failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:25:50.273021 systemd[1]: Started sshd@12-10.200.8.20:22-10.200.16.10:33034.service - OpenSSH per-connection server daemon (10.200.16.10:33034). May 27 03:25:50.908340 sshd[5669]: Accepted publickey for core from 10.200.16.10 port 33034 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:25:50.909174 sshd-session[5669]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:50.912988 systemd-logind[1695]: New session 15 of user core. May 27 03:25:50.916577 systemd[1]: Started session-15.scope - Session 15 of User core. May 27 03:25:51.398719 sshd[5671]: Connection closed by 10.200.16.10 port 33034 May 27 03:25:51.399042 sshd-session[5669]: pam_unix(sshd:session): session closed for user core May 27 03:25:51.401190 systemd[1]: sshd@12-10.200.8.20:22-10.200.16.10:33034.service: Deactivated successfully. May 27 03:25:51.402680 systemd[1]: session-15.scope: Deactivated successfully. May 27 03:25:51.403261 systemd-logind[1695]: Session 15 logged out. Waiting for processes to exit. May 27 03:25:51.404187 systemd-logind[1695]: Removed session 15. 
May 27 03:25:55.024924 containerd[1708]: time="2025-05-27T03:25:55.024883595Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\" id:\"cb02c0a5a95e36d723579b79392e9d4b2fc6c26d597860237433a96507f26a54\" pid:5696 exited_at:{seconds:1748316355 nanos:24685004}" May 27 03:25:55.244928 containerd[1708]: time="2025-05-27T03:25:55.244849664Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\"" May 27 03:25:55.399439 containerd[1708]: time="2025-05-27T03:25:55.399397843Z" level=info msg="fetch failed" error="failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" host=ghcr.io May 27 03:25:55.401900 containerd[1708]: time="2025-05-27T03:25:55.401874224Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.0\" failed" error="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" May 27 03:25:55.402007 containerd[1708]: time="2025-05-27T03:25:55.401887413Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.0: active requests=0, bytes read=86" May 27 03:25:55.402053 kubelet[3089]: E0527 03:25:55.402024 3089 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:25:55.402285 kubelet[3089]: E0527 03:25:55.402061 3089 kuberuntime_image.go:55] "Failed to pull image" err="failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" image="ghcr.io/flatcar/calico/goldmane:v3.30.0" May 27 03:25:55.402285 kubelet[3089]: E0527 03:25:55.402182 3089 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.0,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9lm5n,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-78d55f7ddc-f7587_calico-system(8180663d-0a14-4da1-a83a-e7dbd72a3606): ErrImagePull: failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to resolve reference \"ghcr.io/flatcar/calico/goldmane:v3.30.0\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden" logger="UnhandledError" May 27 03:25:55.403563 kubelet[3089]: E0527 03:25:55.403504 3089 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:25:56.512225 systemd[1]: Started sshd@13-10.200.8.20:22-10.200.16.10:33050.service - OpenSSH per-connection server daemon (10.200.16.10:33050). May 27 03:25:57.154424 sshd[5706]: Accepted publickey for core from 10.200.16.10 port 33050 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:25:57.155308 sshd-session[5706]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:25:57.158356 systemd-logind[1695]: New session 16 of user core. May 27 03:25:57.161570 systemd[1]: Started session-16.scope - Session 16 of User core. May 27 03:25:57.645688 sshd[5708]: Connection closed by 10.200.16.10 port 33050 May 27 03:25:57.646073 sshd-session[5706]: pam_unix(sshd:session): session closed for user core May 27 03:25:57.647929 systemd[1]: sshd@13-10.200.8.20:22-10.200.16.10:33050.service: Deactivated successfully. May 27 03:25:57.649299 systemd[1]: session-16.scope: Deactivated successfully. May 27 03:25:57.650353 systemd-logind[1695]: Session 16 logged out. Waiting for processes to exit. May 27 03:25:57.651264 systemd-logind[1695]: Removed session 16. 
May 27 03:25:58.247229 kubelet[3089]: E0527 03:25:58.246946 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:26:02.758005 systemd[1]: Started sshd@14-10.200.8.20:22-10.200.16.10:35334.service - OpenSSH per-connection server daemon (10.200.16.10:35334). May 27 03:26:03.397282 sshd[5719]: Accepted publickey for core from 10.200.16.10 port 35334 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:03.398237 sshd-session[5719]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:03.402195 systemd-logind[1695]: New session 17 of user core. May 27 03:26:03.407548 systemd[1]: Started session-17.scope - Session 17 of User core. 
May 27 03:26:03.887152 sshd[5721]: Connection closed by 10.200.16.10 port 35334 May 27 03:26:03.887495 sshd-session[5719]: pam_unix(sshd:session): session closed for user core May 27 03:26:03.889832 systemd[1]: sshd@14-10.200.8.20:22-10.200.16.10:35334.service: Deactivated successfully. May 27 03:26:03.891228 systemd[1]: session-17.scope: Deactivated successfully. May 27 03:26:03.891875 systemd-logind[1695]: Session 17 logged out. Waiting for processes to exit. May 27 03:26:03.892855 systemd-logind[1695]: Removed session 17. May 27 03:26:04.001551 systemd[1]: Started sshd@15-10.200.8.20:22-10.200.16.10:35350.service - OpenSSH per-connection server daemon (10.200.16.10:35350). May 27 03:26:04.640463 sshd[5733]: Accepted publickey for core from 10.200.16.10 port 35350 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:04.641962 sshd-session[5733]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:04.645907 systemd-logind[1695]: New session 18 of user core. May 27 03:26:04.651593 systemd[1]: Started session-18.scope - Session 18 of User core. May 27 03:26:05.186630 sshd[5735]: Connection closed by 10.200.16.10 port 35350 May 27 03:26:05.186985 sshd-session[5733]: pam_unix(sshd:session): session closed for user core May 27 03:26:05.189601 systemd[1]: sshd@15-10.200.8.20:22-10.200.16.10:35350.service: Deactivated successfully. May 27 03:26:05.191103 systemd[1]: session-18.scope: Deactivated successfully. May 27 03:26:05.191788 systemd-logind[1695]: Session 18 logged out. Waiting for processes to exit. May 27 03:26:05.192772 systemd-logind[1695]: Removed session 18. May 27 03:26:05.297031 systemd[1]: Started sshd@16-10.200.8.20:22-10.200.16.10:35356.service - OpenSSH per-connection server daemon (10.200.16.10:35356). 
May 27 03:26:05.939006 sshd[5745]: Accepted publickey for core from 10.200.16.10 port 35356 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:05.940687 sshd-session[5745]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:05.945887 systemd-logind[1695]: New session 19 of user core. May 27 03:26:05.951618 systemd[1]: Started session-19.scope - Session 19 of User core. May 27 03:26:07.060452 sshd[5747]: Connection closed by 10.200.16.10 port 35356 May 27 03:26:07.060831 sshd-session[5745]: pam_unix(sshd:session): session closed for user core May 27 03:26:07.063643 systemd[1]: sshd@16-10.200.8.20:22-10.200.16.10:35356.service: Deactivated successfully. May 27 03:26:07.065321 systemd[1]: session-19.scope: Deactivated successfully. May 27 03:26:07.066073 systemd-logind[1695]: Session 19 logged out. Waiting for processes to exit. May 27 03:26:07.067376 systemd-logind[1695]: Removed session 19. May 27 03:26:07.175523 systemd[1]: Started sshd@17-10.200.8.20:22-10.200.16.10:35370.service - OpenSSH per-connection server daemon (10.200.16.10:35370). 
May 27 03:26:07.245739 kubelet[3089]: E0527 03:26:07.245614 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:26:07.822076 sshd[5764]: Accepted publickey for core from 10.200.16.10 port 35370 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:07.823133 sshd-session[5764]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:07.827018 systemd-logind[1695]: New session 20 of user core. May 27 03:26:07.832586 systemd[1]: Started session-20.scope - Session 20 of User core. May 27 03:26:08.387393 sshd[5766]: Connection closed by 10.200.16.10 port 35370 May 27 03:26:08.387739 sshd-session[5764]: pam_unix(sshd:session): session closed for user core May 27 03:26:08.390105 systemd[1]: sshd@17-10.200.8.20:22-10.200.16.10:35370.service: Deactivated successfully. May 27 03:26:08.391892 systemd[1]: session-20.scope: Deactivated successfully. May 27 03:26:08.392486 systemd-logind[1695]: Session 20 logged out. Waiting for processes to exit. May 27 03:26:08.393683 systemd-logind[1695]: Removed session 20. May 27 03:26:08.504900 systemd[1]: Started sshd@18-10.200.8.20:22-10.200.16.10:35386.service - OpenSSH per-connection server daemon (10.200.16.10:35386). 
May 27 03:26:09.143201 sshd[5776]: Accepted publickey for core from 10.200.16.10 port 35386 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:09.144061 sshd-session[5776]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:09.147566 systemd-logind[1695]: New session 21 of user core. May 27 03:26:09.153596 systemd[1]: Started session-21.scope - Session 21 of User core. May 27 03:26:09.633641 sshd[5778]: Connection closed by 10.200.16.10 port 35386 May 27 03:26:09.633970 sshd-session[5776]: pam_unix(sshd:session): session closed for user core May 27 03:26:09.636172 systemd[1]: sshd@18-10.200.8.20:22-10.200.16.10:35386.service: Deactivated successfully. May 27 03:26:09.637567 systemd[1]: session-21.scope: Deactivated successfully. May 27 03:26:09.638186 systemd-logind[1695]: Session 21 logged out. Waiting for processes to exit. May 27 03:26:09.639293 systemd-logind[1695]: Removed session 21. May 27 03:26:11.247227 kubelet[3089]: E0527 03:26:11.246292 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: 
unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:26:14.121306 containerd[1708]: time="2025-05-27T03:26:14.121256469Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\" id:\"8f63d4fa385775711de1301dcb7c385c140655803715660a7399b34f6f8a18a6\" pid:5805 exited_at:{seconds:1748316374 nanos:120905894}" May 27 03:26:14.430107 containerd[1708]: time="2025-05-27T03:26:14.430002544Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\" id:\"74ff009278ed151fcdaca6c68efc8a458abe2dcac60cfb128cd3401be83ceeab\" pid:5826 exited_at:{seconds:1748316374 nanos:429719222}" May 27 03:26:14.749305 systemd[1]: Started sshd@19-10.200.8.20:22-10.200.16.10:50392.service - OpenSSH per-connection server daemon (10.200.16.10:50392). May 27 03:26:15.390180 sshd[5839]: Accepted publickey for core from 10.200.16.10 port 50392 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:15.391048 sshd-session[5839]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:15.394761 systemd-logind[1695]: New session 22 of user core. May 27 03:26:15.397553 systemd[1]: Started session-22.scope - Session 22 of User core. May 27 03:26:15.883145 sshd[5841]: Connection closed by 10.200.16.10 port 50392 May 27 03:26:15.883493 sshd-session[5839]: pam_unix(sshd:session): session closed for user core May 27 03:26:15.885311 systemd[1]: sshd@19-10.200.8.20:22-10.200.16.10:50392.service: Deactivated successfully. May 27 03:26:15.887375 systemd-logind[1695]: Session 22 logged out. Waiting for processes to exit. May 27 03:26:15.887717 systemd[1]: session-22.scope: Deactivated successfully. 
May 27 03:26:15.889623 systemd-logind[1695]: Removed session 22. May 27 03:26:20.995916 systemd[1]: Started sshd@20-10.200.8.20:22-10.200.16.10:36866.service - OpenSSH per-connection server daemon (10.200.16.10:36866). May 27 03:26:21.244829 kubelet[3089]: E0527 03:26:21.244792 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:26:21.632877 sshd[5853]: Accepted publickey for core from 10.200.16.10 port 36866 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:21.633764 sshd-session[5853]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:21.637476 systemd-logind[1695]: New session 23 of user core. May 27 03:26:21.645562 systemd[1]: Started session-23.scope - Session 23 of User core. May 27 03:26:22.123880 sshd[5855]: Connection closed by 10.200.16.10 port 36866 May 27 03:26:22.124195 sshd-session[5853]: pam_unix(sshd:session): session closed for user core May 27 03:26:22.126003 systemd[1]: sshd@20-10.200.8.20:22-10.200.16.10:36866.service: Deactivated successfully. May 27 03:26:22.127749 systemd[1]: session-23.scope: Deactivated successfully. May 27 03:26:22.128728 systemd-logind[1695]: Session 23 logged out. Waiting for processes to exit. May 27 03:26:22.129594 systemd-logind[1695]: Removed session 23. 
May 27 03:26:22.246044 kubelet[3089]: E0527 03:26:22.246012 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:26:25.036665 containerd[1708]: time="2025-05-27T03:26:25.036630576Z" level=info msg="TaskExit event in podsandbox handler container_id:\"7467194fa8e532a0259cc43186068454c7d8ddc9fffff4304f515f035f4ed19c\" id:\"ab64a1bba58f870853065cd8ba7eb88ab1ea2d6e7191b28ae921e6a121d3588e\" pid:5880 exited_at:{seconds:1748316385 nanos:36311721}" May 27 03:26:27.238477 systemd[1]: Started sshd@21-10.200.8.20:22-10.200.16.10:36882.service - OpenSSH per-connection server daemon (10.200.16.10:36882). 
May 27 03:26:27.933806 sshd[5890]: Accepted publickey for core from 10.200.16.10 port 36882 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:27.935267 sshd-session[5890]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:27.944896 systemd-logind[1695]: New session 24 of user core. May 27 03:26:27.949693 systemd[1]: Started session-24.scope - Session 24 of User core. May 27 03:26:28.462231 sshd[5892]: Connection closed by 10.200.16.10 port 36882 May 27 03:26:28.462648 sshd-session[5890]: pam_unix(sshd:session): session closed for user core May 27 03:26:28.465253 systemd[1]: sshd@21-10.200.8.20:22-10.200.16.10:36882.service: Deactivated successfully. May 27 03:26:28.466982 systemd[1]: session-24.scope: Deactivated successfully. May 27 03:26:28.467714 systemd-logind[1695]: Session 24 logged out. Waiting for processes to exit. May 27 03:26:28.468942 systemd-logind[1695]: Removed session 24. May 27 03:26:33.573141 systemd[1]: Started sshd@22-10.200.8.20:22-10.200.16.10:50846.service - OpenSSH per-connection server daemon (10.200.16.10:50846). May 27 03:26:34.208721 sshd[5904]: Accepted publickey for core from 10.200.16.10 port 50846 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:34.209698 sshd-session[5904]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:34.213603 systemd-logind[1695]: New session 25 of user core. May 27 03:26:34.217575 systemd[1]: Started session-25.scope - Session 25 of User core. 
May 27 03:26:34.245277 kubelet[3089]: E0527 03:26:34.245223 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/goldmane:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fgoldmane%3Apull&service=ghcr.io: 403 Forbidden\"" pod="calico-system/goldmane-78d55f7ddc-f7587" podUID="8180663d-0a14-4da1-a83a-e7dbd72a3606" May 27 03:26:34.706976 sshd[5906]: Connection closed by 10.200.16.10 port 50846 May 27 03:26:34.707655 sshd-session[5904]: pam_unix(sshd:session): session closed for user core May 27 03:26:34.709597 systemd[1]: sshd@22-10.200.8.20:22-10.200.16.10:50846.service: Deactivated successfully. May 27 03:26:34.711010 systemd[1]: session-25.scope: Deactivated successfully. May 27 03:26:34.712167 systemd-logind[1695]: Session 25 logged out. Waiting for processes to exit. May 27 03:26:34.713490 systemd-logind[1695]: Removed session 25. 
May 27 03:26:35.246066 kubelet[3089]: E0527 03:26:35.246005 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04" May 27 03:26:39.818522 systemd[1]: Started sshd@23-10.200.8.20:22-10.200.16.10:54390.service - OpenSSH per-connection server daemon (10.200.16.10:54390). May 27 03:26:40.452102 sshd[5920]: Accepted publickey for core from 10.200.16.10 port 54390 ssh2: RSA SHA256:+Fe8XoeidIDpreT5xokg3fL/NOTS8jCIdPAwqh1eVaU May 27 03:26:40.453030 sshd-session[5920]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) May 27 03:26:40.456516 systemd-logind[1695]: New session 26 of user core. May 27 03:26:40.463557 systemd[1]: Started session-26.scope - Session 26 of User core. 
May 27 03:26:40.952160 sshd[5922]: Connection closed by 10.200.16.10 port 54390 May 27 03:26:40.952515 sshd-session[5920]: pam_unix(sshd:session): session closed for user core May 27 03:26:40.954840 systemd[1]: sshd@23-10.200.8.20:22-10.200.16.10:54390.service: Deactivated successfully. May 27 03:26:40.956337 systemd[1]: session-26.scope: Deactivated successfully. May 27 03:26:40.956977 systemd-logind[1695]: Session 26 logged out. Waiting for processes to exit. May 27 03:26:40.957960 systemd-logind[1695]: Removed session 26. May 27 03:26:44.430034 containerd[1708]: time="2025-05-27T03:26:44.429999673Z" level=info msg="TaskExit event in podsandbox handler container_id:\"dda542599c969eeab6e5dbcc17ceed9b0c1900ab06b6e3c97332dc4387a92dac\" id:\"fbf01f83ff09da8bc1595af619c993059b1586dcb24b28129a33b373aa13adbe\" pid:5948 exited_at:{seconds:1748316404 nanos:429763241}" May 27 03:26:46.246674 kubelet[3089]: E0527 03:26:46.246594 3089 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker%3Apull&service=ghcr.io: 403 Forbidden\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": ErrImagePull: failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to resolve reference \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.0\\\": failed to authorize: failed to fetch anonymous token: unexpected status from GET request to 
https://ghcr.io/token?scope=repository%3Aflatcar%2Fcalico%2Fwhisker-backend%3Apull&service=ghcr.io: 403 Forbidden\"]" pod="calico-system/whisker-c8f547bcc-pgf67" podUID="f608e1b5-7f91-494e-a0c3-fc01e99dfa04"